Apr 16 20:58:16.858251 ip-10-0-139-17 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:58:17.322598 ip-10-0-139-17 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:58:17.322598 ip-10-0-139-17 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:58:17.322598 ip-10-0-139-17 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:58:17.322598 ip-10-0-139-17 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:58:17.322598 ip-10-0-139-17 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:58:17.323310 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.323179 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:58:17.326897 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326863 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:58:17.326897 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326892 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:58:17.326897 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326899 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:58:17.326897 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326903 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326908 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326914 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326919 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326923 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326927 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326930 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326934 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326937 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16
20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326942 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326950 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326954 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326958 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326962 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326968 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326973 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326976 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326980 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.326985 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327007 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:58:17.327185 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327011 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327015 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327020 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327024 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327028 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327034 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327043 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327046 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327050 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327054 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327060 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327064 2579 
feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327068 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327073 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327078 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327082 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327087 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327091 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327095 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327104 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:58:17.328028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327109 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327114 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327118 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327123 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327127 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327131 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327135 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327140 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327147 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
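[Editor's note] The deprecation notices at the top of this capture all point the same way: --container-runtime-endpoint, --volume-plugin-dir, and --system-reserved should move into the file passed via --config (here /etc/kubernetes/kubelet.conf, per the FLAG dump further down), and --minimum-container-ttl-duration is superseded by the eviction settings. As an illustrative sketch only, using the flag values visible later in this log (the real kubelet.conf on this node is generated by the platform and contains much more), the same settings expressed as KubeletConfiguration fields would look roughly like this:

    # Sketch, not the node's actual rendered config.
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: /var/run/crio/crio.sock            # replaces --container-runtime-endpoint
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
    systemReserved:                                               # replaces --system-reserved
      cpu: 500m
      ephemeral-storage: 1Gi
      memory: 1Gi
    evictionHard:                                                 # replaces --minimum-container-ttl-duration
      memory.available: 100Mi    # hypothetical threshold, not taken from this log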
Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327155 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327160 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327170 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327175 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327179 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327184 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327191 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327195 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327198 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327203 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:58:17.328784 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327207 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327211 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327216 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327220 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327252 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327275 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327393 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327403 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327408 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327412 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327417 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327421 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327425 2579 feature_gate.go:328] 
unrecognized feature gate: RouteAdvertisements Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327429 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327434 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327439 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327443 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327446 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327451 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327455 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:58:17.329278 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327459 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327463 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327471 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.327476 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328119 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328129 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328133 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328137 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328142 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328146 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328151 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328155 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328161 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328166 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328170 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 
20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328175 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328180 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328184 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328189 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328193 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:58:17.330048 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328197 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328201 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328205 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328210 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328215 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328219 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328223 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328227 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328231 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328236 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328240 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328246 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328252 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328257 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328261 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328265 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328271 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328276 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328280 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328284 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:58:17.330930 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328288 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328292 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328297 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328301 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328305 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328309 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328313 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328319 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328323 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328328 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328332 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328336 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328340 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328344 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328348 2579 feature_gate.go:328] 
unrecognized feature gate: GatewayAPI Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328352 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328356 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328360 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328364 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:58:17.331618 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328371 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328377 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328382 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328387 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328394 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328399 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328404 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328408 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328412 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328417 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328421 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328425 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328429 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328433 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328437 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328441 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328446 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328450 2579 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNSInstall Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328454 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328458 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:58:17.332214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328462 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328467 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328471 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328475 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328479 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328483 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328487 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328491 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328495 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328499 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.328503 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329417 2579 flags.go:64] FLAG: --address="0.0.0.0" Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329432 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329442 2579 flags.go:64] FLAG: --anonymous-auth="true" Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329449 2579 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329457 2579 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329462 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329468 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329475 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329481 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 16 20:58:17.332709 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329486 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 16 20:58:17.332709 ip-10-0-139-17 
kubenswrapper[2579]: I0416 20:58:17.329492 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329497 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329502 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329507 2579 flags.go:64] FLAG: --cgroup-root="" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329512 2579 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329517 2579 flags.go:64] FLAG: --client-ca-file="" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329522 2579 flags.go:64] FLAG: --cloud-config="" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329526 2579 flags.go:64] FLAG: --cloud-provider="external" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329531 2579 flags.go:64] FLAG: --cluster-dns="[]" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329538 2579 flags.go:64] FLAG: --cluster-domain="" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329543 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329548 2579 flags.go:64] FLAG: --config-dir="" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329552 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329558 2579 flags.go:64] FLAG: --container-log-max-files="5" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329564 2579 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329569 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329574 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329579 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329584 2579 flags.go:64] FLAG: --contention-profiling="false" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329589 2579 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329594 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329599 2579 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329603 2579 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329610 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329615 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 16 20:58:17.333330 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329619 2579 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 16 20:58:17.333935 ip-10-0-139-17 
kubenswrapper[2579]: I0416 20:58:17.329624 2579 flags.go:64] FLAG: --enable-load-reader="false" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329630 2579 flags.go:64] FLAG: --enable-server="true" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329635 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329641 2579 flags.go:64] FLAG: --event-burst="100" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329646 2579 flags.go:64] FLAG: --event-qps="50" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329651 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329657 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329661 2579 flags.go:64] FLAG: --eviction-hard="" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329668 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329673 2579 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329679 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329684 2579 flags.go:64] FLAG: --eviction-soft="" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329689 2579 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329694 2579 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329698 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329702 2579 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329707 2579 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329711 2579 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329716 2579 flags.go:64] FLAG: --feature-gates="" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329722 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329727 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329732 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329737 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329742 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 16 20:58:17.333935 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329747 2579 flags.go:64] FLAG: --help="false" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329752 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 
20:58:17.329757 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329762 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329767 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329772 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329778 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329782 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329787 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329791 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329797 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329802 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329807 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329814 2579 flags.go:64] FLAG: --kube-reserved="" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329820 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329824 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329829 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329834 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329839 2579 flags.go:64] FLAG: --lock-file="" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329843 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329848 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329853 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329862 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 20:58:17.334561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329866 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329871 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329875 2579 flags.go:64] FLAG: --logging-format="text" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329880 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 
20:58:17.329886 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329890 2579 flags.go:64] FLAG: --manifest-url="" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329895 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329905 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329911 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329917 2579 flags.go:64] FLAG: --max-pods="110" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329922 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329927 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329931 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329937 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329941 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329946 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329950 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329962 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329967 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329972 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329978 2579 flags.go:64] FLAG: --pod-cidr="" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.329983 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330012 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330018 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 20:58:17.335144 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330022 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330027 2579 flags.go:64] FLAG: --port="10250" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330032 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330036 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-083c4772dcadf0538" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330041 2579 flags.go:64] FLAG: --qos-reserved="" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330046 2579 flags.go:64] FLAG: 
--read-only-port="10255" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330050 2579 flags.go:64] FLAG: --register-node="true" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330055 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330059 2579 flags.go:64] FLAG: --register-with-taints="" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330065 2579 flags.go:64] FLAG: --registry-burst="10" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330069 2579 flags.go:64] FLAG: --registry-qps="5" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330073 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330078 2579 flags.go:64] FLAG: --reserved-memory="" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330084 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330089 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330094 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330098 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330103 2579 flags.go:64] FLAG: --runonce="false" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330107 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330112 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330117 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330122 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330126 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330131 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330136 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330142 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 20:58:17.335711 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330146 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330151 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330156 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330162 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330169 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330174 2579 flags.go:64] FLAG: --system-cgroups="" Apr 16 20:58:17.336389 
ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330187 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330196 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330200 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330205 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330211 2579 flags.go:64] FLAG: --tls-min-version="" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330216 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330220 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330224 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330228 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330232 2579 flags.go:64] FLAG: --v="2" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330239 2579 flags.go:64] FLAG: --version="false" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330244 2579 flags.go:64] FLAG: --vmodule="" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330251 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330256 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330397 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330404 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330409 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330413 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:58:17.336389 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330418 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330423 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330427 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330432 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330436 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330441 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330445 2579 feature_gate.go:328] unrecognized feature gate: 
NoRegistryClusterOperations Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330449 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330453 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330457 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330461 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330465 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330472 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330477 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330481 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330486 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330490 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330494 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330498 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330502 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:58:17.337028 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330507 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330511 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330515 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330519 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330524 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330528 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330532 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330536 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330540 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:58:17.337540 
ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330544 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330548 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330552 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330556 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330560 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330564 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330569 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330573 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330577 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:58:17.337540 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330581 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330585 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330589 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330593 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330597 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330602 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330608 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330613 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330617 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330622 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330626 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330630 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330634 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330638 2579 
feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330642 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330647 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330651 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330655 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330659 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330663 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:58:17.337980 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330667 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330671 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330675 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330679 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330683 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330687 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330694 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
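The repeated W-level feature_gate.go:328 records above all reduce to one fixed set of gate names; they look like OpenShift-level feature gates handed down to the kubelet, which its upstream Kubernetes gate registry does not recognize, so it warns about each of them every time the gate list is re-parsed. A minimal Python sketch for collapsing the flood into a unique, counted list, assuming the journal excerpt has been saved to a local file (the path kubelet-journal.txt is hypothetical, not part of the log):

# Minimal sketch: collapse the "unrecognized feature gate" warnings in this
# excerpt into a unique, counted list. Assumes the journal text was saved
# locally as "kubelet-journal.txt" (hypothetical path, not part of the log).
import re
from collections import Counter

PATTERN = re.compile(r"unrecognized feature gate: (\S+)")

def unrecognized_gates(path="kubelet-journal.txt"):
    counts = Counter()
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            counts.update(PATTERN.findall(line))
    return counts

if __name__ == "__main__":
    for gate, n in sorted(unrecognized_gates().items()):
        print(f"{gate}: warned {n} time(s)")
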
Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330700 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330704 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330709 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330713 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330717 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330722 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330726 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330730 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330734 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330739 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330745 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330752 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:58:17.338498 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330757 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:58:17.338962 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330762 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:58:17.338962 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330766 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:58:17.338962 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330770 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:58:17.338962 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.330775 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:58:17.338962 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.330788 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:58:17.338962 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.337530 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 20:58:17.338962 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.337546 2579 server.go:532] "Golang settings" GOGC="" 
GOMAXPROCS="" GOTRACEBACK="" Apr 16 20:58:17.338962 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337599 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:58:17.338962 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337604 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:58:17.338962 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337607 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:58:17.338962 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337610 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:58:17.338962 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337614 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:58:17.338962 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337617 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:58:17.338962 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337620 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:58:17.338962 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337622 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337625 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337628 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337630 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337633 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337636 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337638 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337641 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337643 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337646 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337648 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337651 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337654 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337656 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337658 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:58:17.339355 ip-10-0-139-17 
kubenswrapper[2579]: W0416 20:58:17.337661 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337664 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337666 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337668 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337671 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:58:17.339355 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337673 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337675 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337678 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337681 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337685 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337688 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337690 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337693 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337695 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337698 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337700 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337703 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337705 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337708 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337711 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337713 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337716 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337719 2579 
feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337721 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:58:17.339825 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337724 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337727 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337729 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337732 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337734 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337737 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337740 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337742 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337744 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337747 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337749 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337752 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337754 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337756 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337759 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337762 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337765 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337767 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337771 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337773 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:58:17.340348 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337776 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:58:17.340876 
ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337778 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337781 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337783 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337787 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337791 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337794 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337797 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337801 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337804 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337807 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337810 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337813 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337816 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337818 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337820 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337823 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337826 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337828 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:58:17.340876 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337831 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.337836 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true 
VolumeAttributesClass:false]} Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337937 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337942 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337944 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337947 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337950 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337953 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337955 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337958 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337960 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337964 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337967 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337969 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337972 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337974 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:58:17.341367 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337977 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337979 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337981 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.337984 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338004 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338007 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338010 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338013 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 
20:58:17.338015 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338018 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338020 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338023 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338025 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338028 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338031 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338033 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338035 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338038 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338040 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338043 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:58:17.341762 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338045 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338048 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338050 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338053 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338055 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338058 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338060 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338063 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338065 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338068 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338070 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:58:17.342273 
ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338073 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338075 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338078 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338080 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338083 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338085 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338087 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338090 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338092 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:58:17.342273 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338095 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338097 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338099 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338103 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338106 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338109 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338113 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
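In contrast to the warnings, the I-level feature_gate.go:384 records show the gate map the kubelet actually applied (KMSv1:true, ServiceAccountTokenNodeBinding:true, NodeSwap:false, and so on), printed in Go's map[...] syntax. A small sketch, assuming the record text is available as a string, turns it into a Python dict so the repeated dumps can be compared easily:

# Minimal sketch: parse the kubelet's 'feature gates: {map[...]}' summary
# (a Go map dump, as logged above) into a dict of gate name -> bool.
import re

def parse_gate_map(record: str) -> dict:
    m = re.search(r"feature gates: \{map\[(.*?)\]\}", record)
    if not m:
        return {}
    gates = {}
    for pair in m.group(1).split():
        name, _, value = pair.partition(":")
        gates[name] = (value == "true")
    return gates

# Shortened copy of the summary logged above, as a usage example.
sample = "feature gates: {map[ImageVolume:true KMSv1:true NodeSwap:false]}"
print(parse_gate_map(sample))  # {'ImageVolume': True, 'KMSv1': True, 'NodeSwap': False}
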
Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338116 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338119 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338122 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338125 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338128 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338131 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338133 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338136 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338138 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338140 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338143 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:58:17.342752 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338145 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:58:17.343201 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338148 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:58:17.343201 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338150 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:58:17.343201 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338152 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:58:17.343201 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338155 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:58:17.343201 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338158 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:58:17.343201 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338160 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:58:17.343201 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338163 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:58:17.343201 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338165 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:58:17.343201 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338167 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:58:17.343201 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338170 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:58:17.343201 ip-10-0-139-17 kubenswrapper[2579]: W0416 
20:58:17.338172 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:58:17.343201 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338175 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:58:17.343201 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:17.338186 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:58:17.343201 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.338192 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:58:17.343201 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.338917 2579 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 20:58:17.343563 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.341578 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 20:58:17.343563 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.342605 2579 server.go:1019] "Starting client certificate rotation" Apr 16 20:58:17.343563 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.342704 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:58:17.343563 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.342750 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:58:17.371480 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.371461 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:58:17.373756 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.373742 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:58:17.386718 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.386690 2579 log.go:25] "Validated CRI v1 runtime API" Apr 16 20:58:17.392146 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.392131 2579 log.go:25] "Validated CRI v1 image API" Apr 16 20:58:17.393554 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.393540 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 20:58:17.398881 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.398854 2579 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 84af319c-c70f-49c9-a78b-35fe1a58eaf8:/dev/nvme0n1p3 9e979bfb-e813-4694-bfdd-6599eade9ab5:/dev/nvme0n1p4] Apr 16 20:58:17.398983 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.398882 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 
20:58:17.405211 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.405100 2579 manager.go:217] Machine: {Timestamp:2026-04-16 20:58:17.402905293 +0000 UTC m=+0.421216131 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098189 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2853c3aeda8eeb78850d8658307711 SystemUUID:ec2853c3-aeda-8eeb-7885-0d8658307711 BootID:dd42c16c-0d04-41a6-afb8-93a888dc8420 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:09:2f:7d:f8:65 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:09:2f:7d:f8:65 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:f5:a7:e8:10:7b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 20:58:17.405211 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.405202 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
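The cAdvisor Machine record above is easier to read once the raw byte counts are converted: MemoryCapacity 32812171264 bytes is roughly 30.6 GiB, the nvme0n1 disk is 120 GiB, and NumCores:8 against NumPhysicalCores:4 means two hardware threads per core. A short sketch of that conversion, with the values copied from the record:

# Minimal sketch: convert raw byte counts from the cAdvisor Machine record
# above into GiB (values copied from that record).
GIB = 1024 ** 3

values = {
    "MemoryCapacity": 32812171264,                      # ~30.6 GiB
    "nvme0n1 disk (DiskMap Size)": 128849018880,        # 120.0 GiB
    "/var filesystem (/dev/nvme0n1p4)": 128243970048,   # ~119.4 GiB
}
for label, nbytes in values.items():
    print(f"{label}: {nbytes / GIB:.1f} GiB")

print("hardware threads per core:", 8 // 4)  # NumCores:8, NumPhysicalCores:4
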
Apr 16 20:58:17.405323 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.405276 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 20:58:17.406361 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.406338 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 20:58:17.406503 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.406364 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-17.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 20:58:17.406545 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.406513 2579 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 20:58:17.406545 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.406522 2579 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 20:58:17.406545 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.406535 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:58:17.407320 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.407309 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:58:17.408016 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.407978 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:58:17.409005 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.408979 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:58:17.409111 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.409101 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" 
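The Container Manager nodeConfig above reserves 500m CPU, 1Gi memory, and 1Gi ephemeral storage for the system (SystemReserved), leaves KubeReserved null, and keeps hard-eviction thresholds that include memory.available at 100Mi. Assuming the standard kubelet accounting (allocatable = capacity minus kube-reserved, system-reserved, and the hard-eviction threshold), this node should advertise roughly 29.5 GiB of allocatable memory; a minimal sketch of that arithmetic:

# Minimal sketch: estimate node-allocatable memory from the nodeConfig and
# Machine records above, assuming the standard kubelet formula
#   allocatable = capacity - kube-reserved - system-reserved - hard eviction.
GI = 1024 ** 3
MI = 1024 ** 2

capacity        = 32812171264   # MemoryCapacity from the Machine record
system_reserved = 1 * GI        # SystemReserved memory: "1Gi"
kube_reserved   = 0             # KubeReserved is null in the nodeConfig
hard_eviction   = 100 * MI      # memory.available hard-eviction threshold: "100Mi"

allocatable = capacity - system_reserved - kube_reserved - hard_eviction
print(f"allocatable memory ~= {allocatable / GI:.2f} Gi")  # ~= 29.46 Gi
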
Apr 16 20:58:17.411586 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.411574 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 16 20:58:17.412139 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.412129 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 20:58:17.412170 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.412150 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 20:58:17.412170 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.412159 2579 kubelet.go:397] "Adding apiserver pod source" Apr 16 20:58:17.412170 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.412168 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 20:58:17.413372 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.413359 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:58:17.413422 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.413386 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:58:17.416212 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.416184 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 20:58:17.418047 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.418033 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 20:58:17.420494 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.420473 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 20:58:17.420494 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.420497 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 20:58:17.420629 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.420504 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 20:58:17.420629 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.420520 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 20:58:17.420629 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.420530 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 20:58:17.420629 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.420539 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 20:58:17.420629 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.420547 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 20:58:17.420629 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.420556 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 20:58:17.420629 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.420567 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 20:58:17.420629 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.420576 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 20:58:17.420958 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.420767 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 20:58:17.420958 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.420782 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 20:58:17.422374 ip-10-0-139-17 
kubenswrapper[2579]: I0416 20:58:17.422358 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 20:58:17.422442 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.422389 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 20:58:17.426681 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.426666 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 20:58:17.426769 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.426735 2579 server.go:1295] "Started kubelet" Apr 16 20:58:17.426865 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.426815 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 20:58:17.426947 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.426883 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 20:58:17.426981 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.426958 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 20:58:17.427264 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.427245 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 20:58:17.427317 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.427281 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-17.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 20:58:17.427427 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.427407 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-17.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 20:58:17.427497 ip-10-0-139-17 systemd[1]: Started Kubernetes Kubelet. 
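The E-level records in this stretch ("system:anonymous" cannot list services, nodes, or csidrivers) appear while the kubelet is still bootstrapping its client certificate, so the API server only sees an anonymous user; they normally clear once the CSR issued a few records below (csr-kmb67) is in use. Each klog record carries a severity letter, timestamp, PID, and source location (for example E0416 20:58:17.427245 2579 reflector.go:200]), which makes the error records easy to pull out of the excerpt. A sketch, again assuming the excerpt is saved as the hypothetical kubelet-journal.txt:

# Minimal sketch: pull only the error-level (E...) klog records out of this
# excerpt. Assumes the journal text was saved as "kubelet-journal.txt"
# (hypothetical path). A record prefix looks like:
#   E0416 20:58:17.427245 2579 reflector.go:200] ...
import re

KLOG_ERROR = re.compile(
    r"\bE\d{4} (\d{2}:\d{2}:\d{2}\.\d+) +\d+ (\S+)\] "            # time, pid, source
    r"(.*?)(?=\s[A-Z][a-z]{2} \d+ \d{2}:\d{2}:\d{2}\.\d+ ip-|$)"  # message, up to the next journal header
)

def error_records(path="kubelet-journal.txt"):
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            for ts, source, msg in KLOG_ERROR.findall(line):
                yield ts, source, msg

if __name__ == "__main__":
    for ts, source, msg in error_records():
        print(f"{ts} {source} {msg[:120]}")
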
Apr 16 20:58:17.429731 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.429716 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 16 20:58:17.430601 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.430585 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 20:58:17.435543 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.435526 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 20:58:17.435635 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.435586 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 20:58:17.436027 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.435983 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kmb67" Apr 16 20:58:17.436248 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.436232 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 20:58:17.436304 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.436254 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 20:58:17.436351 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.436338 2579 factory.go:55] Registering systemd factory Apr 16 20:58:17.436408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.436388 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 20:58:17.436408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.436399 2579 factory.go:223] Registration of the systemd container factory successfully Apr 16 20:58:17.436503 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.436449 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 16 20:58:17.436503 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.436457 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 16 20:58:17.436624 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.436556 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-17.ec2.internal\" not found" Apr 16 20:58:17.436624 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.436614 2579 factory.go:153] Registering CRI-O factory Apr 16 20:58:17.436700 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.436628 2579 factory.go:223] Registration of the crio container factory successfully Apr 16 20:58:17.436736 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.436707 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 20:58:17.436736 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.436724 2579 factory.go:103] Registering Raw factory Apr 16 20:58:17.436846 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.436744 2579 manager.go:1196] Started watching for new ooms in manager Apr 16 20:58:17.437276 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.437255 2579 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 20:58:17.437276 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.437270 2579 manager.go:319] Starting recovery of all containers Apr 16 20:58:17.437532 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.436615 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-17.ec2.internal.18a6f1eba4ad7735 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-17.ec2.internal,UID:ip-10-0-139-17.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-17.ec2.internal,},FirstTimestamp:2026-04-16 20:58:17.426679605 +0000 UTC m=+0.444990442,LastTimestamp:2026-04-16 20:58:17.426679605 +0000 UTC m=+0.444990442,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-17.ec2.internal,}" Apr 16 20:58:17.444263 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.444118 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kmb67" Apr 16 20:58:17.446279 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.446263 2579 manager.go:324] Recovery completed Apr 16 20:58:17.446524 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.446496 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-139-17.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 20:58:17.446524 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.446503 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 20:58:17.448694 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.448673 2579 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 20:58:17.451825 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.451813 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:58:17.454410 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.454396 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:58:17.454488 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.454423 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:58:17.454488 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.454433 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:58:17.454938 ip-10-0-139-17 kubenswrapper[2579]: 
I0416 20:58:17.454926 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 20:58:17.454938 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.454936 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 20:58:17.455038 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.454950 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:58:17.457581 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.457570 2579 policy_none.go:49] "None policy: Start" Apr 16 20:58:17.457626 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.457585 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 20:58:17.457626 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.457595 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 16 20:58:17.506332 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.494273 2579 manager.go:341] "Starting Device Plugin manager" Apr 16 20:58:17.506332 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.494303 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 20:58:17.506332 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.494315 2579 server.go:85] "Starting device plugin registration server" Apr 16 20:58:17.506332 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.494538 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 20:58:17.506332 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.494549 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 20:58:17.506332 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.494616 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 20:58:17.506332 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.494689 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 20:58:17.506332 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.494699 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 20:58:17.506332 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.495204 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 20:58:17.506332 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.495242 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-17.ec2.internal\" not found" Apr 16 20:58:17.560134 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.560103 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 20:58:17.561617 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.561601 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 20:58:17.561694 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.561643 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 20:58:17.561694 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.561667 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 20:58:17.561694 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.561676 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 20:58:17.561907 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.561715 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 20:58:17.564386 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.564371 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:58:17.594871 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.594823 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:58:17.595646 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.595632 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:58:17.595696 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.595659 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:58:17.595696 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.595670 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:58:17.595696 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.595693 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.604175 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.604159 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.604231 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.604181 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-17.ec2.internal\": node \"ip-10-0-139-17.ec2.internal\" not found" Apr 16 20:58:17.633113 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.633094 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-17.ec2.internal\" not found" Apr 16 20:58:17.662787 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.662769 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-17.ec2.internal"] Apr 16 20:58:17.662845 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.662827 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:58:17.663491 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.663479 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:58:17.663535 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.663506 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:58:17.663535 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.663520 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:58:17.664804 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.664792 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:58:17.664952 ip-10-0-139-17 kubenswrapper[2579]: I0416 
20:58:17.664939 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.665004 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.664969 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:58:17.665445 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.665424 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:58:17.665445 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.665440 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:58:17.665564 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.665455 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:58:17.665564 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.665466 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:58:17.665564 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.665470 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:58:17.665564 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.665476 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:58:17.666433 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.666421 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.666498 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.666442 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:58:17.667088 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.667070 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:58:17.667173 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.667101 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:58:17.667173 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.667115 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:58:17.691521 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.691507 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-17.ec2.internal\" not found" node="ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.695686 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.695673 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-17.ec2.internal\" not found" node="ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.733392 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.733371 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-17.ec2.internal\" not found" Apr 16 20:58:17.737675 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.737657 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/28baa9a6a9685b4a1a26f3b0f5164f85-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal\" (UID: \"28baa9a6a9685b4a1a26f3b0f5164f85\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.737735 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.737683 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28baa9a6a9685b4a1a26f3b0f5164f85-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal\" (UID: \"28baa9a6a9685b4a1a26f3b0f5164f85\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.737735 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.737702 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a5202bb430809412b670f0c968baf10f-config\") pod \"kube-apiserver-proxy-ip-10-0-139-17.ec2.internal\" (UID: \"a5202bb430809412b670f0c968baf10f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.834430 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.834385 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-17.ec2.internal\" not found" Apr 16 20:58:17.838717 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.838699 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/28baa9a6a9685b4a1a26f3b0f5164f85-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal\" (UID: \"28baa9a6a9685b4a1a26f3b0f5164f85\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.838791 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.838729 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28baa9a6a9685b4a1a26f3b0f5164f85-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal\" (UID: \"28baa9a6a9685b4a1a26f3b0f5164f85\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.838791 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.838754 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a5202bb430809412b670f0c968baf10f-config\") pod \"kube-apiserver-proxy-ip-10-0-139-17.ec2.internal\" (UID: \"a5202bb430809412b670f0c968baf10f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.838791 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.838786 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/28baa9a6a9685b4a1a26f3b0f5164f85-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal\" (UID: \"28baa9a6a9685b4a1a26f3b0f5164f85\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.838910 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.838796 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28baa9a6a9685b4a1a26f3b0f5164f85-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal\" (UID: \"28baa9a6a9685b4a1a26f3b0f5164f85\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.838910 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.838788 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a5202bb430809412b670f0c968baf10f-config\") pod \"kube-apiserver-proxy-ip-10-0-139-17.ec2.internal\" (UID: \"a5202bb430809412b670f0c968baf10f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.934921 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:17.934863 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-17.ec2.internal\" not found" Apr 16 20:58:17.993373 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.993335 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal" Apr 16 20:58:17.997736 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:17.997720 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-17.ec2.internal" Apr 16 20:58:18.035225 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:18.035196 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-17.ec2.internal\" not found" Apr 16 20:58:18.135712 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:18.135689 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-17.ec2.internal\" not found" Apr 16 20:58:18.236177 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:18.236122 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-17.ec2.internal\" not found" Apr 16 20:58:18.336547 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:18.336515 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-17.ec2.internal\" not found" Apr 16 20:58:18.342870 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.342850 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 20:58:18.343032 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.343015 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 20:58:18.436074 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.436045 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 20:58:18.436578 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:18.436562 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-17.ec2.internal\" not found" Apr 16 20:58:18.446568 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.446534 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:53:17 +0000 UTC" deadline="2027-09-15 00:14:03.944475542 +0000 UTC" Apr 16 20:58:18.446568 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.446565 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12387h15m45.497913359s" Apr 16 20:58:18.452405 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.452385 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:58:18.469397 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:18.469372 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28baa9a6a9685b4a1a26f3b0f5164f85.slice/crio-2b916aa0b6ac7791ab9ccd0d518e888db78d1e3f3b474ed93efd3d1542be04e4 WatchSource:0}: Error finding container 2b916aa0b6ac7791ab9ccd0d518e888db78d1e3f3b474ed93efd3d1542be04e4: Status 404 returned error can't find the container with id 2b916aa0b6ac7791ab9ccd0d518e888db78d1e3f3b474ed93efd3d1542be04e4 Apr 16 20:58:18.469819 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:18.469800 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5202bb430809412b670f0c968baf10f.slice/crio-e9184b65cc5483b3e26241bdd56a89695422d345f844f0b858beba13e6decb57 WatchSource:0}: Error finding container e9184b65cc5483b3e26241bdd56a89695422d345f844f0b858beba13e6decb57: Status 404 returned error can't find the container with id e9184b65cc5483b3e26241bdd56a89695422d345f844f0b858beba13e6decb57 Apr 16 20:58:18.474464 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.474442 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:58:18.477857 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.477840 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-m66np" Apr 16 20:58:18.487784 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.487741 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-m66np" Apr 16 20:58:18.537456 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:18.537427 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-17.ec2.internal\" not found" Apr 16 20:58:18.544001 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.543977 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:58:18.564374 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.564331 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-17.ec2.internal" event={"ID":"a5202bb430809412b670f0c968baf10f","Type":"ContainerStarted","Data":"e9184b65cc5483b3e26241bdd56a89695422d345f844f0b858beba13e6decb57"} Apr 16 20:58:18.565239 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.565214 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal" event={"ID":"28baa9a6a9685b4a1a26f3b0f5164f85","Type":"ContainerStarted","Data":"2b916aa0b6ac7791ab9ccd0d518e888db78d1e3f3b474ed93efd3d1542be04e4"} Apr 16 20:58:18.636417 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.636398 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal" Apr 16 20:58:18.652861 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.652843 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:58:18.654396 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.654384 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-17.ec2.internal" Apr 16 20:58:18.664479 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.664457 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:58:18.965249 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.965170 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:58:18.969527 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:18.969366 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:58:19.413289 ip-10-0-139-17 kubenswrapper[2579]: I0416 
20:58:19.413210 2579 apiserver.go:52] "Watching apiserver" Apr 16 20:58:19.423039 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.423017 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 20:58:19.423376 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.423353 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-ktmcs","kube-system/kube-apiserver-proxy-ip-10-0-139-17.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99","openshift-image-registry/node-ca-2g5dj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal","openshift-multus/multus-additional-cni-plugins-wrfqb","openshift-multus/network-metrics-daemon-rgzx9","openshift-network-diagnostics/network-check-target-z494p","openshift-cluster-node-tuning-operator/tuned-z2jsq","openshift-dns/node-resolver-qdgnp","openshift-multus/multus-9zljx","openshift-network-operator/iptables-alerter-kcrcc","openshift-ovn-kubernetes/ovnkube-node-s6cxs"] Apr 16 20:58:19.425360 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.425332 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:19.425456 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:19.425433 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:19.426485 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.426465 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.427708 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.427678 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2g5dj" Apr 16 20:58:19.428687 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.428659 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.430081 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.429600 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 20:58:19.430081 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.429650 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 20:58:19.430081 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.429650 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 20:58:19.430081 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.429731 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-947t8\"" Apr 16 20:58:19.430081 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.429787 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-ktmcs" Apr 16 20:58:19.430790 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.430773 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 20:58:19.430790 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.430782 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 20:58:19.430918 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.430842 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-d5c29\"" Apr 16 20:58:19.431039 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.431024 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 20:58:19.431125 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.431109 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 20:58:19.432137 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.432114 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 20:58:19.432137 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.432129 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-446tz\"" Apr 16 20:58:19.432283 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.432146 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:19.432283 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:19.432202 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:19.432366 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.432358 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 20:58:19.432481 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.432465 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 20:58:19.432528 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.432482 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 20:58:19.432578 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.432542 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 20:58:19.432663 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.432649 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cd57d\"" Apr 16 20:58:19.432663 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.432361 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 20:58:19.433355 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.433339 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.434526 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.434508 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qdgnp" Apr 16 20:58:19.434675 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.434659 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.435931 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.435914 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kcrcc" Apr 16 20:58:19.437791 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.437769 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.438043 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.438023 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:58:19.439366 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.439345 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 20:58:19.439459 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.439433 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7lsts\"" Apr 16 20:58:19.439560 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.439543 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-xrk5p\"" Apr 16 20:58:19.439611 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.439570 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 20:58:19.439684 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.439576 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 20:58:19.439732 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.439685 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-48sqq\"" Apr 16 20:58:19.440297 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.439912 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 20:58:19.440297 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.439953 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 20:58:19.440297 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.440059 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:58:19.440297 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.440232 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-s6qxs\"" Apr 16 20:58:19.440297 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.440250 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 20:58:19.440297 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.440286 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 20:58:19.440721 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.440705 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qbzjk\"" Apr 16 20:58:19.441757 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.441740 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 20:58:19.443142 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.443125 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 20:58:19.443407 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.443387 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 20:58:19.443407 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.443401 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 20:58:19.443535 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.443446 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 20:58:19.446163 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446140 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6729dda1-3d66-4a8a-a99c-69840130dbf7-ovnkube-config\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.446262 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446170 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zllb2\" (UniqueName: \"kubernetes.io/projected/6729dda1-3d66-4a8a-a99c-69840130dbf7-kube-api-access-zllb2\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.446262 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446204 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-lib-modules\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.446262 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446226 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-tuned\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.446262 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446249 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-os-release\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.446467 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446269 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-multus-conf-dir\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.446467 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446293 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-etc-kubernetes\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " 
pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.446467 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446314 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-var-lib-openvswitch\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.446467 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446354 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86523bd2-ac21-4e5d-8cfd-81eb7aa5f405-tmp-dir\") pod \"node-resolver-qdgnp\" (UID: \"86523bd2-ac21-4e5d-8cfd-81eb7aa5f405\") " pod="openshift-dns/node-resolver-qdgnp" Apr 16 20:58:19.446467 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446386 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-sys-fs\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.446467 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446402 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-sysconfig\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.446467 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446439 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-multus-socket-dir-parent\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.446467 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446463 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lslp2\" (UniqueName: \"kubernetes.io/projected/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-kube-api-access-lslp2\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.446842 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446480 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-run-systemd\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.446842 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446519 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-cni-bin\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.446842 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446548 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c26f028d-d200-4cbf-b0a5-93e063163a20-agent-certs\") pod \"konnectivity-agent-ktmcs\" (UID: \"c26f028d-d200-4cbf-b0a5-93e063163a20\") " pod="kube-system/konnectivity-agent-ktmcs" Apr 16 20:58:19.446842 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446573 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-kubernetes\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.446842 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446597 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-var-lib-kubelet\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.446842 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446622 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.446842 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446649 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/157d2848-acb1-4db0-bb7b-a50ad66888da-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.446842 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446679 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e692ce1d-b283-4350-8079-a219afe643ce-host-slash\") pod \"iptables-alerter-kcrcc\" (UID: \"e692ce1d-b283-4350-8079-a219afe643ce\") " pod="openshift-network-operator/iptables-alerter-kcrcc" Apr 16 20:58:19.446842 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446722 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.446842 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446748 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-socket-dir\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.446842 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446789 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/c26f028d-d200-4cbf-b0a5-93e063163a20-konnectivity-ca\") pod \"konnectivity-agent-ktmcs\" (UID: \"c26f028d-d200-4cbf-b0a5-93e063163a20\") " pod="kube-system/konnectivity-agent-ktmcs" Apr 16 20:58:19.446842 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446818 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsl2d\" (UniqueName: \"kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d\") pod \"network-check-target-z494p\" (UID: \"b7d5e3db-123c-4df1-8f35-413bcf697e6f\") " pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:19.446842 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446842 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-multus-cni-dir\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446864 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-systemd-units\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446882 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb89d7f0-f4fb-459a-8f19-5b03adcf660a-host\") pod \"node-ca-2g5dj\" (UID: \"eb89d7f0-f4fb-459a-8f19-5b03adcf660a\") " pod="openshift-image-registry/node-ca-2g5dj" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446904 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-cnibin\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446940 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-cni-binary-copy\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.446968 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-var-lib-cni-multus\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447006 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0885955-99b6-48e3-86a9-df98a742d1e2-tmp\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 
20:58:19.447028 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms5kn\" (UniqueName: \"kubernetes.io/projected/b0885955-99b6-48e3-86a9-df98a742d1e2-kube-api-access-ms5kn\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447042 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs\") pod \"network-metrics-daemon-rgzx9\" (UID: \"24ed89ef-c93c-40fd-a75f-2f3fd7582359\") " pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/157d2848-acb1-4db0-bb7b-a50ad66888da-os-release\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447069 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-run-ovn\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447101 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-run-ovn-kubernetes\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447130 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-etc-selinux\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447156 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-run-openvswitch\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447178 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6729dda1-3d66-4a8a-a99c-69840130dbf7-ovnkube-script-lib\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447204 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/157d2848-acb1-4db0-bb7b-a50ad66888da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447225 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-run\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.447299 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447247 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-run-netns\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447272 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-var-lib-cni-bin\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447294 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-run-netns\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447326 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-cni-netd\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447349 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157d2848-acb1-4db0-bb7b-a50ad66888da-system-cni-dir\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447372 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/157d2848-acb1-4db0-bb7b-a50ad66888da-cnibin\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447395 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-registration-dir\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: 
\"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447412 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-run-multus-certs\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447426 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-systemd\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447444 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-var-lib-kubelet\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447466 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-system-cni-dir\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447482 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-etc-openvswitch\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447513 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/86523bd2-ac21-4e5d-8cfd-81eb7aa5f405-hosts-file\") pod \"node-resolver-qdgnp\" (UID: \"86523bd2-ac21-4e5d-8cfd-81eb7aa5f405\") " pod="openshift-dns/node-resolver-qdgnp" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447546 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e692ce1d-b283-4350-8079-a219afe643ce-iptables-alerter-script\") pod \"iptables-alerter-kcrcc\" (UID: \"e692ce1d-b283-4350-8079-a219afe643ce\") " pod="openshift-network-operator/iptables-alerter-kcrcc" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447573 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbqxc\" (UniqueName: \"kubernetes.io/projected/e692ce1d-b283-4350-8079-a219afe643ce-kube-api-access-jbqxc\") pod \"iptables-alerter-kcrcc\" (UID: \"e692ce1d-b283-4350-8079-a219afe643ce\") " pod="openshift-network-operator/iptables-alerter-kcrcc" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 
20:58:19.447596 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6729dda1-3d66-4a8a-a99c-69840130dbf7-ovn-node-metrics-cert\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.447778 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447627 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mm2l\" (UniqueName: \"kubernetes.io/projected/24ed89ef-c93c-40fd-a75f-2f3fd7582359-kube-api-access-7mm2l\") pod \"network-metrics-daemon-rgzx9\" (UID: \"24ed89ef-c93c-40fd-a75f-2f3fd7582359\") " pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447643 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jtql\" (UniqueName: \"kubernetes.io/projected/157d2848-acb1-4db0-bb7b-a50ad66888da-kube-api-access-7jtql\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447672 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc5pg\" (UniqueName: \"kubernetes.io/projected/82cec601-1fa2-462e-b238-35fcb7bcc8fe-kube-api-access-rc5pg\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447696 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq86w\" (UniqueName: \"kubernetes.io/projected/eb89d7f0-f4fb-459a-8f19-5b03adcf660a-kube-api-access-gq86w\") pod \"node-ca-2g5dj\" (UID: \"eb89d7f0-f4fb-459a-8f19-5b03adcf660a\") " pod="openshift-image-registry/node-ca-2g5dj" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447716 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-run-k8s-cni-cncf-io\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447756 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6729dda1-3d66-4a8a-a99c-69840130dbf7-env-overrides\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447775 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-device-dir\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447790 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb89d7f0-f4fb-459a-8f19-5b03adcf660a-serviceca\") pod \"node-ca-2g5dj\" (UID: \"eb89d7f0-f4fb-459a-8f19-5b03adcf660a\") " pod="openshift-image-registry/node-ca-2g5dj" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447828 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-sysctl-d\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447842 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-sysctl-conf\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447855 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-sys\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447888 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/157d2848-acb1-4db0-bb7b-a50ad66888da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447907 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-host\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447920 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-hostroot\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447935 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-multus-daemon-config\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447958 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-log-socket\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.448342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.447981 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-slash\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.448915 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.448030 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-kubelet\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.448915 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.448058 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/157d2848-acb1-4db0-bb7b-a50ad66888da-cni-binary-copy\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.448915 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.448081 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdqzf\" (UniqueName: \"kubernetes.io/projected/86523bd2-ac21-4e5d-8cfd-81eb7aa5f405-kube-api-access-vdqzf\") pod \"node-resolver-qdgnp\" (UID: \"86523bd2-ac21-4e5d-8cfd-81eb7aa5f405\") " pod="openshift-dns/node-resolver-qdgnp" Apr 16 20:58:19.448915 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.448111 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-modprobe-d\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.448915 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.448138 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-node-log\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.488846 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.488799 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:53:18 +0000 UTC" deadline="2027-09-10 20:35:47.285151033 +0000 UTC" Apr 16 20:58:19.488846 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.488836 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12287h37m27.796318329s" Apr 16 20:58:19.537446 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.537418 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 20:58:19.549192 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549165 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0885955-99b6-48e3-86a9-df98a742d1e2-tmp\") pod \"tuned-z2jsq\" 
(UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.549419 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549199 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ms5kn\" (UniqueName: \"kubernetes.io/projected/b0885955-99b6-48e3-86a9-df98a742d1e2-kube-api-access-ms5kn\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.549419 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549217 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs\") pod \"network-metrics-daemon-rgzx9\" (UID: \"24ed89ef-c93c-40fd-a75f-2f3fd7582359\") " pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:19.549419 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549234 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/157d2848-acb1-4db0-bb7b-a50ad66888da-os-release\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.549419 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549250 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-run-ovn\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.549419 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549266 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-run-ovn-kubernetes\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.549419 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549281 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-etc-selinux\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.549419 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549298 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-run-openvswitch\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.549419 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6729dda1-3d66-4a8a-a99c-69840130dbf7-ovnkube-script-lib\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.549419 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549333 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/157d2848-acb1-4db0-bb7b-a50ad66888da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.549419 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549356 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-run\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.549419 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:19.549360 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:19.549419 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549379 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-run-netns\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.549419 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549427 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-var-lib-cni-bin\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:19.549443 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs podName:24ed89ef-c93c-40fd-a75f-2f3fd7582359 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:20.049421634 +0000 UTC m=+3.067732478 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs") pod "network-metrics-daemon-rgzx9" (UID: "24ed89ef-c93c-40fd-a75f-2f3fd7582359") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549479 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-var-lib-cni-bin\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549482 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-run-netns\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549495 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-etc-selinux\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549526 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549548 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-run-openvswitch\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549568 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-cni-netd\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549611 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-run-netns\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549615 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/157d2848-acb1-4db0-bb7b-a50ad66888da-os-release\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549662 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-run-ovn-kubernetes\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549665 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-run-ovn\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549710 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-run\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549729 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-run-netns\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549532 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-cni-netd\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549828 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157d2848-acb1-4db0-bb7b-a50ad66888da-system-cni-dir\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549835 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/157d2848-acb1-4db0-bb7b-a50ad66888da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.549984 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549855 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/157d2848-acb1-4db0-bb7b-a50ad66888da-cnibin\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549876 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157d2848-acb1-4db0-bb7b-a50ad66888da-system-cni-dir\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549880 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-registration-dir\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549906 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-run-multus-certs\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549911 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/157d2848-acb1-4db0-bb7b-a50ad66888da-cnibin\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549938 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-systemd\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549945 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-run-multus-certs\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.549963 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-var-lib-kubelet\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550006 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-system-cni-dir\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550028 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6729dda1-3d66-4a8a-a99c-69840130dbf7-ovnkube-script-lib\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550050 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-systemd\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.550864 ip-10-0-139-17 
kubenswrapper[2579]: I0416 20:58:19.550007 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-registration-dir\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550030 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-etc-openvswitch\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550088 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-var-lib-kubelet\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550094 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-system-cni-dir\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550122 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/86523bd2-ac21-4e5d-8cfd-81eb7aa5f405-hosts-file\") pod \"node-resolver-qdgnp\" (UID: \"86523bd2-ac21-4e5d-8cfd-81eb7aa5f405\") " pod="openshift-dns/node-resolver-qdgnp" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550129 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-etc-openvswitch\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.550864 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550148 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e692ce1d-b283-4350-8079-a219afe643ce-iptables-alerter-script\") pod \"iptables-alerter-kcrcc\" (UID: \"e692ce1d-b283-4350-8079-a219afe643ce\") " pod="openshift-network-operator/iptables-alerter-kcrcc" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550171 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/86523bd2-ac21-4e5d-8cfd-81eb7aa5f405-hosts-file\") pod \"node-resolver-qdgnp\" (UID: \"86523bd2-ac21-4e5d-8cfd-81eb7aa5f405\") " pod="openshift-dns/node-resolver-qdgnp" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550173 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbqxc\" (UniqueName: \"kubernetes.io/projected/e692ce1d-b283-4350-8079-a219afe643ce-kube-api-access-jbqxc\") pod \"iptables-alerter-kcrcc\" (UID: \"e692ce1d-b283-4350-8079-a219afe643ce\") " 
pod="openshift-network-operator/iptables-alerter-kcrcc" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550238 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6729dda1-3d66-4a8a-a99c-69840130dbf7-ovn-node-metrics-cert\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550266 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mm2l\" (UniqueName: \"kubernetes.io/projected/24ed89ef-c93c-40fd-a75f-2f3fd7582359-kube-api-access-7mm2l\") pod \"network-metrics-daemon-rgzx9\" (UID: \"24ed89ef-c93c-40fd-a75f-2f3fd7582359\") " pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550291 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jtql\" (UniqueName: \"kubernetes.io/projected/157d2848-acb1-4db0-bb7b-a50ad66888da-kube-api-access-7jtql\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550318 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rc5pg\" (UniqueName: \"kubernetes.io/projected/82cec601-1fa2-462e-b238-35fcb7bcc8fe-kube-api-access-rc5pg\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550343 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gq86w\" (UniqueName: \"kubernetes.io/projected/eb89d7f0-f4fb-459a-8f19-5b03adcf660a-kube-api-access-gq86w\") pod \"node-ca-2g5dj\" (UID: \"eb89d7f0-f4fb-459a-8f19-5b03adcf660a\") " pod="openshift-image-registry/node-ca-2g5dj" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550366 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-run-k8s-cni-cncf-io\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550399 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6729dda1-3d66-4a8a-a99c-69840130dbf7-env-overrides\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550423 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-device-dir\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550452 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb89d7f0-f4fb-459a-8f19-5b03adcf660a-serviceca\") pod \"node-ca-2g5dj\" (UID: \"eb89d7f0-f4fb-459a-8f19-5b03adcf660a\") " pod="openshift-image-registry/node-ca-2g5dj" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550476 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-sysctl-d\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550499 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-sysctl-conf\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550524 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-sys\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550560 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/157d2848-acb1-4db0-bb7b-a50ad66888da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550584 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-host\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.551699 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550598 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e692ce1d-b283-4350-8079-a219afe643ce-iptables-alerter-script\") pod \"iptables-alerter-kcrcc\" (UID: \"e692ce1d-b283-4350-8079-a219afe643ce\") " pod="openshift-network-operator/iptables-alerter-kcrcc" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550609 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-hostroot\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550644 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-multus-daemon-config\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550668 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-log-socket\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550697 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-slash\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550722 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-kubelet\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550750 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/157d2848-acb1-4db0-bb7b-a50ad66888da-cni-binary-copy\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550777 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdqzf\" (UniqueName: \"kubernetes.io/projected/86523bd2-ac21-4e5d-8cfd-81eb7aa5f405-kube-api-access-vdqzf\") pod \"node-resolver-qdgnp\" (UID: \"86523bd2-ac21-4e5d-8cfd-81eb7aa5f405\") " pod="openshift-dns/node-resolver-qdgnp" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550858 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb89d7f0-f4fb-459a-8f19-5b03adcf660a-serviceca\") pod \"node-ca-2g5dj\" (UID: \"eb89d7f0-f4fb-459a-8f19-5b03adcf660a\") " pod="openshift-image-registry/node-ca-2g5dj" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550880 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-run-k8s-cni-cncf-io\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.550918 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-device-dir\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551099 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-hostroot\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551102 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-sysctl-d\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551163 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-sys\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551178 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-slash\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551194 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-host\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551208 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-kubelet\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551245 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-log-socket\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.552570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551256 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6729dda1-3d66-4a8a-a99c-69840130dbf7-env-overrides\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551352 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-sysctl-conf\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551362 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-modprobe-d\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551391 2579 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-node-log\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551416 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6729dda1-3d66-4a8a-a99c-69840130dbf7-ovnkube-config\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551441 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zllb2\" (UniqueName: \"kubernetes.io/projected/6729dda1-3d66-4a8a-a99c-69840130dbf7-kube-api-access-zllb2\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551465 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-lib-modules\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551489 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-tuned\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551512 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-os-release\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551526 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-modprobe-d\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551543 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-multus-conf-dir\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551570 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-node-log\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551591 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-etc-kubernetes\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551615 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-var-lib-openvswitch\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551640 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86523bd2-ac21-4e5d-8cfd-81eb7aa5f405-tmp-dir\") pod \"node-resolver-qdgnp\" (UID: \"86523bd2-ac21-4e5d-8cfd-81eb7aa5f405\") " pod="openshift-dns/node-resolver-qdgnp" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551662 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-sys-fs\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551665 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/157d2848-acb1-4db0-bb7b-a50ad66888da-cni-binary-copy\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551685 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-sysconfig\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.553408 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551709 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-multus-socket-dir-parent\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551733 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lslp2\" (UniqueName: \"kubernetes.io/projected/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-kube-api-access-lslp2\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551746 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-multus-daemon-config\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551780 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-run-systemd\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551808 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-cni-bin\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551810 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-var-lib-openvswitch\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551831 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c26f028d-d200-4cbf-b0a5-93e063163a20-agent-certs\") pod \"konnectivity-agent-ktmcs\" (UID: \"c26f028d-d200-4cbf-b0a5-93e063163a20\") " pod="kube-system/konnectivity-agent-ktmcs" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551855 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-kubernetes\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551870 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-os-release\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551881 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-var-lib-kubelet\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551899 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-multus-socket-dir-parent\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551910 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 
20:58:19.551915 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-multus-conf-dir\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551941 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/157d2848-acb1-4db0-bb7b-a50ad66888da-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551974 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e692ce1d-b283-4350-8079-a219afe643ce-host-slash\") pod \"iptables-alerter-kcrcc\" (UID: \"e692ce1d-b283-4350-8079-a219afe643ce\") " pod="openshift-network-operator/iptables-alerter-kcrcc" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552025 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552053 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-socket-dir\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.554253 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552079 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c26f028d-d200-4cbf-b0a5-93e063163a20-konnectivity-ca\") pod \"konnectivity-agent-ktmcs\" (UID: \"c26f028d-d200-4cbf-b0a5-93e063163a20\") " pod="kube-system/konnectivity-agent-ktmcs" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552090 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-kubernetes\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552106 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsl2d\" (UniqueName: \"kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d\") pod \"network-check-target-z494p\" (UID: \"b7d5e3db-123c-4df1-8f35-413bcf697e6f\") " pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552132 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-multus-cni-dir\") pod \"multus-9zljx\" (UID: 
\"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-var-lib-kubelet\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552155 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-systemd-units\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552174 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86523bd2-ac21-4e5d-8cfd-81eb7aa5f405-tmp-dir\") pod \"node-resolver-qdgnp\" (UID: \"86523bd2-ac21-4e5d-8cfd-81eb7aa5f405\") " pod="openshift-dns/node-resolver-qdgnp" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552181 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb89d7f0-f4fb-459a-8f19-5b03adcf660a-host\") pod \"node-ca-2g5dj\" (UID: \"eb89d7f0-f4fb-459a-8f19-5b03adcf660a\") " pod="openshift-image-registry/node-ca-2g5dj" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552183 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552210 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-cnibin\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552237 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-cni-binary-copy\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552262 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-sysconfig\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552267 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-var-lib-cni-multus\") pod \"multus-9zljx\" (UID: 
\"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552313 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-run-systemd\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552340 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-host-var-lib-cni-multus\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.551564 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/157d2848-acb1-4db0-bb7b-a50ad66888da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552392 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-etc-kubernetes\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552444 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e692ce1d-b283-4350-8079-a219afe643ce-host-slash\") pod \"iptables-alerter-kcrcc\" (UID: \"e692ce1d-b283-4350-8079-a219afe643ce\") " pod="openshift-network-operator/iptables-alerter-kcrcc" Apr 16 20:58:19.554746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552488 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.555413 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552025 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6729dda1-3d66-4a8a-a99c-69840130dbf7-ovnkube-config\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.555413 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552538 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-systemd-units\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.555413 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552580 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-cnibin\") pod \"multus-9zljx\" (UID: 
\"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.555413 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552604 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-socket-dir\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.555413 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552624 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/157d2848-acb1-4db0-bb7b-a50ad66888da-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.555413 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552647 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb89d7f0-f4fb-459a-8f19-5b03adcf660a-host\") pod \"node-ca-2g5dj\" (UID: \"eb89d7f0-f4fb-459a-8f19-5b03adcf660a\") " pod="openshift-image-registry/node-ca-2g5dj" Apr 16 20:58:19.555413 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552723 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0885955-99b6-48e3-86a9-df98a742d1e2-lib-modules\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.555413 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552625 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/82cec601-1fa2-462e-b238-35fcb7bcc8fe-sys-fs\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.555413 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552833 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-multus-cni-dir\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.555413 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.552858 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6729dda1-3d66-4a8a-a99c-69840130dbf7-host-cni-bin\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.555413 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.553082 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-cni-binary-copy\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.555413 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.553847 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c26f028d-d200-4cbf-b0a5-93e063163a20-konnectivity-ca\") pod \"konnectivity-agent-ktmcs\" (UID: 
\"c26f028d-d200-4cbf-b0a5-93e063163a20\") " pod="kube-system/konnectivity-agent-ktmcs" Apr 16 20:58:19.555413 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.554197 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6729dda1-3d66-4a8a-a99c-69840130dbf7-ovn-node-metrics-cert\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.556083 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.556061 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b0885955-99b6-48e3-86a9-df98a742d1e2-etc-tuned\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.556202 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.556093 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0885955-99b6-48e3-86a9-df98a742d1e2-tmp\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.556272 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.556254 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c26f028d-d200-4cbf-b0a5-93e063163a20-agent-certs\") pod \"konnectivity-agent-ktmcs\" (UID: \"c26f028d-d200-4cbf-b0a5-93e063163a20\") " pod="kube-system/konnectivity-agent-ktmcs" Apr 16 20:58:19.560099 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.559938 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbqxc\" (UniqueName: \"kubernetes.io/projected/e692ce1d-b283-4350-8079-a219afe643ce-kube-api-access-jbqxc\") pod \"iptables-alerter-kcrcc\" (UID: \"e692ce1d-b283-4350-8079-a219afe643ce\") " pod="openshift-network-operator/iptables-alerter-kcrcc" Apr 16 20:58:19.562929 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.561780 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jtql\" (UniqueName: \"kubernetes.io/projected/157d2848-acb1-4db0-bb7b-a50ad66888da-kube-api-access-7jtql\") pod \"multus-additional-cni-plugins-wrfqb\" (UID: \"157d2848-acb1-4db0-bb7b-a50ad66888da\") " pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.563100 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.563078 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc5pg\" (UniqueName: \"kubernetes.io/projected/82cec601-1fa2-462e-b238-35fcb7bcc8fe-kube-api-access-rc5pg\") pod \"aws-ebs-csi-driver-node-tmn99\" (UID: \"82cec601-1fa2-462e-b238-35fcb7bcc8fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.563241 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:19.563223 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:58:19.563303 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:19.563248 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:58:19.563303 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:19.563262 2579 projected.go:194] Error preparing 
data for projected volume kube-api-access-bsl2d for pod openshift-network-diagnostics/network-check-target-z494p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:19.563404 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:19.563336 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d podName:b7d5e3db-123c-4df1-8f35-413bcf697e6f nodeName:}" failed. No retries permitted until 2026-04-16 20:58:20.063316918 +0000 UTC m=+3.081627743 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-bsl2d" (UniqueName: "kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d") pod "network-check-target-z494p" (UID: "b7d5e3db-123c-4df1-8f35-413bcf697e6f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:19.563826 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.563800 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zllb2\" (UniqueName: \"kubernetes.io/projected/6729dda1-3d66-4a8a-a99c-69840130dbf7-kube-api-access-zllb2\") pod \"ovnkube-node-s6cxs\" (UID: \"6729dda1-3d66-4a8a-a99c-69840130dbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.563979 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.563955 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mm2l\" (UniqueName: \"kubernetes.io/projected/24ed89ef-c93c-40fd-a75f-2f3fd7582359-kube-api-access-7mm2l\") pod \"network-metrics-daemon-rgzx9\" (UID: \"24ed89ef-c93c-40fd-a75f-2f3fd7582359\") " pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:19.565422 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.565399 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lslp2\" (UniqueName: \"kubernetes.io/projected/13be4f93-01b1-4633-a3f1-b9d89ab4fed8-kube-api-access-lslp2\") pod \"multus-9zljx\" (UID: \"13be4f93-01b1-4633-a3f1-b9d89ab4fed8\") " pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.565507 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.565441 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms5kn\" (UniqueName: \"kubernetes.io/projected/b0885955-99b6-48e3-86a9-df98a742d1e2-kube-api-access-ms5kn\") pod \"tuned-z2jsq\" (UID: \"b0885955-99b6-48e3-86a9-df98a742d1e2\") " pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.566201 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.566172 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdqzf\" (UniqueName: \"kubernetes.io/projected/86523bd2-ac21-4e5d-8cfd-81eb7aa5f405-kube-api-access-vdqzf\") pod \"node-resolver-qdgnp\" (UID: \"86523bd2-ac21-4e5d-8cfd-81eb7aa5f405\") " pod="openshift-dns/node-resolver-qdgnp" Apr 16 20:58:19.566812 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.566788 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq86w\" (UniqueName: \"kubernetes.io/projected/eb89d7f0-f4fb-459a-8f19-5b03adcf660a-kube-api-access-gq86w\") pod \"node-ca-2g5dj\" (UID: \"eb89d7f0-f4fb-459a-8f19-5b03adcf660a\") " pod="openshift-image-registry/node-ca-2g5dj" Apr 16 20:58:19.737565 ip-10-0-139-17 
kubenswrapper[2579]: I0416 20:58:19.737474 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" Apr 16 20:58:19.746329 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.746302 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2g5dj" Apr 16 20:58:19.754063 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.754042 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wrfqb" Apr 16 20:58:19.759610 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.759587 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ktmcs" Apr 16 20:58:19.766997 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.766974 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" Apr 16 20:58:19.772547 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.772520 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qdgnp" Apr 16 20:58:19.780103 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.780086 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9zljx" Apr 16 20:58:19.786641 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.786616 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kcrcc" Apr 16 20:58:19.792230 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.792209 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:19.934798 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:19.934767 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:58:20.055476 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:20.055450 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs\") pod \"network-metrics-daemon-rgzx9\" (UID: \"24ed89ef-c93c-40fd-a75f-2f3fd7582359\") " pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:20.055593 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:20.055576 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:20.055646 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:20.055636 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs podName:24ed89ef-c93c-40fd-a75f-2f3fd7582359 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:21.055622465 +0000 UTC m=+4.073933295 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs") pod "network-metrics-daemon-rgzx9" (UID: "24ed89ef-c93c-40fd-a75f-2f3fd7582359") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:20.069362 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:20.069325 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6729dda1_3d66_4a8a_a99c_69840130dbf7.slice/crio-c00219927c154cf66d26cc65563d2a3586a9a7d81f8b9886cdf1a4b3ff90335c WatchSource:0}: Error finding container c00219927c154cf66d26cc65563d2a3586a9a7d81f8b9886cdf1a4b3ff90335c: Status 404 returned error can't find the container with id c00219927c154cf66d26cc65563d2a3586a9a7d81f8b9886cdf1a4b3ff90335c Apr 16 20:58:20.071037 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:20.070892 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode692ce1d_b283_4350_8079_a219afe643ce.slice/crio-cb6e3909f64eca9fa456c7110ca434967f4eedd449655ba1421a803d595ba7f7 WatchSource:0}: Error finding container cb6e3909f64eca9fa456c7110ca434967f4eedd449655ba1421a803d595ba7f7: Status 404 returned error can't find the container with id cb6e3909f64eca9fa456c7110ca434967f4eedd449655ba1421a803d595ba7f7 Apr 16 20:58:20.072531 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:20.071729 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc26f028d_d200_4cbf_b0a5_93e063163a20.slice/crio-a2caa63d7ab68efacc12a8d50375ea5793ed5c169ddff1fe59398a4bb6176407 WatchSource:0}: Error finding container a2caa63d7ab68efacc12a8d50375ea5793ed5c169ddff1fe59398a4bb6176407: Status 404 returned error can't find the container with id a2caa63d7ab68efacc12a8d50375ea5793ed5c169ddff1fe59398a4bb6176407 Apr 16 20:58:20.076228 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:20.076204 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86523bd2_ac21_4e5d_8cfd_81eb7aa5f405.slice/crio-57c044bd6dec72eecfb8f5c617b53a6a1532a5eded00a6720aed0c6b7cdd4467 WatchSource:0}: Error finding container 57c044bd6dec72eecfb8f5c617b53a6a1532a5eded00a6720aed0c6b7cdd4467: Status 404 returned error can't find the container with id 57c044bd6dec72eecfb8f5c617b53a6a1532a5eded00a6720aed0c6b7cdd4467 Apr 16 20:58:20.078214 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:20.078196 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82cec601_1fa2_462e_b238_35fcb7bcc8fe.slice/crio-9a863ddf7c147316ed80b6b4a9a55d11bf479a2e688657d0393726f0cdb1a173 WatchSource:0}: Error finding container 9a863ddf7c147316ed80b6b4a9a55d11bf479a2e688657d0393726f0cdb1a173: Status 404 returned error can't find the container with id 9a863ddf7c147316ed80b6b4a9a55d11bf479a2e688657d0393726f0cdb1a173 Apr 16 20:58:20.078567 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:20.078548 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb89d7f0_f4fb_459a_8f19_5b03adcf660a.slice/crio-c8cfdae696a233bea464052931fb8bbc55abb2fd043e3f8349445c6f957e94c3 WatchSource:0}: Error finding container c8cfdae696a233bea464052931fb8bbc55abb2fd043e3f8349445c6f957e94c3: Status 404 returned error can't find the 
container with id c8cfdae696a233bea464052931fb8bbc55abb2fd043e3f8349445c6f957e94c3 Apr 16 20:58:20.080176 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:20.080157 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0885955_99b6_48e3_86a9_df98a742d1e2.slice/crio-9b251241a8d065d88772455b5c09ea3d90f3102af9167e232c08155b619e8bf2 WatchSource:0}: Error finding container 9b251241a8d065d88772455b5c09ea3d90f3102af9167e232c08155b619e8bf2: Status 404 returned error can't find the container with id 9b251241a8d065d88772455b5c09ea3d90f3102af9167e232c08155b619e8bf2 Apr 16 20:58:20.081720 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:20.081692 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod157d2848_acb1_4db0_bb7b_a50ad66888da.slice/crio-0253f933e2661596e5914edd6b708c713cc192373ed6d02bcb45612eea1ac48c WatchSource:0}: Error finding container 0253f933e2661596e5914edd6b708c713cc192373ed6d02bcb45612eea1ac48c: Status 404 returned error can't find the container with id 0253f933e2661596e5914edd6b708c713cc192373ed6d02bcb45612eea1ac48c Apr 16 20:58:20.082349 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:20.082319 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13be4f93_01b1_4633_a3f1_b9d89ab4fed8.slice/crio-e2e64bc23a34d53e43b4ea65c201f4dafb577bbae7c62981447c7a8199d69698 WatchSource:0}: Error finding container e2e64bc23a34d53e43b4ea65c201f4dafb577bbae7c62981447c7a8199d69698: Status 404 returned error can't find the container with id e2e64bc23a34d53e43b4ea65c201f4dafb577bbae7c62981447c7a8199d69698 Apr 16 20:58:20.156118 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:20.155961 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsl2d\" (UniqueName: \"kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d\") pod \"network-check-target-z494p\" (UID: \"b7d5e3db-123c-4df1-8f35-413bcf697e6f\") " pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:20.156209 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:20.156097 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:58:20.156209 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:20.156203 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:58:20.156290 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:20.156215 2579 projected.go:194] Error preparing data for projected volume kube-api-access-bsl2d for pod openshift-network-diagnostics/network-check-target-z494p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:20.156290 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:20.156258 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d podName:b7d5e3db-123c-4df1-8f35-413bcf697e6f nodeName:}" failed. No retries permitted until 2026-04-16 20:58:21.156246121 +0000 UTC m=+4.174556947 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-bsl2d" (UniqueName: "kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d") pod "network-check-target-z494p" (UID: "b7d5e3db-123c-4df1-8f35-413bcf697e6f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:20.489195 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:20.489085 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:53:18 +0000 UTC" deadline="2027-10-30 06:19:01.519543562 +0000 UTC" Apr 16 20:58:20.489195 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:20.489124 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13473h20m41.030423324s" Apr 16 20:58:20.574801 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:20.574763 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2g5dj" event={"ID":"eb89d7f0-f4fb-459a-8f19-5b03adcf660a","Type":"ContainerStarted","Data":"c8cfdae696a233bea464052931fb8bbc55abb2fd043e3f8349445c6f957e94c3"} Apr 16 20:58:20.580936 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:20.579424 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9zljx" event={"ID":"13be4f93-01b1-4633-a3f1-b9d89ab4fed8","Type":"ContainerStarted","Data":"e2e64bc23a34d53e43b4ea65c201f4dafb577bbae7c62981447c7a8199d69698"} Apr 16 20:58:20.586028 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:20.582525 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qdgnp" event={"ID":"86523bd2-ac21-4e5d-8cfd-81eb7aa5f405","Type":"ContainerStarted","Data":"57c044bd6dec72eecfb8f5c617b53a6a1532a5eded00a6720aed0c6b7cdd4467"} Apr 16 20:58:20.587341 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:20.587316 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ktmcs" event={"ID":"c26f028d-d200-4cbf-b0a5-93e063163a20","Type":"ContainerStarted","Data":"a2caa63d7ab68efacc12a8d50375ea5793ed5c169ddff1fe59398a4bb6176407"} Apr 16 20:58:20.588794 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:20.588770 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" event={"ID":"6729dda1-3d66-4a8a-a99c-69840130dbf7","Type":"ContainerStarted","Data":"c00219927c154cf66d26cc65563d2a3586a9a7d81f8b9886cdf1a4b3ff90335c"} Apr 16 20:58:20.591880 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:20.591204 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-17.ec2.internal" event={"ID":"a5202bb430809412b670f0c968baf10f","Type":"ContainerStarted","Data":"3bd624212b1cb9f0dd9ccc487a64b26be40f28360b117885519f88e0caf2350b"} Apr 16 20:58:20.598387 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:20.598254 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wrfqb" event={"ID":"157d2848-acb1-4db0-bb7b-a50ad66888da","Type":"ContainerStarted","Data":"0253f933e2661596e5914edd6b708c713cc192373ed6d02bcb45612eea1ac48c"} Apr 16 20:58:20.604513 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:20.604486 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" 
event={"ID":"b0885955-99b6-48e3-86a9-df98a742d1e2","Type":"ContainerStarted","Data":"9b251241a8d065d88772455b5c09ea3d90f3102af9167e232c08155b619e8bf2"} Apr 16 20:58:20.609462 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:20.609390 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-17.ec2.internal" podStartSLOduration=2.609376122 podStartE2EDuration="2.609376122s" podCreationTimestamp="2026-04-16 20:58:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:58:20.609302313 +0000 UTC m=+3.627613161" watchObservedRunningTime="2026-04-16 20:58:20.609376122 +0000 UTC m=+3.627686972" Apr 16 20:58:20.611107 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:20.611081 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" event={"ID":"82cec601-1fa2-462e-b238-35fcb7bcc8fe","Type":"ContainerStarted","Data":"9a863ddf7c147316ed80b6b4a9a55d11bf479a2e688657d0393726f0cdb1a173"} Apr 16 20:58:20.612938 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:20.612914 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kcrcc" event={"ID":"e692ce1d-b283-4350-8079-a219afe643ce","Type":"ContainerStarted","Data":"cb6e3909f64eca9fa456c7110ca434967f4eedd449655ba1421a803d595ba7f7"} Apr 16 20:58:21.068234 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:21.067517 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs\") pod \"network-metrics-daemon-rgzx9\" (UID: \"24ed89ef-c93c-40fd-a75f-2f3fd7582359\") " pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:21.068234 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:21.067645 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:21.068234 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:21.067707 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs podName:24ed89ef-c93c-40fd-a75f-2f3fd7582359 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:23.067688178 +0000 UTC m=+6.085999012 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs") pod "network-metrics-daemon-rgzx9" (UID: "24ed89ef-c93c-40fd-a75f-2f3fd7582359") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:21.168858 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:21.168232 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsl2d\" (UniqueName: \"kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d\") pod \"network-check-target-z494p\" (UID: \"b7d5e3db-123c-4df1-8f35-413bcf697e6f\") " pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:21.168858 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:21.168433 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:58:21.168858 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:21.168454 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:58:21.168858 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:21.168466 2579 projected.go:194] Error preparing data for projected volume kube-api-access-bsl2d for pod openshift-network-diagnostics/network-check-target-z494p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:21.168858 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:21.168523 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d podName:b7d5e3db-123c-4df1-8f35-413bcf697e6f nodeName:}" failed. No retries permitted until 2026-04-16 20:58:23.168505451 +0000 UTC m=+6.186816291 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-bsl2d" (UniqueName: "kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d") pod "network-check-target-z494p" (UID: "b7d5e3db-123c-4df1-8f35-413bcf697e6f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:21.561953 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:21.561920 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:21.562399 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:21.562066 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:21.562399 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:21.562157 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:21.562399 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:21.562265 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:21.638981 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:21.638942 2579 generic.go:358] "Generic (PLEG): container finished" podID="28baa9a6a9685b4a1a26f3b0f5164f85" containerID="48e75efacb197c68b851b5fc4815e3aa8bef29d8049a886bb4571eced8569114" exitCode=0 Apr 16 20:58:21.640102 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:21.639868 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal" event={"ID":"28baa9a6a9685b4a1a26f3b0f5164f85","Type":"ContainerDied","Data":"48e75efacb197c68b851b5fc4815e3aa8bef29d8049a886bb4571eced8569114"} Apr 16 20:58:22.647306 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:22.646663 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal" event={"ID":"28baa9a6a9685b4a1a26f3b0f5164f85","Type":"ContainerStarted","Data":"ae577a86d31f0c6debcdae0575274d91b0d3c26b5c564f6f1eb7604483834ffe"} Apr 16 20:58:22.661848 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:22.661791 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-17.ec2.internal" podStartSLOduration=4.661771261 podStartE2EDuration="4.661771261s" podCreationTimestamp="2026-04-16 20:58:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:58:22.661546709 +0000 UTC m=+5.679857563" watchObservedRunningTime="2026-04-16 20:58:22.661771261 +0000 UTC m=+5.680082109" Apr 16 20:58:23.090151 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:23.090066 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs\") pod \"network-metrics-daemon-rgzx9\" (UID: \"24ed89ef-c93c-40fd-a75f-2f3fd7582359\") " pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:23.090326 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:23.090246 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:23.090326 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:23.090311 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs podName:24ed89ef-c93c-40fd-a75f-2f3fd7582359 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:27.09029092 +0000 UTC m=+10.108601749 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs") pod "network-metrics-daemon-rgzx9" (UID: "24ed89ef-c93c-40fd-a75f-2f3fd7582359") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:23.191067 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:23.191027 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsl2d\" (UniqueName: \"kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d\") pod \"network-check-target-z494p\" (UID: \"b7d5e3db-123c-4df1-8f35-413bcf697e6f\") " pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:23.191240 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:23.191227 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:58:23.191321 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:23.191247 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:58:23.191321 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:23.191262 2579 projected.go:194] Error preparing data for projected volume kube-api-access-bsl2d for pod openshift-network-diagnostics/network-check-target-z494p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:23.191321 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:23.191318 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d podName:b7d5e3db-123c-4df1-8f35-413bcf697e6f nodeName:}" failed. No retries permitted until 2026-04-16 20:58:27.191299871 +0000 UTC m=+10.209610695 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-bsl2d" (UniqueName: "kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d") pod "network-check-target-z494p" (UID: "b7d5e3db-123c-4df1-8f35-413bcf697e6f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:23.563211 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:23.563128 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:23.563211 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:23.563179 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:23.563426 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:23.563354 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:23.563426 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:23.563402 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:25.562628 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:25.562388 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:25.562628 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:25.562512 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:25.563080 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:25.563039 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:25.563172 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:25.563151 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:27.121542 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:27.121500 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs\") pod \"network-metrics-daemon-rgzx9\" (UID: \"24ed89ef-c93c-40fd-a75f-2f3fd7582359\") " pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:27.121963 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:27.121645 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:27.121963 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:27.121709 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs podName:24ed89ef-c93c-40fd-a75f-2f3fd7582359 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:35.121689559 +0000 UTC m=+18.140000399 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs") pod "network-metrics-daemon-rgzx9" (UID: "24ed89ef-c93c-40fd-a75f-2f3fd7582359") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:27.222550 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:27.222514 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsl2d\" (UniqueName: \"kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d\") pod \"network-check-target-z494p\" (UID: \"b7d5e3db-123c-4df1-8f35-413bcf697e6f\") " pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:27.222759 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:27.222737 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:58:27.222833 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:27.222763 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:58:27.222833 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:27.222773 2579 projected.go:194] Error preparing data for projected volume kube-api-access-bsl2d for pod openshift-network-diagnostics/network-check-target-z494p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:27.222833 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:27.222818 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d podName:b7d5e3db-123c-4df1-8f35-413bcf697e6f nodeName:}" failed. No retries permitted until 2026-04-16 20:58:35.222804827 +0000 UTC m=+18.241115666 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-bsl2d" (UniqueName: "kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d") pod "network-check-target-z494p" (UID: "b7d5e3db-123c-4df1-8f35-413bcf697e6f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:27.563558 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:27.563293 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:27.563558 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:27.563388 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:27.563558 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:27.563403 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:27.563558 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:27.563516 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:29.562912 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:29.562866 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:29.563367 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:29.562866 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:29.563367 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:29.563030 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:29.563367 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:29.563118 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:31.562331 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:31.562300 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:31.562331 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:31.562327 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:31.562781 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:31.562399 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:31.562781 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:31.562538 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:33.562282 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:33.562245 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:33.562750 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:33.562362 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:33.562750 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:33.562423 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:33.562750 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:33.562550 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:35.186660 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:35.186628 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs\") pod \"network-metrics-daemon-rgzx9\" (UID: \"24ed89ef-c93c-40fd-a75f-2f3fd7582359\") " pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:35.187132 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:35.186755 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:35.187132 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:35.186817 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs podName:24ed89ef-c93c-40fd-a75f-2f3fd7582359 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:51.186799453 +0000 UTC m=+34.205110279 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs") pod "network-metrics-daemon-rgzx9" (UID: "24ed89ef-c93c-40fd-a75f-2f3fd7582359") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:35.286982 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:35.286945 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsl2d\" (UniqueName: \"kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d\") pod \"network-check-target-z494p\" (UID: \"b7d5e3db-123c-4df1-8f35-413bcf697e6f\") " pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:35.287150 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:35.287131 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:58:35.287205 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:35.287161 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:58:35.287205 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:35.287176 2579 projected.go:194] Error preparing data for projected volume kube-api-access-bsl2d for pod openshift-network-diagnostics/network-check-target-z494p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:35.287282 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:35.287238 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d podName:b7d5e3db-123c-4df1-8f35-413bcf697e6f nodeName:}" failed. No retries permitted until 2026-04-16 20:58:51.287220176 +0000 UTC m=+34.305531005 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-bsl2d" (UniqueName: "kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d") pod "network-check-target-z494p" (UID: "b7d5e3db-123c-4df1-8f35-413bcf697e6f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:35.562557 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:35.562523 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:35.562745 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:35.562523 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:35.562745 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:35.562669 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:35.562745 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:35.562729 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:37.563384 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:37.563356 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:37.563644 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:37.563424 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:37.563644 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:37.563511 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:37.563718 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:37.563640 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:37.682068 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:37.681715 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" event={"ID":"b0885955-99b6-48e3-86a9-df98a742d1e2","Type":"ContainerStarted","Data":"f4b6b82f6733c9bde6d5d2d0b6dd2d906be16963cb45a4cf33c191674e6d1179"} Apr 16 20:58:38.684924 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.684522 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9zljx" event={"ID":"13be4f93-01b1-4633-a3f1-b9d89ab4fed8","Type":"ContainerStarted","Data":"f57e11e0a2cd0b240306562f943073b6bdfdb304a6e928be05b3e0b9e6849275"} Apr 16 20:58:38.685784 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.685764 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qdgnp" event={"ID":"86523bd2-ac21-4e5d-8cfd-81eb7aa5f405","Type":"ContainerStarted","Data":"428d5391bebbd2809e200117131c7047970707b17a20a3ba36e55d94e21264f2"} Apr 16 20:58:38.686897 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.686878 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ktmcs" event={"ID":"c26f028d-d200-4cbf-b0a5-93e063163a20","Type":"ContainerStarted","Data":"07308c815b092c7ff9eda491776d78425f71c325dd44167063e48195c4479746"} Apr 16 20:58:38.688957 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.688939 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 20:58:38.689200 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.689180 2579 generic.go:358] "Generic (PLEG): container finished" podID="6729dda1-3d66-4a8a-a99c-69840130dbf7" containerID="e56c7c1b8f61e1a3658bddcdefb6f1dda55ebbe98edbf8f26f8b8cb7fb8c3c6d" exitCode=1 Apr 16 20:58:38.689258 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.689203 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" event={"ID":"6729dda1-3d66-4a8a-a99c-69840130dbf7","Type":"ContainerStarted","Data":"dd10c2acc7f7496e1d163c70ba5250811f0a90f4b4449d2f521fb894f3ba7fac"} Apr 16 20:58:38.689258 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.689225 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" event={"ID":"6729dda1-3d66-4a8a-a99c-69840130dbf7","Type":"ContainerStarted","Data":"2032f34a78326da6bec80b4940a3fa1bcce92f7961b7970c55d9feeaf9be494a"} Apr 16 20:58:38.689258 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.689234 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" event={"ID":"6729dda1-3d66-4a8a-a99c-69840130dbf7","Type":"ContainerStarted","Data":"9ba46509bf4d04c69239e3b107a45e5d6baa69fc98e38e572d07ade24b218c08"} Apr 16 20:58:38.689258 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.689242 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" event={"ID":"6729dda1-3d66-4a8a-a99c-69840130dbf7","Type":"ContainerStarted","Data":"f6ba88fd763d2cf444edbc4bd5bdc528df37c1f467376e5774a48c4dbae1a9e3"} Apr 16 20:58:38.689258 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.689251 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" 
event={"ID":"6729dda1-3d66-4a8a-a99c-69840130dbf7","Type":"ContainerDied","Data":"e56c7c1b8f61e1a3658bddcdefb6f1dda55ebbe98edbf8f26f8b8cb7fb8c3c6d"} Apr 16 20:58:38.689441 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.689261 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" event={"ID":"6729dda1-3d66-4a8a-a99c-69840130dbf7","Type":"ContainerStarted","Data":"4db6e864abae1d67feccd36f35c6234627f7c83ec06f71caf546fc314225efa5"} Apr 16 20:58:38.690499 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.690478 2579 generic.go:358] "Generic (PLEG): container finished" podID="157d2848-acb1-4db0-bb7b-a50ad66888da" containerID="c3d3fedfa90c7f0f5f6e2ce8575023035e111ac1be4f8b6fdda2cfe34f79e73b" exitCode=0 Apr 16 20:58:38.690576 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.690539 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wrfqb" event={"ID":"157d2848-acb1-4db0-bb7b-a50ad66888da","Type":"ContainerDied","Data":"c3d3fedfa90c7f0f5f6e2ce8575023035e111ac1be4f8b6fdda2cfe34f79e73b"} Apr 16 20:58:38.691756 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.691677 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" event={"ID":"82cec601-1fa2-462e-b238-35fcb7bcc8fe","Type":"ContainerStarted","Data":"81ac133a87128f7f29434eb75f6c7f79e96a1aac44982c9ae3d2fe9b341bf74c"} Apr 16 20:58:38.692826 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.692805 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2g5dj" event={"ID":"eb89d7f0-f4fb-459a-8f19-5b03adcf660a","Type":"ContainerStarted","Data":"51a15f02fc99694078055138fb13c5d422b1fd37d09da77ec8f63d0594776100"} Apr 16 20:58:38.722973 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.722936 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-z2jsq" podStartSLOduration=4.3056588 podStartE2EDuration="21.722926507s" podCreationTimestamp="2026-04-16 20:58:17 +0000 UTC" firstStartedPulling="2026-04-16 20:58:20.085241501 +0000 UTC m=+3.103552330" lastFinishedPulling="2026-04-16 20:58:37.502509212 +0000 UTC m=+20.520820037" observedRunningTime="2026-04-16 20:58:37.702497476 +0000 UTC m=+20.720808323" watchObservedRunningTime="2026-04-16 20:58:38.722926507 +0000 UTC m=+21.741237353" Apr 16 20:58:38.723096 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.723074 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9zljx" podStartSLOduration=4.282243424 podStartE2EDuration="21.723069652s" podCreationTimestamp="2026-04-16 20:58:17 +0000 UTC" firstStartedPulling="2026-04-16 20:58:20.085607406 +0000 UTC m=+3.103918231" lastFinishedPulling="2026-04-16 20:58:37.526433623 +0000 UTC m=+20.544744459" observedRunningTime="2026-04-16 20:58:38.722720253 +0000 UTC m=+21.741031111" watchObservedRunningTime="2026-04-16 20:58:38.723069652 +0000 UTC m=+21.741380498" Apr 16 20:58:38.769688 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.769653 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qdgnp" podStartSLOduration=4.318253635 podStartE2EDuration="21.76964351s" podCreationTimestamp="2026-04-16 20:58:17 +0000 UTC" firstStartedPulling="2026-04-16 20:58:20.078427059 +0000 UTC m=+3.096737888" lastFinishedPulling="2026-04-16 20:58:37.529816938 +0000 UTC m=+20.548127763" 
observedRunningTime="2026-04-16 20:58:38.769518154 +0000 UTC m=+21.787829001" watchObservedRunningTime="2026-04-16 20:58:38.76964351 +0000 UTC m=+21.787954357" Apr 16 20:58:38.806836 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.806799 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2g5dj" podStartSLOduration=4.385422969 podStartE2EDuration="21.806786205s" podCreationTimestamp="2026-04-16 20:58:17 +0000 UTC" firstStartedPulling="2026-04-16 20:58:20.081449709 +0000 UTC m=+3.099760537" lastFinishedPulling="2026-04-16 20:58:37.502812946 +0000 UTC m=+20.521123773" observedRunningTime="2026-04-16 20:58:38.805249324 +0000 UTC m=+21.823560170" watchObservedRunningTime="2026-04-16 20:58:38.806786205 +0000 UTC m=+21.825097065" Apr 16 20:58:38.858945 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:38.857469 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ktmcs" podStartSLOduration=4.428943395 podStartE2EDuration="21.857453318s" podCreationTimestamp="2026-04-16 20:58:17 +0000 UTC" firstStartedPulling="2026-04-16 20:58:20.074313337 +0000 UTC m=+3.092624163" lastFinishedPulling="2026-04-16 20:58:37.502823262 +0000 UTC m=+20.521134086" observedRunningTime="2026-04-16 20:58:38.855873782 +0000 UTC m=+21.874184620" watchObservedRunningTime="2026-04-16 20:58:38.857453318 +0000 UTC m=+21.875764170" Apr 16 20:58:39.187735 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:39.187704 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 20:58:39.507876 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:39.507706 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:58:39.187727613Z","UUID":"0c0af7d9-ee0f-475e-abfb-b1457af96ffa","Handler":null,"Name":"","Endpoint":""} Apr 16 20:58:39.511553 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:39.511529 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 20:58:39.511687 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:39.511562 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 20:58:39.562834 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:39.562806 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:39.563020 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:39.562807 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:39.563020 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:39.562905 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:39.563104 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:39.563048 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:39.696609 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:39.696574 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" event={"ID":"82cec601-1fa2-462e-b238-35fcb7bcc8fe","Type":"ContainerStarted","Data":"7082e9a02306ff61c580320f600c87a781c86d17bee537b5e62b94b5605016f4"} Apr 16 20:58:39.699401 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:39.699373 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kcrcc" event={"ID":"e692ce1d-b283-4350-8079-a219afe643ce","Type":"ContainerStarted","Data":"241146f907945544d677f64df09fc454f9525b21147fa9467232d205fd7ace41"} Apr 16 20:58:40.704922 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:40.704895 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 20:58:40.705406 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:40.705379 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" event={"ID":"6729dda1-3d66-4a8a-a99c-69840130dbf7","Type":"ContainerStarted","Data":"97bf08ea718f6e34491436ad3832da086a0c443eb76e1d944c2fa6b9208045ea"} Apr 16 20:58:40.707390 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:40.707367 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" event={"ID":"82cec601-1fa2-462e-b238-35fcb7bcc8fe","Type":"ContainerStarted","Data":"62e6b3ab02d2ac7707a5ea053e3081aa8941ee9c74c346ba4faa83a1b004f756"} Apr 16 20:58:40.724716 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:40.724660 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-kcrcc" podStartSLOduration=6.27359224 podStartE2EDuration="23.724643067s" podCreationTimestamp="2026-04-16 20:58:17 +0000 UTC" firstStartedPulling="2026-04-16 20:58:20.073283243 +0000 UTC m=+3.091594067" lastFinishedPulling="2026-04-16 20:58:37.524334064 +0000 UTC m=+20.542644894" observedRunningTime="2026-04-16 20:58:39.71496033 +0000 UTC m=+22.733271178" watchObservedRunningTime="2026-04-16 20:58:40.724643067 +0000 UTC m=+23.742953916" Apr 16 20:58:40.725044 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:40.725019 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tmn99" podStartSLOduration=3.297645261 podStartE2EDuration="23.725014103s" podCreationTimestamp="2026-04-16 20:58:17 +0000 UTC" firstStartedPulling="2026-04-16 20:58:20.081817911 +0000 UTC m=+3.100128738" lastFinishedPulling="2026-04-16 20:58:40.509186747 +0000 UTC m=+23.527497580" observedRunningTime="2026-04-16 20:58:40.724315715 +0000 UTC m=+23.742626562" watchObservedRunningTime="2026-04-16 20:58:40.725014103 +0000 UTC m=+23.743324941" Apr 16 20:58:41.562369 
ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:41.562151 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:41.562544 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:41.562210 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:41.562544 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:41.562468 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:41.562544 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:41.562523 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:42.707160 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:42.707130 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ktmcs" Apr 16 20:58:42.707780 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:42.707759 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ktmcs" Apr 16 20:58:42.856056 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:42.856035 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ktmcs" Apr 16 20:58:42.856622 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:42.856603 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ktmcs" Apr 16 20:58:43.562434 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:43.562266 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:43.562570 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:43.562266 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:43.562570 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:43.562516 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:43.562646 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:43.562568 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:43.715377 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:43.715351 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 20:58:43.716115 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:43.715643 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" event={"ID":"6729dda1-3d66-4a8a-a99c-69840130dbf7","Type":"ContainerStarted","Data":"ce135a58dfd1ae43d4ad70dc312882983aa3d04d57aa390601b8161864f35ad2"} Apr 16 20:58:43.716115 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:43.715925 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:43.716115 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:43.715947 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:43.716115 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:43.715959 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:43.716277 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:43.716135 2579 scope.go:117] "RemoveContainer" containerID="e56c7c1b8f61e1a3658bddcdefb6f1dda55ebbe98edbf8f26f8b8cb7fb8c3c6d" Apr 16 20:58:43.717493 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:43.717463 2579 generic.go:358] "Generic (PLEG): container finished" podID="157d2848-acb1-4db0-bb7b-a50ad66888da" containerID="ebd3fa4c2f670b0e1e0479d86336ba308061661945301223bf594bb304d6df11" exitCode=0 Apr 16 20:58:43.717593 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:43.717528 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wrfqb" event={"ID":"157d2848-acb1-4db0-bb7b-a50ad66888da","Type":"ContainerDied","Data":"ebd3fa4c2f670b0e1e0479d86336ba308061661945301223bf594bb304d6df11"} Apr 16 20:58:43.732754 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:43.732732 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:43.732957 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:43.732941 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:58:44.722888 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:44.722863 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 20:58:44.723270 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:44.723198 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" event={"ID":"6729dda1-3d66-4a8a-a99c-69840130dbf7","Type":"ContainerStarted","Data":"ce1d4f134f01b9eba80c7457582e9cdd887dc3e53bf1f690a2a419a81c883a68"} Apr 16 20:58:44.725186 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:44.725162 2579 generic.go:358] "Generic (PLEG): container finished" podID="157d2848-acb1-4db0-bb7b-a50ad66888da" containerID="86f87684913744adc228be2ce35fb502c0f2d27898828a9b53f4ae949613810d" exitCode=0 Apr 16 20:58:44.725302 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:44.725247 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-wrfqb" event={"ID":"157d2848-acb1-4db0-bb7b-a50ad66888da","Type":"ContainerDied","Data":"86f87684913744adc228be2ce35fb502c0f2d27898828a9b53f4ae949613810d"} Apr 16 20:58:44.752787 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:44.752741 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" podStartSLOduration=10.275964105 podStartE2EDuration="27.75272792s" podCreationTimestamp="2026-04-16 20:58:17 +0000 UTC" firstStartedPulling="2026-04-16 20:58:20.071353545 +0000 UTC m=+3.089664370" lastFinishedPulling="2026-04-16 20:58:37.548117351 +0000 UTC m=+20.566428185" observedRunningTime="2026-04-16 20:58:44.750935991 +0000 UTC m=+27.769246849" watchObservedRunningTime="2026-04-16 20:58:44.75272792 +0000 UTC m=+27.771038773" Apr 16 20:58:44.799771 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:44.799735 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z494p"] Apr 16 20:58:44.799909 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:44.799844 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:44.799956 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:44.799924 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:44.802541 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:44.802520 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rgzx9"] Apr 16 20:58:44.802641 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:44.802630 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:44.802750 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:44.802733 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:45.728712 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:45.728674 2579 generic.go:358] "Generic (PLEG): container finished" podID="157d2848-acb1-4db0-bb7b-a50ad66888da" containerID="726dee310d8d193a1cc5e273be19b4fdb6d5a64944e1281650310edb73e8e4d9" exitCode=0 Apr 16 20:58:45.729061 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:45.728760 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wrfqb" event={"ID":"157d2848-acb1-4db0-bb7b-a50ad66888da","Type":"ContainerDied","Data":"726dee310d8d193a1cc5e273be19b4fdb6d5a64944e1281650310edb73e8e4d9"} Apr 16 20:58:46.562871 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:46.562834 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:46.563065 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:46.562841 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:46.563065 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:46.562957 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:46.563190 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:46.563079 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:48.562367 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:48.562200 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:48.562829 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:48.562255 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:48.562829 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:48.562475 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 20:58:48.562829 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:48.562533 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z494p" podUID="b7d5e3db-123c-4df1-8f35-413bcf697e6f" Apr 16 20:58:50.254247 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.254218 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-17.ec2.internal" event="NodeReady" Apr 16 20:58:50.254816 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.254375 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 20:58:50.299171 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.299142 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lvrfd"] Apr 16 20:58:50.301519 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.301485 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:50.304252 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.304122 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 20:58:50.304252 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.304144 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kfmp4\"" Apr 16 20:58:50.304252 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.304150 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 20:58:50.308783 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.308758 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hv5xj"] Apr 16 20:58:50.310709 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.310692 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 20:58:50.313556 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.313536 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lvrfd"] Apr 16 20:58:50.314524 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.314101 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 20:58:50.314524 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.314119 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 20:58:50.314524 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.314202 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mhzpf\"" Apr 16 20:58:50.314524 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.314432 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 20:58:50.321593 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.321571 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hv5xj"] Apr 16 20:58:50.395734 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.395694 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9275ffeb-7ec4-4699-976c-7ef980230018-tmp-dir\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:50.395895 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.395758 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:50.395895 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.395799 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4k87\" (UniqueName: \"kubernetes.io/projected/9275ffeb-7ec4-4699-976c-7ef980230018-kube-api-access-h4k87\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:50.395895 ip-10-0-139-17 
kubenswrapper[2579]: I0416 20:58:50.395852 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9275ffeb-7ec4-4699-976c-7ef980230018-config-volume\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:50.496562 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.496525 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:50.496746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.496590 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4k87\" (UniqueName: \"kubernetes.io/projected/9275ffeb-7ec4-4699-976c-7ef980230018-kube-api-access-h4k87\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:50.496746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.496618 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert\") pod \"ingress-canary-hv5xj\" (UID: \"ebb8fc4a-f559-45f4-be07-93cd44e25e3a\") " pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 20:58:50.496746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.496667 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9275ffeb-7ec4-4699-976c-7ef980230018-config-volume\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:50.496746 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.496697 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9275ffeb-7ec4-4699-976c-7ef980230018-tmp-dir\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:50.496746 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:50.496708 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:58:50.497027 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:50.496780 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls podName:9275ffeb-7ec4-4699-976c-7ef980230018 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:50.996759165 +0000 UTC m=+34.015070005 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls") pod "dns-default-lvrfd" (UID: "9275ffeb-7ec4-4699-976c-7ef980230018") : secret "dns-default-metrics-tls" not found Apr 16 20:58:50.497027 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.496806 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cjck\" (UniqueName: \"kubernetes.io/projected/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-kube-api-access-5cjck\") pod \"ingress-canary-hv5xj\" (UID: \"ebb8fc4a-f559-45f4-be07-93cd44e25e3a\") " pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 20:58:50.497130 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.497034 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9275ffeb-7ec4-4699-976c-7ef980230018-tmp-dir\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:50.497201 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.497185 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9275ffeb-7ec4-4699-976c-7ef980230018-config-volume\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:50.509829 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.509805 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4k87\" (UniqueName: \"kubernetes.io/projected/9275ffeb-7ec4-4699-976c-7ef980230018-kube-api-access-h4k87\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:50.562004 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.561952 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:50.562173 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.561952 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:50.565364 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.565340 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:58:50.565508 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.565378 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:58:50.565579 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.565343 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c7977\"" Apr 16 20:58:50.565664 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.565647 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lvzr6\"" Apr 16 20:58:50.566564 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.566523 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:58:50.597302 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.597269 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cjck\" (UniqueName: \"kubernetes.io/projected/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-kube-api-access-5cjck\") pod \"ingress-canary-hv5xj\" (UID: \"ebb8fc4a-f559-45f4-be07-93cd44e25e3a\") " pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 20:58:50.597458 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.597358 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert\") pod \"ingress-canary-hv5xj\" (UID: \"ebb8fc4a-f559-45f4-be07-93cd44e25e3a\") " pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 20:58:50.597509 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:50.597498 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:58:50.597565 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:50.597556 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert podName:ebb8fc4a-f559-45f4-be07-93cd44e25e3a nodeName:}" failed. No retries permitted until 2026-04-16 20:58:51.097537594 +0000 UTC m=+34.115848422 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert") pod "ingress-canary-hv5xj" (UID: "ebb8fc4a-f559-45f4-be07-93cd44e25e3a") : secret "canary-serving-cert" not found Apr 16 20:58:50.606860 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:50.606832 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cjck\" (UniqueName: \"kubernetes.io/projected/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-kube-api-access-5cjck\") pod \"ingress-canary-hv5xj\" (UID: \"ebb8fc4a-f559-45f4-be07-93cd44e25e3a\") " pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 20:58:51.000561 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:51.000525 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:51.000731 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:51.000684 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:58:51.000786 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:51.000760 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls podName:9275ffeb-7ec4-4699-976c-7ef980230018 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:52.000740521 +0000 UTC m=+35.019051389 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls") pod "dns-default-lvrfd" (UID: "9275ffeb-7ec4-4699-976c-7ef980230018") : secret "dns-default-metrics-tls" not found Apr 16 20:58:51.101559 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:51.101525 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert\") pod \"ingress-canary-hv5xj\" (UID: \"ebb8fc4a-f559-45f4-be07-93cd44e25e3a\") " pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 20:58:51.101752 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:51.101681 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:58:51.101752 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:51.101743 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert podName:ebb8fc4a-f559-45f4-be07-93cd44e25e3a nodeName:}" failed. No retries permitted until 2026-04-16 20:58:52.101729515 +0000 UTC m=+35.120040340 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert") pod "ingress-canary-hv5xj" (UID: "ebb8fc4a-f559-45f4-be07-93cd44e25e3a") : secret "canary-serving-cert" not found Apr 16 20:58:51.202696 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:51.202657 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs\") pod \"network-metrics-daemon-rgzx9\" (UID: \"24ed89ef-c93c-40fd-a75f-2f3fd7582359\") " pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:58:51.202876 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:51.202782 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 20:58:51.202876 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:51.202843 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs podName:24ed89ef-c93c-40fd-a75f-2f3fd7582359 nodeName:}" failed. No retries permitted until 2026-04-16 20:59:23.202829044 +0000 UTC m=+66.221139888 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs") pod "network-metrics-daemon-rgzx9" (UID: "24ed89ef-c93c-40fd-a75f-2f3fd7582359") : secret "metrics-daemon-secret" not found Apr 16 20:58:51.304046 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:51.303946 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsl2d\" (UniqueName: \"kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d\") pod \"network-check-target-z494p\" (UID: \"b7d5e3db-123c-4df1-8f35-413bcf697e6f\") " pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:51.306885 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:51.306855 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsl2d\" (UniqueName: \"kubernetes.io/projected/b7d5e3db-123c-4df1-8f35-413bcf697e6f-kube-api-access-bsl2d\") pod \"network-check-target-z494p\" (UID: \"b7d5e3db-123c-4df1-8f35-413bcf697e6f\") " pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:51.479342 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:51.479307 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:52.008606 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:52.008571 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:52.008795 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:52.008739 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:58:52.008858 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:52.008814 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls podName:9275ffeb-7ec4-4699-976c-7ef980230018 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:58:54.008794703 +0000 UTC m=+37.027105529 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls") pod "dns-default-lvrfd" (UID: "9275ffeb-7ec4-4699-976c-7ef980230018") : secret "dns-default-metrics-tls" not found Apr 16 20:58:52.109241 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:52.109204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert\") pod \"ingress-canary-hv5xj\" (UID: \"ebb8fc4a-f559-45f4-be07-93cd44e25e3a\") " pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 20:58:52.109382 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:52.109317 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:58:52.109382 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:52.109371 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert podName:ebb8fc4a-f559-45f4-be07-93cd44e25e3a nodeName:}" failed. No retries permitted until 2026-04-16 20:58:54.109355242 +0000 UTC m=+37.127666070 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert") pod "ingress-canary-hv5xj" (UID: "ebb8fc4a-f559-45f4-be07-93cd44e25e3a") : secret "canary-serving-cert" not found Apr 16 20:58:52.315941 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:52.315797 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z494p"] Apr 16 20:58:52.370199 ip-10-0-139-17 kubenswrapper[2579]: W0416 20:58:52.370173 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7d5e3db_123c_4df1_8f35_413bcf697e6f.slice/crio-50d881ed87c91377f411f64088a7af4aec3378f5e409582aae1c378d780563ff WatchSource:0}: Error finding container 50d881ed87c91377f411f64088a7af4aec3378f5e409582aae1c378d780563ff: Status 404 returned error can't find the container with id 50d881ed87c91377f411f64088a7af4aec3378f5e409582aae1c378d780563ff Apr 16 20:58:52.744562 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:52.744533 2579 generic.go:358] "Generic (PLEG): container finished" podID="157d2848-acb1-4db0-bb7b-a50ad66888da" containerID="798ffc82c9b6956fb17036d20feec38a1769148c847933980ebaad8e03ecd3bc" exitCode=0 Apr 16 20:58:52.744657 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:52.744599 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wrfqb" event={"ID":"157d2848-acb1-4db0-bb7b-a50ad66888da","Type":"ContainerDied","Data":"798ffc82c9b6956fb17036d20feec38a1769148c847933980ebaad8e03ecd3bc"} Apr 16 20:58:52.745681 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:52.745662 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z494p" event={"ID":"b7d5e3db-123c-4df1-8f35-413bcf697e6f","Type":"ContainerStarted","Data":"50d881ed87c91377f411f64088a7af4aec3378f5e409582aae1c378d780563ff"} Apr 16 20:58:53.750683 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:53.750651 2579 generic.go:358] "Generic (PLEG): container finished" podID="157d2848-acb1-4db0-bb7b-a50ad66888da" 
containerID="e36c9ab0e29310aa035ce3712a8205c085035e8905fb9f54a56c4beded73f7a6" exitCode=0 Apr 16 20:58:53.751150 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:53.750703 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wrfqb" event={"ID":"157d2848-acb1-4db0-bb7b-a50ad66888da","Type":"ContainerDied","Data":"e36c9ab0e29310aa035ce3712a8205c085035e8905fb9f54a56c4beded73f7a6"} Apr 16 20:58:54.024514 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:54.024282 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:54.024669 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:54.024546 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:58:54.024669 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:54.024617 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls podName:9275ffeb-7ec4-4699-976c-7ef980230018 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:58.024596687 +0000 UTC m=+41.042907530 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls") pod "dns-default-lvrfd" (UID: "9275ffeb-7ec4-4699-976c-7ef980230018") : secret "dns-default-metrics-tls" not found Apr 16 20:58:54.126254 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:54.125657 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert\") pod \"ingress-canary-hv5xj\" (UID: \"ebb8fc4a-f559-45f4-be07-93cd44e25e3a\") " pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 20:58:54.126254 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:54.125789 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:58:54.126254 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:54.125847 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert podName:ebb8fc4a-f559-45f4-be07-93cd44e25e3a nodeName:}" failed. No retries permitted until 2026-04-16 20:58:58.125830234 +0000 UTC m=+41.144141059 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert") pod "ingress-canary-hv5xj" (UID: "ebb8fc4a-f559-45f4-be07-93cd44e25e3a") : secret "canary-serving-cert" not found Apr 16 20:58:54.756475 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:54.756436 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wrfqb" event={"ID":"157d2848-acb1-4db0-bb7b-a50ad66888da","Type":"ContainerStarted","Data":"845f3f8756806d11b8f86fa0d5467d353f80922ce5362a5c10b6e45a163cf30a"} Apr 16 20:58:54.782242 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:54.782167 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wrfqb" podStartSLOduration=5.469374212 podStartE2EDuration="37.782149534s" podCreationTimestamp="2026-04-16 20:58:17 +0000 UTC" firstStartedPulling="2026-04-16 20:58:20.085654433 +0000 UTC m=+3.103965259" lastFinishedPulling="2026-04-16 20:58:52.398429756 +0000 UTC m=+35.416740581" observedRunningTime="2026-04-16 20:58:54.781192135 +0000 UTC m=+37.799502983" watchObservedRunningTime="2026-04-16 20:58:54.782149534 +0000 UTC m=+37.800460362" Apr 16 20:58:56.762211 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:56.762171 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z494p" event={"ID":"b7d5e3db-123c-4df1-8f35-413bcf697e6f","Type":"ContainerStarted","Data":"22a2f1f7ba9cce15d71f1cedc8b907b26b0225f2a4784c6dc23c00e4aa7c38e9"} Apr 16 20:58:56.762637 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:56.762301 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:58:56.778196 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:56.778151 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-z494p" podStartSLOduration=36.43588142 podStartE2EDuration="39.778136482s" podCreationTimestamp="2026-04-16 20:58:17 +0000 UTC" firstStartedPulling="2026-04-16 20:58:52.376536718 +0000 UTC m=+35.394847545" lastFinishedPulling="2026-04-16 20:58:55.718791779 +0000 UTC m=+38.737102607" observedRunningTime="2026-04-16 20:58:56.777734374 +0000 UTC m=+39.796045218" watchObservedRunningTime="2026-04-16 20:58:56.778136482 +0000 UTC m=+39.796447317" Apr 16 20:58:58.052714 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:58.052670 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:58:58.053212 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:58.052848 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:58:58.053212 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:58.052933 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls podName:9275ffeb-7ec4-4699-976c-7ef980230018 nodeName:}" failed. No retries permitted until 2026-04-16 20:59:06.052912646 +0000 UTC m=+49.071223477 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls") pod "dns-default-lvrfd" (UID: "9275ffeb-7ec4-4699-976c-7ef980230018") : secret "dns-default-metrics-tls" not found Apr 16 20:58:58.153521 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:58:58.153488 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert\") pod \"ingress-canary-hv5xj\" (UID: \"ebb8fc4a-f559-45f4-be07-93cd44e25e3a\") " pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 20:58:58.153706 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:58.153642 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:58:58.153706 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:58:58.153705 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert podName:ebb8fc4a-f559-45f4-be07-93cd44e25e3a nodeName:}" failed. No retries permitted until 2026-04-16 20:59:06.153687723 +0000 UTC m=+49.171998562 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert") pod "ingress-canary-hv5xj" (UID: "ebb8fc4a-f559-45f4-be07-93cd44e25e3a") : secret "canary-serving-cert" not found Apr 16 20:59:06.100748 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:59:06.100706 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:59:06.101274 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:59:06.100839 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:59:06.101274 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:59:06.100905 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls podName:9275ffeb-7ec4-4699-976c-7ef980230018 nodeName:}" failed. No retries permitted until 2026-04-16 20:59:22.100889486 +0000 UTC m=+65.119200310 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls") pod "dns-default-lvrfd" (UID: "9275ffeb-7ec4-4699-976c-7ef980230018") : secret "dns-default-metrics-tls" not found Apr 16 20:59:06.201292 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:59:06.201261 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert\") pod \"ingress-canary-hv5xj\" (UID: \"ebb8fc4a-f559-45f4-be07-93cd44e25e3a\") " pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 20:59:06.201451 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:59:06.201398 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:59:06.201496 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:59:06.201456 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert podName:ebb8fc4a-f559-45f4-be07-93cd44e25e3a nodeName:}" failed. No retries permitted until 2026-04-16 20:59:22.20144144 +0000 UTC m=+65.219752266 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert") pod "ingress-canary-hv5xj" (UID: "ebb8fc4a-f559-45f4-be07-93cd44e25e3a") : secret "canary-serving-cert" not found Apr 16 20:59:15.739065 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:59:15.739036 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s6cxs" Apr 16 20:59:22.108437 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:59:22.108400 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:59:22.108874 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:59:22.108558 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:59:22.108874 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:59:22.108631 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls podName:9275ffeb-7ec4-4699-976c-7ef980230018 nodeName:}" failed. No retries permitted until 2026-04-16 20:59:54.108612736 +0000 UTC m=+97.126923582 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls") pod "dns-default-lvrfd" (UID: "9275ffeb-7ec4-4699-976c-7ef980230018") : secret "dns-default-metrics-tls" not found Apr 16 20:59:22.209428 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:59:22.209395 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert\") pod \"ingress-canary-hv5xj\" (UID: \"ebb8fc4a-f559-45f4-be07-93cd44e25e3a\") " pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 20:59:22.209562 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:59:22.209516 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:59:22.209601 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:59:22.209565 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert podName:ebb8fc4a-f559-45f4-be07-93cd44e25e3a nodeName:}" failed. No retries permitted until 2026-04-16 20:59:54.209552117 +0000 UTC m=+97.227862942 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert") pod "ingress-canary-hv5xj" (UID: "ebb8fc4a-f559-45f4-be07-93cd44e25e3a") : secret "canary-serving-cert" not found Apr 16 20:59:23.215498 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:59:23.215463 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs\") pod \"network-metrics-daemon-rgzx9\" (UID: \"24ed89ef-c93c-40fd-a75f-2f3fd7582359\") " pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 20:59:23.215844 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:59:23.215611 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 20:59:23.215844 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:59:23.215679 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs podName:24ed89ef-c93c-40fd-a75f-2f3fd7582359 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:27.215662427 +0000 UTC m=+130.233973251 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs") pod "network-metrics-daemon-rgzx9" (UID: "24ed89ef-c93c-40fd-a75f-2f3fd7582359") : secret "metrics-daemon-secret" not found Apr 16 20:59:27.766354 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:59:27.766319 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-z494p" Apr 16 20:59:54.119667 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:59:54.119551 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 20:59:54.120141 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:59:54.119692 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:59:54.120141 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:59:54.119764 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls podName:9275ffeb-7ec4-4699-976c-7ef980230018 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:58.119748054 +0000 UTC m=+161.138058878 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls") pod "dns-default-lvrfd" (UID: "9275ffeb-7ec4-4699-976c-7ef980230018") : secret "dns-default-metrics-tls" not found Apr 16 20:59:54.220839 ip-10-0-139-17 kubenswrapper[2579]: I0416 20:59:54.220801 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert\") pod \"ingress-canary-hv5xj\" (UID: \"ebb8fc4a-f559-45f4-be07-93cd44e25e3a\") " pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 20:59:54.221021 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:59:54.220924 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:59:54.221021 ip-10-0-139-17 kubenswrapper[2579]: E0416 20:59:54.220973 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert podName:ebb8fc4a-f559-45f4-be07-93cd44e25e3a nodeName:}" failed. No retries permitted until 2026-04-16 21:00:58.220959793 +0000 UTC m=+161.239270617 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert") pod "ingress-canary-hv5xj" (UID: "ebb8fc4a-f559-45f4-be07-93cd44e25e3a") : secret "canary-serving-cert" not found Apr 16 21:00:18.831754 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.831720 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc"] Apr 16 21:00:18.834493 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.834476 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:18.838004 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.837957 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7b575bfc5d-gbhzj"] Apr 16 21:00:18.840687 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.840668 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:18.841038 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.840922 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 21:00:18.841891 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.841871 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 21:00:18.842010 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.841878 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 21:00:18.842010 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.841880 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-h5hv9\"" Apr 16 21:00:18.842154 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.842139 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 21:00:18.843575 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.843559 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 21:00:18.843867 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.843848 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 21:00:18.843973 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.843898 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 21:00:18.843973 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.843942 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 21:00:18.843973 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.843954 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 21:00:18.844209 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.844131 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-djwlr\"" Apr 16 21:00:18.844209 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.844131 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 21:00:18.855920 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.855901 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc"] Apr 16 21:00:18.861619 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.861599 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7b575bfc5d-gbhzj"] Apr 16 21:00:18.888778 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.888753 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skx2l\" (UniqueName: \"kubernetes.io/projected/b29e3035-4704-443b-ba1e-f485ad77b3c5-kube-api-access-skx2l\") pod \"cluster-monitoring-operator-75587bd455-j85jc\" (UID: \"b29e3035-4704-443b-ba1e-f485ad77b3c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:18.888935 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.888792 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:18.888935 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.888815 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b29e3035-4704-443b-ba1e-f485ad77b3c5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-j85jc\" (UID: \"b29e3035-4704-443b-ba1e-f485ad77b3c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:18.888935 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.888832 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9fvj\" (UniqueName: \"kubernetes.io/projected/9af0bfa4-fd46-49df-973a-66814186f9c6-kube-api-access-t9fvj\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:18.888935 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.888897 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:18.889139 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.888958 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j85jc\" (UID: \"b29e3035-4704-443b-ba1e-f485ad77b3c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:18.889139 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.889041 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-stats-auth\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:18.889139 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.889070 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-default-certificate\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " 
pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:18.928388 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.928362 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-ntt68"] Apr 16 21:00:18.931078 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.931064 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:18.933726 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.933707 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 21:00:18.934004 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.933974 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-jr57v\"" Apr 16 21:00:18.934093 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.934006 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 21:00:18.934162 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.934145 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 21:00:18.934314 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.934299 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 21:00:18.942499 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.942478 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 21:00:18.944369 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.944350 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-ntt68"] Apr 16 21:00:18.990359 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.990334 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcsn9\" (UniqueName: \"kubernetes.io/projected/b425b7f9-0015-4de7-81d2-02cd12eb338a-kube-api-access-lcsn9\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:18.990452 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.990369 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:18.990452 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.990388 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j85jc\" (UID: \"b29e3035-4704-443b-ba1e-f485ad77b3c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:18.990452 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.990438 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-stats-auth\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:18.990551 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:18.990466 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 21:00:18.990551 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.990477 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-default-certificate\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:18.990551 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:18.990514 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls podName:b29e3035-4704-443b-ba1e-f485ad77b3c5 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:19.490494675 +0000 UTC m=+122.508805500 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-j85jc" (UID: "b29e3035-4704-443b-ba1e-f485ad77b3c5") : secret "cluster-monitoring-operator-tls" not found Apr 16 21:00:18.990657 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.990551 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b425b7f9-0015-4de7-81d2-02cd12eb338a-service-ca-bundle\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:18.990657 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.990574 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skx2l\" (UniqueName: \"kubernetes.io/projected/b29e3035-4704-443b-ba1e-f485ad77b3c5-kube-api-access-skx2l\") pod \"cluster-monitoring-operator-75587bd455-j85jc\" (UID: \"b29e3035-4704-443b-ba1e-f485ad77b3c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:18.990657 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:18.990603 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle podName:9af0bfa4-fd46-49df-973a-66814186f9c6 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:19.4905841 +0000 UTC m=+122.508894934 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle") pod "router-default-7b575bfc5d-gbhzj" (UID: "9af0bfa4-fd46-49df-973a-66814186f9c6") : configmap references non-existent config key: service-ca.crt Apr 16 21:00:18.990657 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.990627 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b425b7f9-0015-4de7-81d2-02cd12eb338a-serving-cert\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:18.990847 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.990665 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b425b7f9-0015-4de7-81d2-02cd12eb338a-tmp\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:18.990847 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.990697 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:18.990847 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:18.990757 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 21:00:18.990847 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.990760 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b29e3035-4704-443b-ba1e-f485ad77b3c5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-j85jc\" (UID: \"b29e3035-4704-443b-ba1e-f485ad77b3c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:18.990847 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.990802 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9fvj\" (UniqueName: \"kubernetes.io/projected/9af0bfa4-fd46-49df-973a-66814186f9c6-kube-api-access-t9fvj\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:18.990847 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:18.990807 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs podName:9af0bfa4-fd46-49df-973a-66814186f9c6 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:19.490793045 +0000 UTC m=+122.509103871 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs") pod "router-default-7b575bfc5d-gbhzj" (UID: "9af0bfa4-fd46-49df-973a-66814186f9c6") : secret "router-metrics-certs-default" not found Apr 16 21:00:18.990847 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.990839 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b425b7f9-0015-4de7-81d2-02cd12eb338a-snapshots\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:18.991122 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.990872 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b425b7f9-0015-4de7-81d2-02cd12eb338a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:18.991460 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.991443 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b29e3035-4704-443b-ba1e-f485ad77b3c5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-j85jc\" (UID: \"b29e3035-4704-443b-ba1e-f485ad77b3c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:18.994316 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.994293 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-stats-auth\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:18.994430 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:18.994414 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-default-certificate\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:19.002902 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.002882 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skx2l\" (UniqueName: \"kubernetes.io/projected/b29e3035-4704-443b-ba1e-f485ad77b3c5-kube-api-access-skx2l\") pod \"cluster-monitoring-operator-75587bd455-j85jc\" (UID: \"b29e3035-4704-443b-ba1e-f485ad77b3c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:19.002978 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.002939 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9fvj\" (UniqueName: \"kubernetes.io/projected/9af0bfa4-fd46-49df-973a-66814186f9c6-kube-api-access-t9fvj\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:19.091897 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.091840 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b425b7f9-0015-4de7-81d2-02cd12eb338a-service-ca-bundle\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:19.091897 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.091868 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b425b7f9-0015-4de7-81d2-02cd12eb338a-serving-cert\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:19.091897 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.091892 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b425b7f9-0015-4de7-81d2-02cd12eb338a-tmp\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:19.092145 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.091926 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b425b7f9-0015-4de7-81d2-02cd12eb338a-snapshots\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:19.092145 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.091952 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b425b7f9-0015-4de7-81d2-02cd12eb338a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:19.092145 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.091984 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcsn9\" (UniqueName: \"kubernetes.io/projected/b425b7f9-0015-4de7-81d2-02cd12eb338a-kube-api-access-lcsn9\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:19.092371 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.092348 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b425b7f9-0015-4de7-81d2-02cd12eb338a-tmp\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:19.092520 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.092499 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b425b7f9-0015-4de7-81d2-02cd12eb338a-snapshots\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:19.092578 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.092522 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b425b7f9-0015-4de7-81d2-02cd12eb338a-service-ca-bundle\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " 
pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:19.092831 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.092814 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b425b7f9-0015-4de7-81d2-02cd12eb338a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:19.094218 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.094202 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b425b7f9-0015-4de7-81d2-02cd12eb338a-serving-cert\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:19.100707 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.100688 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcsn9\" (UniqueName: \"kubernetes.io/projected/b425b7f9-0015-4de7-81d2-02cd12eb338a-kube-api-access-lcsn9\") pod \"insights-operator-585dfdc468-ntt68\" (UID: \"b425b7f9-0015-4de7-81d2-02cd12eb338a\") " pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:19.240009 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.239959 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-ntt68" Apr 16 21:00:19.368315 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.368240 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-ntt68"] Apr 16 21:00:19.371436 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:00:19.371408 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb425b7f9_0015_4de7_81d2_02cd12eb338a.slice/crio-75531edf967e20abc674bc10ef41259fb03ee804d8d0be60e06e567940e8df4e WatchSource:0}: Error finding container 75531edf967e20abc674bc10ef41259fb03ee804d8d0be60e06e567940e8df4e: Status 404 returned error can't find the container with id 75531edf967e20abc674bc10ef41259fb03ee804d8d0be60e06e567940e8df4e Apr 16 21:00:19.499906 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.499875 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:19.499906 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.499918 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j85jc\" (UID: \"b29e3035-4704-443b-ba1e-f485ad77b3c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:19.500187 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.500024 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: 
\"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:19.500187 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:19.500069 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle podName:9af0bfa4-fd46-49df-973a-66814186f9c6 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:20.500050163 +0000 UTC m=+123.518360988 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle") pod "router-default-7b575bfc5d-gbhzj" (UID: "9af0bfa4-fd46-49df-973a-66814186f9c6") : configmap references non-existent config key: service-ca.crt Apr 16 21:00:19.500187 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:19.500086 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 21:00:19.500187 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:19.500115 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 21:00:19.500187 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:19.500159 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls podName:b29e3035-4704-443b-ba1e-f485ad77b3c5 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:20.500146983 +0000 UTC m=+123.518457812 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-j85jc" (UID: "b29e3035-4704-443b-ba1e-f485ad77b3c5") : secret "cluster-monitoring-operator-tls" not found Apr 16 21:00:19.500187 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:19.500177 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs podName:9af0bfa4-fd46-49df-973a-66814186f9c6 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:20.500167911 +0000 UTC m=+123.518478736 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs") pod "router-default-7b575bfc5d-gbhzj" (UID: "9af0bfa4-fd46-49df-973a-66814186f9c6") : secret "router-metrics-certs-default" not found Apr 16 21:00:19.917631 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:19.917594 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-ntt68" event={"ID":"b425b7f9-0015-4de7-81d2-02cd12eb338a","Type":"ContainerStarted","Data":"75531edf967e20abc674bc10ef41259fb03ee804d8d0be60e06e567940e8df4e"} Apr 16 21:00:20.507006 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:20.506961 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:20.507194 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:20.507020 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j85jc\" (UID: \"b29e3035-4704-443b-ba1e-f485ad77b3c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:20.507194 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:20.507099 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:20.507194 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:20.507153 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle podName:9af0bfa4-fd46-49df-973a-66814186f9c6 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:22.507127995 +0000 UTC m=+125.525438839 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle") pod "router-default-7b575bfc5d-gbhzj" (UID: "9af0bfa4-fd46-49df-973a-66814186f9c6") : configmap references non-existent config key: service-ca.crt Apr 16 21:00:20.507361 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:20.507200 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 21:00:20.507361 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:20.507268 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls podName:b29e3035-4704-443b-ba1e-f485ad77b3c5 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:22.507249788 +0000 UTC m=+125.525560625 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-j85jc" (UID: "b29e3035-4704-443b-ba1e-f485ad77b3c5") : secret "cluster-monitoring-operator-tls" not found Apr 16 21:00:20.507361 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:20.507201 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 21:00:20.507361 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:20.507314 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs podName:9af0bfa4-fd46-49df-973a-66814186f9c6 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:22.507303496 +0000 UTC m=+125.525614324 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs") pod "router-default-7b575bfc5d-gbhzj" (UID: "9af0bfa4-fd46-49df-973a-66814186f9c6") : secret "router-metrics-certs-default" not found Apr 16 21:00:21.923107 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:21.923073 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-ntt68" event={"ID":"b425b7f9-0015-4de7-81d2-02cd12eb338a","Type":"ContainerStarted","Data":"f026dd099b6d39005a50670b2bfdcbe547635a39f259c8c184058cd40eb4a931"} Apr 16 21:00:21.941133 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:21.941091 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-ntt68" podStartSLOduration=2.042663877 podStartE2EDuration="3.941079503s" podCreationTimestamp="2026-04-16 21:00:18 +0000 UTC" firstStartedPulling="2026-04-16 21:00:19.37313468 +0000 UTC m=+122.391445506" lastFinishedPulling="2026-04-16 21:00:21.271550305 +0000 UTC m=+124.289861132" observedRunningTime="2026-04-16 21:00:21.939947166 +0000 UTC m=+124.958258012" watchObservedRunningTime="2026-04-16 21:00:21.941079503 +0000 UTC m=+124.959390348" Apr 16 21:00:22.523973 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:22.523937 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:22.524167 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:22.524009 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:22.524167 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:22.524042 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j85jc\" (UID: \"b29e3035-4704-443b-ba1e-f485ad77b3c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 
16 21:00:22.524167 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:22.524121 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 21:00:22.524167 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:22.524138 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 21:00:22.524167 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:22.524140 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle podName:9af0bfa4-fd46-49df-973a-66814186f9c6 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:26.524121099 +0000 UTC m=+129.542431943 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle") pod "router-default-7b575bfc5d-gbhzj" (UID: "9af0bfa4-fd46-49df-973a-66814186f9c6") : configmap references non-existent config key: service-ca.crt Apr 16 21:00:22.524345 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:22.524180 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls podName:b29e3035-4704-443b-ba1e-f485ad77b3c5 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:26.524170624 +0000 UTC m=+129.542481454 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-j85jc" (UID: "b29e3035-4704-443b-ba1e-f485ad77b3c5") : secret "cluster-monitoring-operator-tls" not found Apr 16 21:00:22.524345 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:22.524201 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs podName:9af0bfa4-fd46-49df-973a-66814186f9c6 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:26.524192531 +0000 UTC m=+129.542503363 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs") pod "router-default-7b575bfc5d-gbhzj" (UID: "9af0bfa4-fd46-49df-973a-66814186f9c6") : secret "router-metrics-certs-default" not found Apr 16 21:00:23.875035 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:23.874981 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx"] Apr 16 21:00:23.877884 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:23.877867 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" Apr 16 21:00:23.880580 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:23.880543 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 21:00:23.880580 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:23.880554 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 21:00:23.880715 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:23.880705 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-s9bsv\"" Apr 16 21:00:23.881743 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:23.881725 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 21:00:23.881890 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:23.881873 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 21:00:23.889406 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:23.889372 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx"] Apr 16 21:00:24.034545 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:24.034511 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acccf14f-1908-45d6-a352-e0a6c7fc6a05-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-p9hsx\" (UID: \"acccf14f-1908-45d6-a352-e0a6c7fc6a05\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" Apr 16 21:00:24.034545 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:24.034548 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acccf14f-1908-45d6-a352-e0a6c7fc6a05-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-p9hsx\" (UID: \"acccf14f-1908-45d6-a352-e0a6c7fc6a05\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" Apr 16 21:00:24.034828 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:24.034566 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt8tt\" (UniqueName: \"kubernetes.io/projected/acccf14f-1908-45d6-a352-e0a6c7fc6a05-kube-api-access-xt8tt\") pod \"kube-storage-version-migrator-operator-6769c5d45-p9hsx\" (UID: \"acccf14f-1908-45d6-a352-e0a6c7fc6a05\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" Apr 16 21:00:24.135811 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:24.135721 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acccf14f-1908-45d6-a352-e0a6c7fc6a05-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-p9hsx\" (UID: \"acccf14f-1908-45d6-a352-e0a6c7fc6a05\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" Apr 16 21:00:24.135811 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:24.135771 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acccf14f-1908-45d6-a352-e0a6c7fc6a05-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-p9hsx\" (UID: \"acccf14f-1908-45d6-a352-e0a6c7fc6a05\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" Apr 16 21:00:24.135811 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:24.135794 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt8tt\" (UniqueName: \"kubernetes.io/projected/acccf14f-1908-45d6-a352-e0a6c7fc6a05-kube-api-access-xt8tt\") pod \"kube-storage-version-migrator-operator-6769c5d45-p9hsx\" (UID: \"acccf14f-1908-45d6-a352-e0a6c7fc6a05\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" Apr 16 21:00:24.136276 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:24.136256 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acccf14f-1908-45d6-a352-e0a6c7fc6a05-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-p9hsx\" (UID: \"acccf14f-1908-45d6-a352-e0a6c7fc6a05\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" Apr 16 21:00:24.138166 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:24.138147 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acccf14f-1908-45d6-a352-e0a6c7fc6a05-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-p9hsx\" (UID: \"acccf14f-1908-45d6-a352-e0a6c7fc6a05\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" Apr 16 21:00:24.144300 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:24.144278 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt8tt\" (UniqueName: \"kubernetes.io/projected/acccf14f-1908-45d6-a352-e0a6c7fc6a05-kube-api-access-xt8tt\") pod \"kube-storage-version-migrator-operator-6769c5d45-p9hsx\" (UID: \"acccf14f-1908-45d6-a352-e0a6c7fc6a05\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" Apr 16 21:00:24.186482 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:24.186460 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" Apr 16 21:00:24.296921 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:24.296893 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx"] Apr 16 21:00:24.299956 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:00:24.299925 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacccf14f_1908_45d6_a352_e0a6c7fc6a05.slice/crio-5c5a37fe1066fe7d6b11aa195bd4932f2a0f87b6892e440d3d70d944e7b07957 WatchSource:0}: Error finding container 5c5a37fe1066fe7d6b11aa195bd4932f2a0f87b6892e440d3d70d944e7b07957: Status 404 returned error can't find the container with id 5c5a37fe1066fe7d6b11aa195bd4932f2a0f87b6892e440d3d70d944e7b07957 Apr 16 21:00:24.681517 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:24.681489 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qdgnp_86523bd2-ac21-4e5d-8cfd-81eb7aa5f405/dns-node-resolver/0.log" Apr 16 21:00:24.929243 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:24.929212 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" event={"ID":"acccf14f-1908-45d6-a352-e0a6c7fc6a05","Type":"ContainerStarted","Data":"5c5a37fe1066fe7d6b11aa195bd4932f2a0f87b6892e440d3d70d944e7b07957"} Apr 16 21:00:25.480582 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:25.480555 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2g5dj_eb89d7f0-f4fb-459a-8f19-5b03adcf660a/node-ca/0.log" Apr 16 21:00:25.858668 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:25.858580 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx"] Apr 16 21:00:25.861877 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:25.861852 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" Apr 16 21:00:25.864720 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:25.864694 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 21:00:25.864907 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:25.864725 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 21:00:25.864907 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:25.864785 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 21:00:25.864907 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:25.864816 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 21:00:25.865824 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:25.865808 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bmdrp\"" Apr 16 21:00:25.869139 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:25.869104 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx"] Apr 16 21:00:25.952215 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:25.952180 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a07ff8c-d53e-4fcd-93b5-f85c40522a10-config\") pod \"service-ca-operator-d6fc45fc5-fvhtx\" (UID: \"5a07ff8c-d53e-4fcd-93b5-f85c40522a10\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" Apr 16 21:00:25.952648 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:25.952276 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a07ff8c-d53e-4fcd-93b5-f85c40522a10-serving-cert\") pod \"service-ca-operator-d6fc45fc5-fvhtx\" (UID: \"5a07ff8c-d53e-4fcd-93b5-f85c40522a10\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" Apr 16 21:00:25.952648 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:25.952361 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h55g\" (UniqueName: \"kubernetes.io/projected/5a07ff8c-d53e-4fcd-93b5-f85c40522a10-kube-api-access-2h55g\") pod \"service-ca-operator-d6fc45fc5-fvhtx\" (UID: \"5a07ff8c-d53e-4fcd-93b5-f85c40522a10\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" Apr 16 21:00:26.053158 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:26.053116 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a07ff8c-d53e-4fcd-93b5-f85c40522a10-serving-cert\") pod \"service-ca-operator-d6fc45fc5-fvhtx\" (UID: \"5a07ff8c-d53e-4fcd-93b5-f85c40522a10\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" Apr 16 21:00:26.053331 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:26.053211 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2h55g\" (UniqueName: \"kubernetes.io/projected/5a07ff8c-d53e-4fcd-93b5-f85c40522a10-kube-api-access-2h55g\") pod 
\"service-ca-operator-d6fc45fc5-fvhtx\" (UID: \"5a07ff8c-d53e-4fcd-93b5-f85c40522a10\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" Apr 16 21:00:26.053331 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:26.053268 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a07ff8c-d53e-4fcd-93b5-f85c40522a10-config\") pod \"service-ca-operator-d6fc45fc5-fvhtx\" (UID: \"5a07ff8c-d53e-4fcd-93b5-f85c40522a10\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" Apr 16 21:00:26.054483 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:26.054454 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a07ff8c-d53e-4fcd-93b5-f85c40522a10-config\") pod \"service-ca-operator-d6fc45fc5-fvhtx\" (UID: \"5a07ff8c-d53e-4fcd-93b5-f85c40522a10\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" Apr 16 21:00:26.055906 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:26.055882 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a07ff8c-d53e-4fcd-93b5-f85c40522a10-serving-cert\") pod \"service-ca-operator-d6fc45fc5-fvhtx\" (UID: \"5a07ff8c-d53e-4fcd-93b5-f85c40522a10\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" Apr 16 21:00:26.065072 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:26.065052 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h55g\" (UniqueName: \"kubernetes.io/projected/5a07ff8c-d53e-4fcd-93b5-f85c40522a10-kube-api-access-2h55g\") pod \"service-ca-operator-d6fc45fc5-fvhtx\" (UID: \"5a07ff8c-d53e-4fcd-93b5-f85c40522a10\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" Apr 16 21:00:26.173039 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:26.172950 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" Apr 16 21:00:26.296713 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:26.296683 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx"] Apr 16 21:00:26.299345 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:00:26.299315 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a07ff8c_d53e_4fcd_93b5_f85c40522a10.slice/crio-57fad0b2b9591dfe979fd9635d53671086c9bd8932d4c32eb7a8cbfe5a4ec25b WatchSource:0}: Error finding container 57fad0b2b9591dfe979fd9635d53671086c9bd8932d4c32eb7a8cbfe5a4ec25b: Status 404 returned error can't find the container with id 57fad0b2b9591dfe979fd9635d53671086c9bd8932d4c32eb7a8cbfe5a4ec25b Apr 16 21:00:26.556895 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:26.556864 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:26.557083 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:26.556914 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:26.557083 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:26.556932 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j85jc\" (UID: \"b29e3035-4704-443b-ba1e-f485ad77b3c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:26.557083 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:26.557039 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 21:00:26.557083 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:26.557045 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 21:00:26.557083 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:26.557077 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle podName:9af0bfa4-fd46-49df-973a-66814186f9c6 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:34.557057016 +0000 UTC m=+137.575367844 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle") pod "router-default-7b575bfc5d-gbhzj" (UID: "9af0bfa4-fd46-49df-973a-66814186f9c6") : configmap references non-existent config key: service-ca.crt Apr 16 21:00:26.557276 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:26.557098 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls podName:b29e3035-4704-443b-ba1e-f485ad77b3c5 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:34.557086795 +0000 UTC m=+137.575397619 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-j85jc" (UID: "b29e3035-4704-443b-ba1e-f485ad77b3c5") : secret "cluster-monitoring-operator-tls" not found Apr 16 21:00:26.557276 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:26.557110 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs podName:9af0bfa4-fd46-49df-973a-66814186f9c6 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:34.557104242 +0000 UTC m=+137.575415067 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs") pod "router-default-7b575bfc5d-gbhzj" (UID: "9af0bfa4-fd46-49df-973a-66814186f9c6") : secret "router-metrics-certs-default" not found Apr 16 21:00:26.934837 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:26.934756 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" event={"ID":"acccf14f-1908-45d6-a352-e0a6c7fc6a05","Type":"ContainerStarted","Data":"169fa441efd96669e086a2acf07194982a09e2f9f710eeb4bb109406a26423b4"} Apr 16 21:00:26.935905 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:26.935879 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" event={"ID":"5a07ff8c-d53e-4fcd-93b5-f85c40522a10","Type":"ContainerStarted","Data":"57fad0b2b9591dfe979fd9635d53671086c9bd8932d4c32eb7a8cbfe5a4ec25b"} Apr 16 21:00:26.951874 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:26.951830 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" podStartSLOduration=2.067897012 podStartE2EDuration="3.951815597s" podCreationTimestamp="2026-04-16 21:00:23 +0000 UTC" firstStartedPulling="2026-04-16 21:00:24.301755413 +0000 UTC m=+127.320066239" lastFinishedPulling="2026-04-16 21:00:26.185673984 +0000 UTC m=+129.203984824" observedRunningTime="2026-04-16 21:00:26.950613907 +0000 UTC m=+129.968924762" watchObservedRunningTime="2026-04-16 21:00:26.951815597 +0000 UTC m=+129.970126444" Apr 16 21:00:27.246005 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:27.245954 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-jvjcc"] Apr 16 21:00:27.249054 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:27.249031 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jvjcc" Apr 16 21:00:27.251635 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:27.251613 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-cpf7w\"" Apr 16 21:00:27.260725 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:27.258190 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-jvjcc"] Apr 16 21:00:27.263202 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:27.263180 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs\") pod \"network-metrics-daemon-rgzx9\" (UID: \"24ed89ef-c93c-40fd-a75f-2f3fd7582359\") " pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 21:00:27.263326 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:27.263277 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 21:00:27.263387 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:27.263329 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs podName:24ed89ef-c93c-40fd-a75f-2f3fd7582359 nodeName:}" failed. No retries permitted until 2026-04-16 21:02:29.263312947 +0000 UTC m=+252.281623773 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs") pod "network-metrics-daemon-rgzx9" (UID: "24ed89ef-c93c-40fd-a75f-2f3fd7582359") : secret "metrics-daemon-secret" not found Apr 16 21:00:27.363686 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:27.363632 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrxzq\" (UniqueName: \"kubernetes.io/projected/fb6d0774-a65e-4cdd-b5e2-ea3e9e841e5f-kube-api-access-jrxzq\") pod \"network-check-source-8894fc9bd-jvjcc\" (UID: \"fb6d0774-a65e-4cdd-b5e2-ea3e9e841e5f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jvjcc" Apr 16 21:00:27.464784 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:27.464742 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrxzq\" (UniqueName: \"kubernetes.io/projected/fb6d0774-a65e-4cdd-b5e2-ea3e9e841e5f-kube-api-access-jrxzq\") pod \"network-check-source-8894fc9bd-jvjcc\" (UID: \"fb6d0774-a65e-4cdd-b5e2-ea3e9e841e5f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jvjcc" Apr 16 21:00:27.474424 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:27.474379 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrxzq\" (UniqueName: \"kubernetes.io/projected/fb6d0774-a65e-4cdd-b5e2-ea3e9e841e5f-kube-api-access-jrxzq\") pod \"network-check-source-8894fc9bd-jvjcc\" (UID: \"fb6d0774-a65e-4cdd-b5e2-ea3e9e841e5f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jvjcc" Apr 16 21:00:27.561926 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:27.561835 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jvjcc" Apr 16 21:00:27.681331 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:27.681301 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-jvjcc"] Apr 16 21:00:27.685982 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:00:27.685947 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb6d0774_a65e_4cdd_b5e2_ea3e9e841e5f.slice/crio-539531c4ebfea09a80d8b2cabd604c04b2671db3aac63cc852a0a50e45cfb441 WatchSource:0}: Error finding container 539531c4ebfea09a80d8b2cabd604c04b2671db3aac63cc852a0a50e45cfb441: Status 404 returned error can't find the container with id 539531c4ebfea09a80d8b2cabd604c04b2671db3aac63cc852a0a50e45cfb441 Apr 16 21:00:27.939239 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:27.939145 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jvjcc" event={"ID":"fb6d0774-a65e-4cdd-b5e2-ea3e9e841e5f","Type":"ContainerStarted","Data":"5bd90a8706f3b3bc0f913d5719cfe9d56f54a1647c4768e6eeaef5a8e8777c45"} Apr 16 21:00:27.939239 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:27.939185 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jvjcc" event={"ID":"fb6d0774-a65e-4cdd-b5e2-ea3e9e841e5f","Type":"ContainerStarted","Data":"539531c4ebfea09a80d8b2cabd604c04b2671db3aac63cc852a0a50e45cfb441"} Apr 16 21:00:27.957625 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:27.957563 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jvjcc" podStartSLOduration=0.957548359 podStartE2EDuration="957.548359ms" podCreationTimestamp="2026-04-16 21:00:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:00:27.956481327 +0000 UTC m=+130.974792175" watchObservedRunningTime="2026-04-16 21:00:27.957548359 +0000 UTC m=+130.975859206" Apr 16 21:00:28.943393 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:28.943308 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" event={"ID":"5a07ff8c-d53e-4fcd-93b5-f85c40522a10","Type":"ContainerStarted","Data":"b2156ea50db98952a596d597243705e580eb5cd095f9a8c73b0720466fd6fe22"} Apr 16 21:00:28.975473 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:28.975428 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" podStartSLOduration=1.651762118 podStartE2EDuration="3.975414062s" podCreationTimestamp="2026-04-16 21:00:25 +0000 UTC" firstStartedPulling="2026-04-16 21:00:26.301150013 +0000 UTC m=+129.319460842" lastFinishedPulling="2026-04-16 21:00:28.624801958 +0000 UTC m=+131.643112786" observedRunningTime="2026-04-16 21:00:28.973912787 +0000 UTC m=+131.992223633" watchObservedRunningTime="2026-04-16 21:00:28.975414062 +0000 UTC m=+131.993724938" Apr 16 21:00:32.416523 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.416488 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-lzmxg"] Apr 16 21:00:32.419432 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.419414 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-lzmxg" Apr 16 21:00:32.423853 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.423833 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 21:00:32.423923 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.423839 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 21:00:32.423960 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.423949 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 21:00:32.425184 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.425169 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 21:00:32.425740 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.425572 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-lhr86\"" Apr 16 21:00:32.428286 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.428265 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-lzmxg"] Apr 16 21:00:32.508221 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.508188 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/08f83bac-6851-4477-b728-3ad83bff3c69-signing-cabundle\") pod \"service-ca-865cb79987-lzmxg\" (UID: \"08f83bac-6851-4477-b728-3ad83bff3c69\") " pod="openshift-service-ca/service-ca-865cb79987-lzmxg" Apr 16 21:00:32.508221 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.508221 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/08f83bac-6851-4477-b728-3ad83bff3c69-signing-key\") pod \"service-ca-865cb79987-lzmxg\" (UID: \"08f83bac-6851-4477-b728-3ad83bff3c69\") " pod="openshift-service-ca/service-ca-865cb79987-lzmxg" Apr 16 21:00:32.508393 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.508242 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b9s9\" (UniqueName: \"kubernetes.io/projected/08f83bac-6851-4477-b728-3ad83bff3c69-kube-api-access-5b9s9\") pod \"service-ca-865cb79987-lzmxg\" (UID: \"08f83bac-6851-4477-b728-3ad83bff3c69\") " pod="openshift-service-ca/service-ca-865cb79987-lzmxg" Apr 16 21:00:32.608800 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.608765 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/08f83bac-6851-4477-b728-3ad83bff3c69-signing-cabundle\") pod \"service-ca-865cb79987-lzmxg\" (UID: \"08f83bac-6851-4477-b728-3ad83bff3c69\") " pod="openshift-service-ca/service-ca-865cb79987-lzmxg" Apr 16 21:00:32.608800 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.608800 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/08f83bac-6851-4477-b728-3ad83bff3c69-signing-key\") pod \"service-ca-865cb79987-lzmxg\" (UID: \"08f83bac-6851-4477-b728-3ad83bff3c69\") " pod="openshift-service-ca/service-ca-865cb79987-lzmxg" Apr 16 21:00:32.609067 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.608821 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5b9s9\" (UniqueName: \"kubernetes.io/projected/08f83bac-6851-4477-b728-3ad83bff3c69-kube-api-access-5b9s9\") pod \"service-ca-865cb79987-lzmxg\" (UID: \"08f83bac-6851-4477-b728-3ad83bff3c69\") " pod="openshift-service-ca/service-ca-865cb79987-lzmxg" Apr 16 21:00:32.609411 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.609392 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/08f83bac-6851-4477-b728-3ad83bff3c69-signing-cabundle\") pod \"service-ca-865cb79987-lzmxg\" (UID: \"08f83bac-6851-4477-b728-3ad83bff3c69\") " pod="openshift-service-ca/service-ca-865cb79987-lzmxg" Apr 16 21:00:32.611382 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.611360 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/08f83bac-6851-4477-b728-3ad83bff3c69-signing-key\") pod \"service-ca-865cb79987-lzmxg\" (UID: \"08f83bac-6851-4477-b728-3ad83bff3c69\") " pod="openshift-service-ca/service-ca-865cb79987-lzmxg" Apr 16 21:00:32.617230 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.617211 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b9s9\" (UniqueName: \"kubernetes.io/projected/08f83bac-6851-4477-b728-3ad83bff3c69-kube-api-access-5b9s9\") pod \"service-ca-865cb79987-lzmxg\" (UID: \"08f83bac-6851-4477-b728-3ad83bff3c69\") " pod="openshift-service-ca/service-ca-865cb79987-lzmxg" Apr 16 21:00:32.728933 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.728898 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-lzmxg" Apr 16 21:00:32.846068 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.846038 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-lzmxg"] Apr 16 21:00:32.849736 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:00:32.849711 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08f83bac_6851_4477_b728_3ad83bff3c69.slice/crio-713e8e21b59d60adfbfe89db2b9e0be3bd312503678847bc95e93387c826bc6e WatchSource:0}: Error finding container 713e8e21b59d60adfbfe89db2b9e0be3bd312503678847bc95e93387c826bc6e: Status 404 returned error can't find the container with id 713e8e21b59d60adfbfe89db2b9e0be3bd312503678847bc95e93387c826bc6e Apr 16 21:00:32.954043 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.953978 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-lzmxg" event={"ID":"08f83bac-6851-4477-b728-3ad83bff3c69","Type":"ContainerStarted","Data":"d22c804b4799df8315cc330fc146f6febaef174ec861227c6e0948b0335d1356"} Apr 16 21:00:32.954043 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.954047 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-lzmxg" event={"ID":"08f83bac-6851-4477-b728-3ad83bff3c69","Type":"ContainerStarted","Data":"713e8e21b59d60adfbfe89db2b9e0be3bd312503678847bc95e93387c826bc6e"} Apr 16 21:00:32.977561 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:32.977500 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-lzmxg" podStartSLOduration=0.977486019 podStartE2EDuration="977.486019ms" podCreationTimestamp="2026-04-16 21:00:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:00:32.975883089 +0000 UTC m=+135.994193935" watchObservedRunningTime="2026-04-16 21:00:32.977486019 +0000 UTC m=+135.995796890" Apr 16 21:00:34.625727 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:34.625688 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:34.626194 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:34.625759 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:34.626194 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:34.625801 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j85jc\" (UID: \"b29e3035-4704-443b-ba1e-f485ad77b3c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:34.626194 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:34.625862 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle podName:9af0bfa4-fd46-49df-973a-66814186f9c6 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:50.625842906 +0000 UTC m=+153.644153762 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle") pod "router-default-7b575bfc5d-gbhzj" (UID: "9af0bfa4-fd46-49df-973a-66814186f9c6") : configmap references non-existent config key: service-ca.crt Apr 16 21:00:34.626194 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:34.625870 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 21:00:34.626194 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:34.625918 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs podName:9af0bfa4-fd46-49df-973a-66814186f9c6 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:50.62590666 +0000 UTC m=+153.644217504 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs") pod "router-default-7b575bfc5d-gbhzj" (UID: "9af0bfa4-fd46-49df-973a-66814186f9c6") : secret "router-metrics-certs-default" not found Apr 16 21:00:34.626194 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:34.625937 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 21:00:34.626194 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:34.626009 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls podName:b29e3035-4704-443b-ba1e-f485ad77b3c5 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:50.625973361 +0000 UTC m=+153.644284186 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-j85jc" (UID: "b29e3035-4704-443b-ba1e-f485ad77b3c5") : secret "cluster-monitoring-operator-tls" not found Apr 16 21:00:50.649885 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.649845 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j85jc\" (UID: \"b29e3035-4704-443b-ba1e-f485ad77b3c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:50.650374 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.649920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:50.650374 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.649956 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:50.650524 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.650503 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9af0bfa4-fd46-49df-973a-66814186f9c6-service-ca-bundle\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:50.652422 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.652404 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9af0bfa4-fd46-49df-973a-66814186f9c6-metrics-certs\") pod \"router-default-7b575bfc5d-gbhzj\" (UID: \"9af0bfa4-fd46-49df-973a-66814186f9c6\") " pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:50.652469 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.652405 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b29e3035-4704-443b-ba1e-f485ad77b3c5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-j85jc\" (UID: \"b29e3035-4704-443b-ba1e-f485ad77b3c5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:50.847531 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.847500 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-v4wf2"] Apr 16 21:00:50.880459 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.880431 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v4wf2"] Apr 16 21:00:50.880635 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.880579 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:50.884444 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.884426 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-b7pz7\"" Apr 16 21:00:50.885564 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.885550 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 21:00:50.885625 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.885554 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 21:00:50.943402 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.943341 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" Apr 16 21:00:50.950071 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.950046 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:50.952966 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.952940 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rgn9\" (UniqueName: \"kubernetes.io/projected/0f4f5f3f-e462-4f22-8504-2743600ff619-kube-api-access-9rgn9\") pod \"insights-runtime-extractor-v4wf2\" (UID: \"0f4f5f3f-e462-4f22-8504-2743600ff619\") " pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:50.953135 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.953009 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0f4f5f3f-e462-4f22-8504-2743600ff619-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v4wf2\" (UID: \"0f4f5f3f-e462-4f22-8504-2743600ff619\") " pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:50.953135 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.953042 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0f4f5f3f-e462-4f22-8504-2743600ff619-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v4wf2\" (UID: \"0f4f5f3f-e462-4f22-8504-2743600ff619\") " pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:50.953135 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.953069 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0f4f5f3f-e462-4f22-8504-2743600ff619-crio-socket\") pod \"insights-runtime-extractor-v4wf2\" (UID: \"0f4f5f3f-e462-4f22-8504-2743600ff619\") " pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:50.953135 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.953127 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0f4f5f3f-e462-4f22-8504-2743600ff619-data-volume\") pod \"insights-runtime-extractor-v4wf2\" (UID: \"0f4f5f3f-e462-4f22-8504-2743600ff619\") " pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:50.957411 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.957371 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-md9h2"] Apr 16 21:00:50.986355 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.986250 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-md9h2" Apr 16 21:00:50.990226 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.990147 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-9djwz\"" Apr 16 21:00:50.990394 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.990290 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 21:00:50.990471 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.990397 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 21:00:50.994942 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:50.994884 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-md9h2"] Apr 16 21:00:51.055025 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.054392 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56pnv\" (UniqueName: \"kubernetes.io/projected/d4fdbd49-f42a-49e0-befb-6a581b41d609-kube-api-access-56pnv\") pod \"downloads-6bcc868b7-md9h2\" (UID: \"d4fdbd49-f42a-49e0-befb-6a581b41d609\") " pod="openshift-console/downloads-6bcc868b7-md9h2" Apr 16 21:00:51.055025 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.054449 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rgn9\" (UniqueName: \"kubernetes.io/projected/0f4f5f3f-e462-4f22-8504-2743600ff619-kube-api-access-9rgn9\") pod \"insights-runtime-extractor-v4wf2\" (UID: \"0f4f5f3f-e462-4f22-8504-2743600ff619\") " pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:51.055025 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.054473 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0f4f5f3f-e462-4f22-8504-2743600ff619-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v4wf2\" (UID: \"0f4f5f3f-e462-4f22-8504-2743600ff619\") " pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:51.055025 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.054490 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0f4f5f3f-e462-4f22-8504-2743600ff619-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v4wf2\" (UID: \"0f4f5f3f-e462-4f22-8504-2743600ff619\") " pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:51.055025 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.054506 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0f4f5f3f-e462-4f22-8504-2743600ff619-crio-socket\") pod \"insights-runtime-extractor-v4wf2\" (UID: \"0f4f5f3f-e462-4f22-8504-2743600ff619\") " pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:51.055025 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.054538 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0f4f5f3f-e462-4f22-8504-2743600ff619-data-volume\") pod \"insights-runtime-extractor-v4wf2\" (UID: \"0f4f5f3f-e462-4f22-8504-2743600ff619\") " pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:51.055383 ip-10-0-139-17 kubenswrapper[2579]: 
I0416 21:00:51.055134 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0f4f5f3f-e462-4f22-8504-2743600ff619-crio-socket\") pod \"insights-runtime-extractor-v4wf2\" (UID: \"0f4f5f3f-e462-4f22-8504-2743600ff619\") " pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:51.055608 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.055582 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0f4f5f3f-e462-4f22-8504-2743600ff619-data-volume\") pod \"insights-runtime-extractor-v4wf2\" (UID: \"0f4f5f3f-e462-4f22-8504-2743600ff619\") " pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:51.057438 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.057417 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0f4f5f3f-e462-4f22-8504-2743600ff619-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v4wf2\" (UID: \"0f4f5f3f-e462-4f22-8504-2743600ff619\") " pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:51.068196 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.068177 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0f4f5f3f-e462-4f22-8504-2743600ff619-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v4wf2\" (UID: \"0f4f5f3f-e462-4f22-8504-2743600ff619\") " pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:51.070023 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.069982 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rgn9\" (UniqueName: \"kubernetes.io/projected/0f4f5f3f-e462-4f22-8504-2743600ff619-kube-api-access-9rgn9\") pod \"insights-runtime-extractor-v4wf2\" (UID: \"0f4f5f3f-e462-4f22-8504-2743600ff619\") " pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:51.081135 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.081117 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc"] Apr 16 21:00:51.084976 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:00:51.084950 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb29e3035_4704_443b_ba1e_f485ad77b3c5.slice/crio-f893fbdfc11125dfa78c6fac1f1f38d8454123ab398af3e603d4faeb9b4c81e4 WatchSource:0}: Error finding container f893fbdfc11125dfa78c6fac1f1f38d8454123ab398af3e603d4faeb9b4c81e4: Status 404 returned error can't find the container with id f893fbdfc11125dfa78c6fac1f1f38d8454123ab398af3e603d4faeb9b4c81e4 Apr 16 21:00:51.111882 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.111859 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7b575bfc5d-gbhzj"] Apr 16 21:00:51.114253 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:00:51.114228 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9af0bfa4_fd46_49df_973a_66814186f9c6.slice/crio-6f98de808a92420bebf1ce09098dfe6258efe21a56f178c1eafefda0e025ff41 WatchSource:0}: Error finding container 6f98de808a92420bebf1ce09098dfe6258efe21a56f178c1eafefda0e025ff41: Status 404 returned error can't find the container with id 
6f98de808a92420bebf1ce09098dfe6258efe21a56f178c1eafefda0e025ff41 Apr 16 21:00:51.155363 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.155340 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56pnv\" (UniqueName: \"kubernetes.io/projected/d4fdbd49-f42a-49e0-befb-6a581b41d609-kube-api-access-56pnv\") pod \"downloads-6bcc868b7-md9h2\" (UID: \"d4fdbd49-f42a-49e0-befb-6a581b41d609\") " pod="openshift-console/downloads-6bcc868b7-md9h2" Apr 16 21:00:51.167191 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.167168 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56pnv\" (UniqueName: \"kubernetes.io/projected/d4fdbd49-f42a-49e0-befb-6a581b41d609-kube-api-access-56pnv\") pod \"downloads-6bcc868b7-md9h2\" (UID: \"d4fdbd49-f42a-49e0-befb-6a581b41d609\") " pod="openshift-console/downloads-6bcc868b7-md9h2" Apr 16 21:00:51.189942 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.189923 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v4wf2" Apr 16 21:00:51.302544 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.302518 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-md9h2" Apr 16 21:00:51.313896 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.313867 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v4wf2"] Apr 16 21:00:51.316962 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:00:51.316931 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f4f5f3f_e462_4f22_8504_2743600ff619.slice/crio-d71b37796ec7f323b7793262fc84b446271d819c51b41b89c4af68ee974fd659 WatchSource:0}: Error finding container d71b37796ec7f323b7793262fc84b446271d819c51b41b89c4af68ee974fd659: Status 404 returned error can't find the container with id d71b37796ec7f323b7793262fc84b446271d819c51b41b89c4af68ee974fd659 Apr 16 21:00:51.429929 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:51.429889 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-md9h2"] Apr 16 21:00:51.433423 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:00:51.433398 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4fdbd49_f42a_49e0_befb_6a581b41d609.slice/crio-2f57ab7a7617b77d2fbaee5eb39b829aecaa8a0a8bbf88105833649551d638a2 WatchSource:0}: Error finding container 2f57ab7a7617b77d2fbaee5eb39b829aecaa8a0a8bbf88105833649551d638a2: Status 404 returned error can't find the container with id 2f57ab7a7617b77d2fbaee5eb39b829aecaa8a0a8bbf88105833649551d638a2 Apr 16 21:00:52.006444 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.006332 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-md9h2" event={"ID":"d4fdbd49-f42a-49e0-befb-6a581b41d609","Type":"ContainerStarted","Data":"2f57ab7a7617b77d2fbaee5eb39b829aecaa8a0a8bbf88105833649551d638a2"} Apr 16 21:00:52.007683 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.007649 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" event={"ID":"b29e3035-4704-443b-ba1e-f485ad77b3c5","Type":"ContainerStarted","Data":"f893fbdfc11125dfa78c6fac1f1f38d8454123ab398af3e603d4faeb9b4c81e4"} Apr 16 21:00:52.010121 
ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.009799 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" event={"ID":"9af0bfa4-fd46-49df-973a-66814186f9c6","Type":"ContainerStarted","Data":"46153d00707d047cd84a3224aab6f38f9ade4a2f26859530b612237723ff3541"} Apr 16 21:00:52.010121 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.009833 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" event={"ID":"9af0bfa4-fd46-49df-973a-66814186f9c6","Type":"ContainerStarted","Data":"6f98de808a92420bebf1ce09098dfe6258efe21a56f178c1eafefda0e025ff41"} Apr 16 21:00:52.011598 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.011573 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v4wf2" event={"ID":"0f4f5f3f-e462-4f22-8504-2743600ff619","Type":"ContainerStarted","Data":"08775ce1207461a6492b3bf7d14c321ed6379efe74591cf58223d5ea166ac78f"} Apr 16 21:00:52.011704 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.011606 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v4wf2" event={"ID":"0f4f5f3f-e462-4f22-8504-2743600ff619","Type":"ContainerStarted","Data":"d71b37796ec7f323b7793262fc84b446271d819c51b41b89c4af68ee974fd659"} Apr 16 21:00:52.032749 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.032694 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" podStartSLOduration=34.032677098 podStartE2EDuration="34.032677098s" podCreationTimestamp="2026-04-16 21:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:00:52.032127889 +0000 UTC m=+155.050438739" watchObservedRunningTime="2026-04-16 21:00:52.032677098 +0000 UTC m=+155.050987947" Apr 16 21:00:52.633794 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.633707 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b5cdf7dfd-wsszn"] Apr 16 21:00:52.638506 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.638481 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.641570 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.641536 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 21:00:52.643032 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.642979 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 21:00:52.643276 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.643250 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 21:00:52.643495 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.643477 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 21:00:52.643729 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.643712 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 21:00:52.643975 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.643955 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-vpj6c\"" Apr 16 21:00:52.650867 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.650843 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b5cdf7dfd-wsszn"] Apr 16 21:00:52.673592 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.673563 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-oauth-config\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.673749 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.673622 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-serving-cert\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.673749 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.673646 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-config\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.673749 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.673697 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-oauth-serving-cert\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.673900 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.673748 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd2f8\" (UniqueName: \"kubernetes.io/projected/1d4ba783-d587-4d20-b3e3-33e1b55302c3-kube-api-access-xd2f8\") pod 
\"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.673900 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.673782 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-service-ca\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.774755 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.774721 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-oauth-config\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.774950 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.774786 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-serving-cert\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.774950 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.774820 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-config\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.774950 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.774856 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-oauth-serving-cert\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.774950 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.774912 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xd2f8\" (UniqueName: \"kubernetes.io/projected/1d4ba783-d587-4d20-b3e3-33e1b55302c3-kube-api-access-xd2f8\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.774950 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.774947 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-service-ca\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.776476 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.776446 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-oauth-serving-cert\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.777554 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.777503 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-service-ca\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.777674 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.777655 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-config\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.777874 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.777850 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-serving-cert\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.777955 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.777938 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-oauth-config\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.788092 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.788066 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd2f8\" (UniqueName: \"kubernetes.io/projected/1d4ba783-d587-4d20-b3e3-33e1b55302c3-kube-api-access-xd2f8\") pod \"console-7b5cdf7dfd-wsszn\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:52.950608 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.950522 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:52.953503 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.953479 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:52.954497 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:52.954472 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:00:53.016341 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:53.016287 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v4wf2" event={"ID":"0f4f5f3f-e462-4f22-8504-2743600ff619","Type":"ContainerStarted","Data":"4c463b302fd82b853f75e933bd5b5806713b5d780c11a25a7620e46654727222"} Apr 16 21:00:53.016806 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:53.016787 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:53.017976 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:53.017954 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7b575bfc5d-gbhzj" Apr 16 21:00:53.283247 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:53.283196 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b5cdf7dfd-wsszn"] Apr 16 21:00:53.287794 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:00:53.287763 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d4ba783_d587_4d20_b3e3_33e1b55302c3.slice/crio-9bd97b143bd552ec7e59d38dc92865bcbed48dfb067d57bfab12257dd22796e9 WatchSource:0}: Error finding container 9bd97b143bd552ec7e59d38dc92865bcbed48dfb067d57bfab12257dd22796e9: Status 404 returned error can't find the container with id 9bd97b143bd552ec7e59d38dc92865bcbed48dfb067d57bfab12257dd22796e9 Apr 16 21:00:53.312937 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:53.312900 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lvrfd" podUID="9275ffeb-7ec4-4699-976c-7ef980230018" Apr 16 21:00:53.322134 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:53.322101 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-hv5xj" podUID="ebb8fc4a-f559-45f4-be07-93cd44e25e3a" Apr 16 21:00:53.574089 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:00:53.574047 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-rgzx9" podUID="24ed89ef-c93c-40fd-a75f-2f3fd7582359" Apr 16 21:00:54.021011 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:54.020961 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" event={"ID":"b29e3035-4704-443b-ba1e-f485ad77b3c5","Type":"ContainerStarted","Data":"1d97937e322da4cdfd46775d19e111d64ad654e8e7ae2b485d10a3bd1b4384de"} Apr 16 21:00:54.023553 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:54.023525 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b5cdf7dfd-wsszn" event={"ID":"1d4ba783-d587-4d20-b3e3-33e1b55302c3","Type":"ContainerStarted","Data":"9bd97b143bd552ec7e59d38dc92865bcbed48dfb067d57bfab12257dd22796e9"} Apr 16 21:00:54.023682 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:54.023605 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lvrfd" Apr 16 21:00:54.023895 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:54.023867 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 21:00:54.039443 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:54.039373 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-j85jc" podStartSLOduration=33.939011582 podStartE2EDuration="36.039357389s" podCreationTimestamp="2026-04-16 21:00:18 +0000 UTC" firstStartedPulling="2026-04-16 21:00:51.086775882 +0000 UTC m=+154.105086708" lastFinishedPulling="2026-04-16 21:00:53.187121674 +0000 UTC m=+156.205432515" observedRunningTime="2026-04-16 21:00:54.039064156 +0000 UTC m=+157.057375004" watchObservedRunningTime="2026-04-16 21:00:54.039357389 +0000 UTC m=+157.057668239" Apr 16 21:00:55.028667 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:55.028631 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v4wf2" event={"ID":"0f4f5f3f-e462-4f22-8504-2743600ff619","Type":"ContainerStarted","Data":"3eb278bad6f707b6d7d303978c79abc821e3b993f4de34cb1b095067eb362402"} Apr 16 21:00:55.049212 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:55.049149 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-v4wf2" podStartSLOduration=2.043662968 podStartE2EDuration="5.049129424s" podCreationTimestamp="2026-04-16 21:00:50 +0000 UTC" firstStartedPulling="2026-04-16 21:00:51.419088567 +0000 UTC m=+154.437399392" lastFinishedPulling="2026-04-16 21:00:54.424555019 +0000 UTC m=+157.442865848" observedRunningTime="2026-04-16 21:00:55.047140169 +0000 UTC m=+158.065451018" watchObservedRunningTime="2026-04-16 21:00:55.049129424 +0000 UTC m=+158.067440273" Apr 16 21:00:57.783355 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.783317 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-p9sbl"] Apr 16 21:00:57.786600 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.786576 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" Apr 16 21:00:57.790139 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.790060 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 21:00:57.790139 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.790127 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 21:00:57.790342 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.790311 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-rrpck\"" Apr 16 21:00:57.790403 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.790370 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 21:00:57.800610 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.800589 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-p9sbl"] Apr 16 21:00:57.824052 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.824017 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-p9sbl\" (UID: \"7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" Apr 16 21:00:57.824174 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.824080 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7qdj\" (UniqueName: \"kubernetes.io/projected/7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694-kube-api-access-p7qdj\") pod \"prometheus-operator-5676c8c784-p9sbl\" (UID: \"7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" Apr 16 21:00:57.824174 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.824146 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-p9sbl\" (UID: \"7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" Apr 16 21:00:57.824266 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.824224 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-p9sbl\" (UID: \"7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" Apr 16 21:00:57.925032 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.924982 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-p9sbl\" (UID: \"7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" Apr 16 21:00:57.925182 ip-10-0-139-17 
kubenswrapper[2579]: I0416 21:00:57.925058 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7qdj\" (UniqueName: \"kubernetes.io/projected/7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694-kube-api-access-p7qdj\") pod \"prometheus-operator-5676c8c784-p9sbl\" (UID: \"7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" Apr 16 21:00:57.925182 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.925090 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-p9sbl\" (UID: \"7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" Apr 16 21:00:57.925182 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.925167 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-p9sbl\" (UID: \"7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" Apr 16 21:00:57.926330 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.926304 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-p9sbl\" (UID: \"7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" Apr 16 21:00:57.927960 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.927926 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-p9sbl\" (UID: \"7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" Apr 16 21:00:57.928182 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.928166 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-p9sbl\" (UID: \"7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" Apr 16 21:00:57.934563 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:57.934539 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7qdj\" (UniqueName: \"kubernetes.io/projected/7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694-kube-api-access-p7qdj\") pod \"prometheus-operator-5676c8c784-p9sbl\" (UID: \"7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" Apr 16 21:00:58.038702 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:58.038619 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b5cdf7dfd-wsszn" event={"ID":"1d4ba783-d587-4d20-b3e3-33e1b55302c3","Type":"ContainerStarted","Data":"680595bcb374814d748c9bb41ebc5186a97e9180c719aef02d131b042f149aeb"} Apr 16 21:00:58.056280 ip-10-0-139-17 kubenswrapper[2579]: I0416 
21:00:58.056236 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b5cdf7dfd-wsszn" podStartSLOduration=2.251005946 podStartE2EDuration="6.056217564s" podCreationTimestamp="2026-04-16 21:00:52 +0000 UTC" firstStartedPulling="2026-04-16 21:00:53.290178438 +0000 UTC m=+156.308489267" lastFinishedPulling="2026-04-16 21:00:57.095390057 +0000 UTC m=+160.113700885" observedRunningTime="2026-04-16 21:00:58.055710894 +0000 UTC m=+161.074021766" watchObservedRunningTime="2026-04-16 21:00:58.056217564 +0000 UTC m=+161.074528412" Apr 16 21:00:58.097271 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:58.097240 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" Apr 16 21:00:58.127950 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:58.127346 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 21:00:58.131448 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:58.131414 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9275ffeb-7ec4-4699-976c-7ef980230018-metrics-tls\") pod \"dns-default-lvrfd\" (UID: \"9275ffeb-7ec4-4699-976c-7ef980230018\") " pod="openshift-dns/dns-default-lvrfd" Apr 16 21:00:58.227769 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:58.227736 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kfmp4\"" Apr 16 21:00:58.227953 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:58.227780 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert\") pod \"ingress-canary-hv5xj\" (UID: \"ebb8fc4a-f559-45f4-be07-93cd44e25e3a\") " pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 21:00:58.230828 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:58.230802 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebb8fc4a-f559-45f4-be07-93cd44e25e3a-cert\") pod \"ingress-canary-hv5xj\" (UID: \"ebb8fc4a-f559-45f4-be07-93cd44e25e3a\") " pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 21:00:58.233239 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:58.233219 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-p9sbl"] Apr 16 21:00:58.235266 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:58.235244 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lvrfd" Apr 16 21:00:58.235611 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:00:58.235581 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a41b6ff_fd13_4b8b_9c2c_2bdf9179f694.slice/crio-0c0fc9690b74262435622711843bd53609e3b4aa6b96586f11216387f41c4e4b WatchSource:0}: Error finding container 0c0fc9690b74262435622711843bd53609e3b4aa6b96586f11216387f41c4e4b: Status 404 returned error can't find the container with id 0c0fc9690b74262435622711843bd53609e3b4aa6b96586f11216387f41c4e4b Apr 16 21:00:58.366879 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:58.366846 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lvrfd"] Apr 16 21:00:58.370118 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:00:58.370087 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9275ffeb_7ec4_4699_976c_7ef980230018.slice/crio-2b584dde3157447f5a366a0796d57688fb5a2b92502b0fafea321445658f1035 WatchSource:0}: Error finding container 2b584dde3157447f5a366a0796d57688fb5a2b92502b0fafea321445658f1035: Status 404 returned error can't find the container with id 2b584dde3157447f5a366a0796d57688fb5a2b92502b0fafea321445658f1035 Apr 16 21:00:58.527807 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:58.527773 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mhzpf\"" Apr 16 21:00:58.535233 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:58.535208 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hv5xj" Apr 16 21:00:58.662500 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:58.662474 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hv5xj"] Apr 16 21:00:58.665933 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:00:58.665901 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebb8fc4a_f559_45f4_be07_93cd44e25e3a.slice/crio-603a663df0bcab3eee35c1cd62aaf263662e2aaedd406f362448fe8f35b896d4 WatchSource:0}: Error finding container 603a663df0bcab3eee35c1cd62aaf263662e2aaedd406f362448fe8f35b896d4: Status 404 returned error can't find the container with id 603a663df0bcab3eee35c1cd62aaf263662e2aaedd406f362448fe8f35b896d4 Apr 16 21:00:59.043515 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:59.043451 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hv5xj" event={"ID":"ebb8fc4a-f559-45f4-be07-93cd44e25e3a","Type":"ContainerStarted","Data":"603a663df0bcab3eee35c1cd62aaf263662e2aaedd406f362448fe8f35b896d4"} Apr 16 21:00:59.045090 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:59.045053 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lvrfd" event={"ID":"9275ffeb-7ec4-4699-976c-7ef980230018","Type":"ContainerStarted","Data":"2b584dde3157447f5a366a0796d57688fb5a2b92502b0fafea321445658f1035"} Apr 16 21:00:59.048538 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:00:59.048230 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" event={"ID":"7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694","Type":"ContainerStarted","Data":"0c0fc9690b74262435622711843bd53609e3b4aa6b96586f11216387f41c4e4b"} Apr 16 
21:01:01.059426 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:01.059068 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lvrfd" event={"ID":"9275ffeb-7ec4-4699-976c-7ef980230018","Type":"ContainerStarted","Data":"664f34a3420aff9cec31fd3d46991b1e8546b0e2e864cbc6504b04eefaae9037"} Apr 16 21:01:01.059426 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:01.059115 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lvrfd" event={"ID":"9275ffeb-7ec4-4699-976c-7ef980230018","Type":"ContainerStarted","Data":"fc00475cc6307f454ba5c69d1199884cd3feb46144629744cbb8a1a2459ca9d1"} Apr 16 21:01:01.059426 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:01.059362 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lvrfd" Apr 16 21:01:01.061331 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:01.061298 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" event={"ID":"7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694","Type":"ContainerStarted","Data":"a8f0b66d260f96f52530e9141dd11639c136c3255793edfed8b813114111f47f"} Apr 16 21:01:01.061331 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:01.061333 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" event={"ID":"7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694","Type":"ContainerStarted","Data":"be5c1a52c98ed07819e6bef4c1290f8bb06d7a1406f4d7bb7d8fa6ecb1819103"} Apr 16 21:01:01.079666 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:01.079621 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lvrfd" podStartSLOduration=129.368900287 podStartE2EDuration="2m11.079609299s" podCreationTimestamp="2026-04-16 20:58:50 +0000 UTC" firstStartedPulling="2026-04-16 21:00:58.372479808 +0000 UTC m=+161.390790639" lastFinishedPulling="2026-04-16 21:01:00.083188825 +0000 UTC m=+163.101499651" observedRunningTime="2026-04-16 21:01:01.078023687 +0000 UTC m=+164.096334526" watchObservedRunningTime="2026-04-16 21:01:01.079609299 +0000 UTC m=+164.097920139" Apr 16 21:01:01.101744 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:01.101693 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-p9sbl" podStartSLOduration=2.2619122210000002 podStartE2EDuration="4.101556344s" podCreationTimestamp="2026-04-16 21:00:57 +0000 UTC" firstStartedPulling="2026-04-16 21:00:58.239900679 +0000 UTC m=+161.258211504" lastFinishedPulling="2026-04-16 21:01:00.079544798 +0000 UTC m=+163.097855627" observedRunningTime="2026-04-16 21:01:01.099668534 +0000 UTC m=+164.117979382" watchObservedRunningTime="2026-04-16 21:01:01.101556344 +0000 UTC m=+164.119867196" Apr 16 21:01:02.954645 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:02.954603 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:01:02.955124 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:02.954668 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:01:02.960302 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:02.960069 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:01:03.071728 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.071689 2579 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:01:03.188900 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.188872 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tz7dk"] Apr 16 21:01:03.192677 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.192655 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.196649 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.196247 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 21:01:03.196649 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.196458 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 21:01:03.196649 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.196504 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-l4vt2\"" Apr 16 21:01:03.197033 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.197012 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 21:01:03.272821 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.272775 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/87590fab-d517-4092-9065-f48998609b50-node-exporter-tls\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.272974 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.272828 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/87590fab-d517-4092-9065-f48998609b50-node-exporter-textfile\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.272974 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.272861 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/87590fab-d517-4092-9065-f48998609b50-root\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.272974 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.272891 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/87590fab-d517-4092-9065-f48998609b50-node-exporter-accelerators-collector-config\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.272974 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.272938 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/87590fab-d517-4092-9065-f48998609b50-node-exporter-wtmp\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 
16 21:01:03.272974 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.272969 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87590fab-d517-4092-9065-f48998609b50-sys\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.273368 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.273029 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swp7v\" (UniqueName: \"kubernetes.io/projected/87590fab-d517-4092-9065-f48998609b50-kube-api-access-swp7v\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.273368 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.273136 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87590fab-d517-4092-9065-f48998609b50-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.273368 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.273213 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87590fab-d517-4092-9065-f48998609b50-metrics-client-ca\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.374362 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.374295 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87590fab-d517-4092-9065-f48998609b50-sys\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.374362 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.374352 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swp7v\" (UniqueName: \"kubernetes.io/projected/87590fab-d517-4092-9065-f48998609b50-kube-api-access-swp7v\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.374591 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.374391 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87590fab-d517-4092-9065-f48998609b50-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.374591 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.374426 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87590fab-d517-4092-9065-f48998609b50-sys\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.374591 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.374449 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/87590fab-d517-4092-9065-f48998609b50-metrics-client-ca\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.374591 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.374499 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/87590fab-d517-4092-9065-f48998609b50-node-exporter-tls\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.374591 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.374546 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/87590fab-d517-4092-9065-f48998609b50-node-exporter-textfile\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.374591 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.374576 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/87590fab-d517-4092-9065-f48998609b50-root\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.374876 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.374605 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/87590fab-d517-4092-9065-f48998609b50-node-exporter-accelerators-collector-config\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.374876 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.374658 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/87590fab-d517-4092-9065-f48998609b50-node-exporter-wtmp\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.374876 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.374789 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/87590fab-d517-4092-9065-f48998609b50-node-exporter-wtmp\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.375089 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:01:03.374875 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 21:01:03.375089 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.374897 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/87590fab-d517-4092-9065-f48998609b50-root\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.375089 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:01:03.374929 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87590fab-d517-4092-9065-f48998609b50-node-exporter-tls podName:87590fab-d517-4092-9065-f48998609b50 nodeName:}" failed. 
No retries permitted until 2026-04-16 21:01:03.874910835 +0000 UTC m=+166.893221663 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/87590fab-d517-4092-9065-f48998609b50-node-exporter-tls") pod "node-exporter-tz7dk" (UID: "87590fab-d517-4092-9065-f48998609b50") : secret "node-exporter-tls" not found Apr 16 21:01:03.375089 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.375048 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87590fab-d517-4092-9065-f48998609b50-metrics-client-ca\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.375302 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.375211 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/87590fab-d517-4092-9065-f48998609b50-node-exporter-textfile\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.375456 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.375413 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/87590fab-d517-4092-9065-f48998609b50-node-exporter-accelerators-collector-config\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.377702 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.377683 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87590fab-d517-4092-9065-f48998609b50-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.387161 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.387114 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swp7v\" (UniqueName: \"kubernetes.io/projected/87590fab-d517-4092-9065-f48998609b50-kube-api-access-swp7v\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.878567 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.878534 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/87590fab-d517-4092-9065-f48998609b50-node-exporter-tls\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:03.881518 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:03.881478 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/87590fab-d517-4092-9065-f48998609b50-node-exporter-tls\") pod \"node-exporter-tz7dk\" (UID: \"87590fab-d517-4092-9065-f48998609b50\") " pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:04.105381 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:04.105313 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-tz7dk" Apr 16 21:01:04.562681 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:04.562638 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 21:01:06.151050 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.150971 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv"] Apr 16 21:01:06.155025 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.154975 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.159124 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.158729 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 21:01:06.159124 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.158814 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 21:01:06.159124 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.158823 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-19t3besbf1tka\"" Apr 16 21:01:06.159124 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.158816 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 21:01:06.159124 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.158960 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 21:01:06.159412 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.159150 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 21:01:06.159520 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.159500 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-mxq5q\"" Apr 16 21:01:06.167113 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.167091 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv"] Apr 16 21:01:06.300587 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.300550 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-thanos-querier-tls\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.300771 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.300602 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-grpc-tls\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.300771 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.300701 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.300771 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.300740 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.300943 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.300818 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39136697-7696-4144-afb2-1cd8a2e847c7-metrics-client-ca\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.300943 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.300852 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbq7s\" (UniqueName: \"kubernetes.io/projected/39136697-7696-4144-afb2-1cd8a2e847c7-kube-api-access-bbq7s\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.300943 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.300895 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.301071 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.300955 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.402167 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.402070 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.402167 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.402130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod 
\"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.402377 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.402204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39136697-7696-4144-afb2-1cd8a2e847c7-metrics-client-ca\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.402377 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.402237 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbq7s\" (UniqueName: \"kubernetes.io/projected/39136697-7696-4144-afb2-1cd8a2e847c7-kube-api-access-bbq7s\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.402377 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.402268 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.402377 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.402302 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.402377 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.402340 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-thanos-querier-tls\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.402377 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.402372 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-grpc-tls\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.403709 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.403677 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39136697-7696-4144-afb2-1cd8a2e847c7-metrics-client-ca\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.405498 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.405452 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.406220 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.406195 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.406319 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.406249 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-grpc-tls\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.406662 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.406625 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-thanos-querier-tls\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.406788 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.406765 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.408433 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.408410 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/39136697-7696-4144-afb2-1cd8a2e847c7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.418503 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.418481 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbq7s\" (UniqueName: \"kubernetes.io/projected/39136697-7696-4144-afb2-1cd8a2e847c7-kube-api-access-bbq7s\") pod \"thanos-querier-6f9ff59d98-rfmpv\" (UID: \"39136697-7696-4144-afb2-1cd8a2e847c7\") " pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:06.468420 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:06.468392 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:07.707068 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.707035 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-665555d85-gk67l"] Apr 16 21:01:07.711673 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.711641 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.714536 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.714508 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 21:01:07.714675 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.714657 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 21:01:07.714779 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.714759 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 21:01:07.714889 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.714869 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-1rfq9di9qq9sh\"" Apr 16 21:01:07.714959 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.714950 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 21:01:07.715115 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.715093 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-hmrm2\"" Apr 16 21:01:07.719441 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.719403 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-665555d85-gk67l"] Apr 16 21:01:07.816037 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.816003 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-metrics-server-audit-profiles\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.816208 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.816075 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-secret-metrics-server-client-certs\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.816208 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.816122 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n84r\" (UniqueName: \"kubernetes.io/projected/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-kube-api-access-8n84r\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.816310 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.816232 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.816367 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.816307 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-client-ca-bundle\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.816367 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.816342 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-audit-log\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.816464 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.816400 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-secret-metrics-server-tls\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.917727 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.917693 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-metrics-server-audit-profiles\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.917903 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.917738 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-secret-metrics-server-client-certs\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.917903 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.917790 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8n84r\" (UniqueName: \"kubernetes.io/projected/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-kube-api-access-8n84r\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.917903 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.917849 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.917903 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.917899 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-client-ca-bundle\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.918141 ip-10-0-139-17 kubenswrapper[2579]: I0416 
21:01:07.917929 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-audit-log\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.918141 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.917965 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-secret-metrics-server-tls\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.918879 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.918777 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-metrics-server-audit-profiles\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.919488 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.918980 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.919488 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.919305 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-audit-log\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.920862 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.920840 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-secret-metrics-server-client-certs\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.922269 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.922246 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-client-ca-bundle\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.922377 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.922363 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-secret-metrics-server-tls\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.932607 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.932586 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8n84r\" (UniqueName: \"kubernetes.io/projected/1bcca9b9-1d19-45cf-9384-55ca4eb5e043-kube-api-access-8n84r\") pod \"metrics-server-665555d85-gk67l\" (UID: \"1bcca9b9-1d19-45cf-9384-55ca4eb5e043\") " pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:07.936702 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.936685 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8xsp9"] Apr 16 21:01:07.942535 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.942511 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xsp9" Apr 16 21:01:07.946448 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.946266 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 21:01:07.946673 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.946541 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-6dsc4\"" Apr 16 21:01:07.948709 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:07.948685 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8xsp9"] Apr 16 21:01:08.018652 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.018625 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cac20c17-60af-477e-8df8-bce3e9e85927-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8xsp9\" (UID: \"cac20c17-60af-477e-8df8-bce3e9e85927\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xsp9" Apr 16 21:01:08.023510 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.023491 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:08.119932 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.119898 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cac20c17-60af-477e-8df8-bce3e9e85927-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8xsp9\" (UID: \"cac20c17-60af-477e-8df8-bce3e9e85927\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xsp9" Apr 16 21:01:08.120124 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:01:08.120091 2579 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 21:01:08.120171 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:01:08.120163 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cac20c17-60af-477e-8df8-bce3e9e85927-monitoring-plugin-cert podName:cac20c17-60af-477e-8df8-bce3e9e85927 nodeName:}" failed. No retries permitted until 2026-04-16 21:01:08.620142113 +0000 UTC m=+171.638452940 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/cac20c17-60af-477e-8df8-bce3e9e85927-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-8xsp9" (UID: "cac20c17-60af-477e-8df8-bce3e9e85927") : secret "monitoring-plugin-cert" not found Apr 16 21:01:08.624022 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.623964 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cac20c17-60af-477e-8df8-bce3e9e85927-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8xsp9\" (UID: \"cac20c17-60af-477e-8df8-bce3e9e85927\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xsp9" Apr 16 21:01:08.626896 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.626865 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cac20c17-60af-477e-8df8-bce3e9e85927-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8xsp9\" (UID: \"cac20c17-60af-477e-8df8-bce3e9e85927\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xsp9" Apr 16 21:01:08.852245 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.852206 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-ff64d865d-nh545"] Apr 16 21:01:08.854934 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.854906 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xsp9" Apr 16 21:01:08.855900 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.855882 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:08.865234 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.865202 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 21:01:08.866694 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.866668 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ff64d865d-nh545"] Apr 16 21:01:08.904506 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:01:08.904467 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87590fab_d517_4092_9065_f48998609b50.slice/crio-8a9e267bda80489e79305b08bdbfa1169a600cfc55a8bf21b9f2c4977eee4389 WatchSource:0}: Error finding container 8a9e267bda80489e79305b08bdbfa1169a600cfc55a8bf21b9f2c4977eee4389: Status 404 returned error can't find the container with id 8a9e267bda80489e79305b08bdbfa1169a600cfc55a8bf21b9f2c4977eee4389 Apr 16 21:01:08.929031 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.928831 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-trusted-ca-bundle\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:08.929031 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.928917 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrrzn\" (UniqueName: \"kubernetes.io/projected/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-kube-api-access-lrrzn\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " 
pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:08.929031 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.929014 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-service-ca\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:08.929259 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.929080 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-config\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:08.929259 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.929119 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-oauth-config\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:08.929259 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.929163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-oauth-serving-cert\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:08.929259 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:08.929248 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-serving-cert\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.031872 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.031443 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-serving-cert\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.031872 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.031555 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-trusted-ca-bundle\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.031872 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.031604 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrrzn\" (UniqueName: \"kubernetes.io/projected/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-kube-api-access-lrrzn\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.031872 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.031653 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-service-ca\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.031872 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.031692 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-config\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.031872 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.031733 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-oauth-config\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.031872 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.031758 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-oauth-serving-cert\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.034320 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.032663 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-oauth-serving-cert\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.034320 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.033335 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-service-ca\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.034320 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.033934 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-config\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.036479 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.036434 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-trusted-ca-bundle\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.042240 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.042191 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-oauth-config\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.045320 ip-10-0-139-17 
kubenswrapper[2579]: I0416 21:01:09.045255 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrrzn\" (UniqueName: \"kubernetes.io/projected/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-kube-api-access-lrrzn\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.046453 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.046430 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-serving-cert\") pod \"console-ff64d865d-nh545\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.076833 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.076779 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv"] Apr 16 21:01:09.081750 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:01:09.081706 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39136697_7696_4144_afb2_1cd8a2e847c7.slice/crio-e3f25f845777af5dbc25279901fc9b4144104a92163dd1c9ae72367624febe94 WatchSource:0}: Error finding container e3f25f845777af5dbc25279901fc9b4144104a92163dd1c9ae72367624febe94: Status 404 returned error can't find the container with id e3f25f845777af5dbc25279901fc9b4144104a92163dd1c9ae72367624febe94 Apr 16 21:01:09.086974 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.086930 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hv5xj" event={"ID":"ebb8fc4a-f559-45f4-be07-93cd44e25e3a","Type":"ContainerStarted","Data":"ccc52dd189e4dcb45b59071dfff2349fe4003be9f380daa82659575db8e3ad9e"} Apr 16 21:01:09.088152 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.088111 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" event={"ID":"39136697-7696-4144-afb2-1cd8a2e847c7","Type":"ContainerStarted","Data":"e3f25f845777af5dbc25279901fc9b4144104a92163dd1c9ae72367624febe94"} Apr 16 21:01:09.089553 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.089528 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-md9h2" event={"ID":"d4fdbd49-f42a-49e0-befb-6a581b41d609","Type":"ContainerStarted","Data":"f54ea209d3be34c935b905ae0dc58b05e19e22dae9e19d2e1e27f3f2cd7a1e9a"} Apr 16 21:01:09.089735 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.089720 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-md9h2" Apr 16 21:01:09.090589 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.090567 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tz7dk" event={"ID":"87590fab-d517-4092-9065-f48998609b50","Type":"ContainerStarted","Data":"8a9e267bda80489e79305b08bdbfa1169a600cfc55a8bf21b9f2c4977eee4389"} Apr 16 21:01:09.090884 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.090865 2579 patch_prober.go:28] interesting pod/downloads-6bcc868b7-md9h2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.15:8080/\": dial tcp 10.134.0.15:8080: connect: connection refused" start-of-body= Apr 16 21:01:09.090976 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.090906 2579 prober.go:120] 
"Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-md9h2" podUID="d4fdbd49-f42a-49e0-befb-6a581b41d609" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.15:8080/\": dial tcp 10.134.0.15:8080: connect: connection refused" Apr 16 21:01:09.094092 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.094072 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-665555d85-gk67l"] Apr 16 21:01:09.097145 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:01:09.097124 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bcca9b9_1d19_45cf_9384_55ca4eb5e043.slice/crio-35d7c60c5a464652055f96e2ce18aed262435c603513012dbb36a025f1aeec96 WatchSource:0}: Error finding container 35d7c60c5a464652055f96e2ce18aed262435c603513012dbb36a025f1aeec96: Status 404 returned error can't find the container with id 35d7c60c5a464652055f96e2ce18aed262435c603513012dbb36a025f1aeec96 Apr 16 21:01:09.114921 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.114895 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8xsp9"] Apr 16 21:01:09.115562 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.115517 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hv5xj" podStartSLOduration=128.8805784 podStartE2EDuration="2m19.115502322s" podCreationTimestamp="2026-04-16 20:58:50 +0000 UTC" firstStartedPulling="2026-04-16 21:00:58.668163068 +0000 UTC m=+161.686473897" lastFinishedPulling="2026-04-16 21:01:08.903086982 +0000 UTC m=+171.921397819" observedRunningTime="2026-04-16 21:01:09.112815735 +0000 UTC m=+172.131126581" watchObservedRunningTime="2026-04-16 21:01:09.115502322 +0000 UTC m=+172.133813169" Apr 16 21:01:09.119813 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:01:09.119774 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac20c17_60af_477e_8df8_bce3e9e85927.slice/crio-c12011a41d8eed6786fbd16beccc526f5acf61cd381fcedd544ef096f66997eb WatchSource:0}: Error finding container c12011a41d8eed6786fbd16beccc526f5acf61cd381fcedd544ef096f66997eb: Status 404 returned error can't find the container with id c12011a41d8eed6786fbd16beccc526f5acf61cd381fcedd544ef096f66997eb Apr 16 21:01:09.141892 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.141837 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-md9h2" podStartSLOduration=1.640007664 podStartE2EDuration="19.141818345s" podCreationTimestamp="2026-04-16 21:00:50 +0000 UTC" firstStartedPulling="2026-04-16 21:00:51.435251777 +0000 UTC m=+154.453562616" lastFinishedPulling="2026-04-16 21:01:08.937062463 +0000 UTC m=+171.955373297" observedRunningTime="2026-04-16 21:01:09.140043469 +0000 UTC m=+172.158354317" watchObservedRunningTime="2026-04-16 21:01:09.141818345 +0000 UTC m=+172.160129193" Apr 16 21:01:09.165574 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.165504 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:09.328390 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:09.328332 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ff64d865d-nh545"] Apr 16 21:01:09.333139 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:01:09.333099 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71c3649b_3d37_47a9_97c2_d25ea5fa28f5.slice/crio-d90bdbe425650964e663530724e7481809aa92e928f79a319cc08232ba8270a3 WatchSource:0}: Error finding container d90bdbe425650964e663530724e7481809aa92e928f79a319cc08232ba8270a3: Status 404 returned error can't find the container with id d90bdbe425650964e663530724e7481809aa92e928f79a319cc08232ba8270a3 Apr 16 21:01:10.100071 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:10.099047 2579 generic.go:358] "Generic (PLEG): container finished" podID="87590fab-d517-4092-9065-f48998609b50" containerID="be3c3322a750849e6975255e326ed46daecc9ec9accc5908c7100b4ae7440825" exitCode=0 Apr 16 21:01:10.100071 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:10.099127 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tz7dk" event={"ID":"87590fab-d517-4092-9065-f48998609b50","Type":"ContainerDied","Data":"be3c3322a750849e6975255e326ed46daecc9ec9accc5908c7100b4ae7440825"} Apr 16 21:01:10.101834 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:10.101807 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-665555d85-gk67l" event={"ID":"1bcca9b9-1d19-45cf-9384-55ca4eb5e043","Type":"ContainerStarted","Data":"35d7c60c5a464652055f96e2ce18aed262435c603513012dbb36a025f1aeec96"} Apr 16 21:01:10.103588 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:10.103504 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xsp9" event={"ID":"cac20c17-60af-477e-8df8-bce3e9e85927","Type":"ContainerStarted","Data":"c12011a41d8eed6786fbd16beccc526f5acf61cd381fcedd544ef096f66997eb"} Apr 16 21:01:10.106873 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:10.106844 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ff64d865d-nh545" event={"ID":"71c3649b-3d37-47a9-97c2-d25ea5fa28f5","Type":"ContainerStarted","Data":"bc8114ed90ed20407484d8549b04518eb623672874beeb24819fe9ed37dfcf84"} Apr 16 21:01:10.106984 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:10.106881 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ff64d865d-nh545" event={"ID":"71c3649b-3d37-47a9-97c2-d25ea5fa28f5","Type":"ContainerStarted","Data":"d90bdbe425650964e663530724e7481809aa92e928f79a319cc08232ba8270a3"} Apr 16 21:01:10.125050 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:10.125020 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-md9h2" Apr 16 21:01:10.142402 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:10.142201 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-ff64d865d-nh545" podStartSLOduration=2.142183277 podStartE2EDuration="2.142183277s" podCreationTimestamp="2026-04-16 21:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:01:10.140382504 +0000 UTC m=+173.158693351" watchObservedRunningTime="2026-04-16 21:01:10.142183277 
+0000 UTC m=+173.160494143" Apr 16 21:01:11.067188 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:11.067158 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lvrfd" Apr 16 21:01:11.113436 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:11.113294 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tz7dk" event={"ID":"87590fab-d517-4092-9065-f48998609b50","Type":"ContainerStarted","Data":"d401b0d46fc2d494bca56c2e84aeb3e8d27394b9c8f0bdcd083cfde977dfd8cb"} Apr 16 21:01:11.113436 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:11.113348 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tz7dk" event={"ID":"87590fab-d517-4092-9065-f48998609b50","Type":"ContainerStarted","Data":"ea30292cca1762b5c641b7d28488ad3e7310fae18cd531ab47547609d845b6d1"} Apr 16 21:01:11.136191 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:11.136146 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tz7dk" podStartSLOduration=7.225151481 podStartE2EDuration="8.136127095s" podCreationTimestamp="2026-04-16 21:01:03 +0000 UTC" firstStartedPulling="2026-04-16 21:01:08.906096789 +0000 UTC m=+171.924407621" lastFinishedPulling="2026-04-16 21:01:09.817072407 +0000 UTC m=+172.835383235" observedRunningTime="2026-04-16 21:01:11.134830745 +0000 UTC m=+174.153141606" watchObservedRunningTime="2026-04-16 21:01:11.136127095 +0000 UTC m=+174.154437942" Apr 16 21:01:13.123853 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:13.123819 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xsp9" event={"ID":"cac20c17-60af-477e-8df8-bce3e9e85927","Type":"ContainerStarted","Data":"21e290f46635af8d1fa8ac7433ef962c0e8285ca405be58b9209edf0b6a290da"} Apr 16 21:01:13.124234 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:13.124101 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xsp9" Apr 16 21:01:13.127451 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:13.127427 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" event={"ID":"39136697-7696-4144-afb2-1cd8a2e847c7","Type":"ContainerStarted","Data":"e91f5e8653a90a9f9b8644853e66cf6c404ec69cf34f88a2088e97b97c66d8cb"} Apr 16 21:01:13.129518 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:13.129494 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-665555d85-gk67l" event={"ID":"1bcca9b9-1d19-45cf-9384-55ca4eb5e043","Type":"ContainerStarted","Data":"2edf6f1a85384686300b13fea145d1eb96245930d64e0ed12ddc32d4a0cfe7ca"} Apr 16 21:01:13.130497 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:13.130479 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xsp9" Apr 16 21:01:13.140979 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:13.140929 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xsp9" podStartSLOduration=2.414254088 podStartE2EDuration="6.140915718s" podCreationTimestamp="2026-04-16 21:01:07 +0000 UTC" firstStartedPulling="2026-04-16 21:01:09.121784 +0000 UTC m=+172.140094826" lastFinishedPulling="2026-04-16 21:01:12.848445631 +0000 UTC m=+175.866756456" observedRunningTime="2026-04-16 
21:01:13.139971687 +0000 UTC m=+176.158282535" watchObservedRunningTime="2026-04-16 21:01:13.140915718 +0000 UTC m=+176.159226568" Apr 16 21:01:13.178618 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:13.178537 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-665555d85-gk67l" podStartSLOduration=2.429180677 podStartE2EDuration="6.178516332s" podCreationTimestamp="2026-04-16 21:01:07 +0000 UTC" firstStartedPulling="2026-04-16 21:01:09.098899491 +0000 UTC m=+172.117210316" lastFinishedPulling="2026-04-16 21:01:12.848235127 +0000 UTC m=+175.866545971" observedRunningTime="2026-04-16 21:01:13.175808426 +0000 UTC m=+176.194119278" watchObservedRunningTime="2026-04-16 21:01:13.178516332 +0000 UTC m=+176.196827180" Apr 16 21:01:14.135218 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:14.135185 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" event={"ID":"39136697-7696-4144-afb2-1cd8a2e847c7","Type":"ContainerStarted","Data":"c6c9616de51677023840cd77fefad37d5d5dcc931b2e247a7384a28cf3f4b433"} Apr 16 21:01:14.135729 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:14.135225 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" event={"ID":"39136697-7696-4144-afb2-1cd8a2e847c7","Type":"ContainerStarted","Data":"1cca7eaf885462829295178de284188ffa6013281d3c76f5b7ffc028cf77d77a"} Apr 16 21:01:14.507044 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:14.507007 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ff64d865d-nh545"] Apr 16 21:01:15.142271 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:15.142180 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" event={"ID":"39136697-7696-4144-afb2-1cd8a2e847c7","Type":"ContainerStarted","Data":"6f58c262daa6aaaa247b30bc5e7e99d63f20c34865c0b462dd4f26b3fed02d77"} Apr 16 21:01:15.142271 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:15.142225 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" event={"ID":"39136697-7696-4144-afb2-1cd8a2e847c7","Type":"ContainerStarted","Data":"fb1e523e874ac5faa798c7f3a67312b176901d630369bd6d11fcfc18770ad44e"} Apr 16 21:01:15.142271 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:15.142238 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" event={"ID":"39136697-7696-4144-afb2-1cd8a2e847c7","Type":"ContainerStarted","Data":"d470a3c588fc77e46152abd6c73feee1759d7f185c9fe8ec8ae3f0059ec9fe4c"} Apr 16 21:01:15.166041 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:15.165975 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" podStartSLOduration=3.724679538 podStartE2EDuration="9.1659576s" podCreationTimestamp="2026-04-16 21:01:06 +0000 UTC" firstStartedPulling="2026-04-16 21:01:09.083811486 +0000 UTC m=+172.102122312" lastFinishedPulling="2026-04-16 21:01:14.525089546 +0000 UTC m=+177.543400374" observedRunningTime="2026-04-16 21:01:15.164536008 +0000 UTC m=+178.182846855" watchObservedRunningTime="2026-04-16 21:01:15.1659576 +0000 UTC m=+178.184268448" Apr 16 21:01:16.145784 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:16.145749 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:19.166083 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:19.166046 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:22.156459 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:22.156429 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6f9ff59d98-rfmpv" Apr 16 21:01:25.061930 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:25.061890 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b5cdf7dfd-wsszn"] Apr 16 21:01:28.024093 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:28.024059 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:28.024531 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:28.024106 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:39.531395 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.531315 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-ff64d865d-nh545" podUID="71c3649b-3d37-47a9-97c2-d25ea5fa28f5" containerName="console" containerID="cri-o://bc8114ed90ed20407484d8549b04518eb623672874beeb24819fe9ed37dfcf84" gracePeriod=15 Apr 16 21:01:39.786164 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.786141 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ff64d865d-nh545_71c3649b-3d37-47a9-97c2-d25ea5fa28f5/console/0.log" Apr 16 21:01:39.786285 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.786213 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:39.905293 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.905266 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrrzn\" (UniqueName: \"kubernetes.io/projected/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-kube-api-access-lrrzn\") pod \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " Apr 16 21:01:39.905293 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.905299 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-oauth-serving-cert\") pod \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " Apr 16 21:01:39.905534 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.905324 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-serving-cert\") pod \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " Apr 16 21:01:39.905534 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.905349 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-service-ca\") pod \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " Apr 16 21:01:39.905534 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.905396 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-oauth-config\") pod \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " Apr 16 21:01:39.905534 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.905421 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-config\") pod \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " Apr 16 21:01:39.905534 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.905457 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-trusted-ca-bundle\") pod \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\" (UID: \"71c3649b-3d37-47a9-97c2-d25ea5fa28f5\") " Apr 16 21:01:39.905788 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.905735 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "71c3649b-3d37-47a9-97c2-d25ea5fa28f5" (UID: "71c3649b-3d37-47a9-97c2-d25ea5fa28f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:01:39.905939 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.905849 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "71c3649b-3d37-47a9-97c2-d25ea5fa28f5" (UID: "71c3649b-3d37-47a9-97c2-d25ea5fa28f5"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:01:39.905939 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.905909 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-config" (OuterVolumeSpecName: "console-config") pod "71c3649b-3d37-47a9-97c2-d25ea5fa28f5" (UID: "71c3649b-3d37-47a9-97c2-d25ea5fa28f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:01:39.906149 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.906127 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "71c3649b-3d37-47a9-97c2-d25ea5fa28f5" (UID: "71c3649b-3d37-47a9-97c2-d25ea5fa28f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:01:39.907765 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.907742 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "71c3649b-3d37-47a9-97c2-d25ea5fa28f5" (UID: "71c3649b-3d37-47a9-97c2-d25ea5fa28f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 21:01:39.908116 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.908092 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "71c3649b-3d37-47a9-97c2-d25ea5fa28f5" (UID: "71c3649b-3d37-47a9-97c2-d25ea5fa28f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 21:01:39.908116 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:39.908098 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-kube-api-access-lrrzn" (OuterVolumeSpecName: "kube-api-access-lrrzn") pod "71c3649b-3d37-47a9-97c2-d25ea5fa28f5" (UID: "71c3649b-3d37-47a9-97c2-d25ea5fa28f5"). InnerVolumeSpecName "kube-api-access-lrrzn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:01:40.007153 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.007072 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lrrzn\" (UniqueName: \"kubernetes.io/projected/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-kube-api-access-lrrzn\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:01:40.007153 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.007099 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-oauth-serving-cert\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:01:40.007153 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.007109 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-serving-cert\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:01:40.007153 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.007120 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-service-ca\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:01:40.007153 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.007131 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-oauth-config\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:01:40.007153 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.007141 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-console-config\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:01:40.007153 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.007149 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71c3649b-3d37-47a9-97c2-d25ea5fa28f5-trusted-ca-bundle\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:01:40.214693 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.214654 2579 generic.go:358] "Generic (PLEG): container finished" podID="5a07ff8c-d53e-4fcd-93b5-f85c40522a10" containerID="b2156ea50db98952a596d597243705e580eb5cd095f9a8c73b0720466fd6fe22" exitCode=0 Apr 16 21:01:40.214882 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.214736 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" event={"ID":"5a07ff8c-d53e-4fcd-93b5-f85c40522a10","Type":"ContainerDied","Data":"b2156ea50db98952a596d597243705e580eb5cd095f9a8c73b0720466fd6fe22"} Apr 16 21:01:40.215140 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.215116 2579 scope.go:117] "RemoveContainer" containerID="b2156ea50db98952a596d597243705e580eb5cd095f9a8c73b0720466fd6fe22" Apr 16 21:01:40.215956 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.215942 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ff64d865d-nh545_71c3649b-3d37-47a9-97c2-d25ea5fa28f5/console/0.log" Apr 16 21:01:40.216049 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.215976 2579 generic.go:358] "Generic (PLEG): container finished" podID="71c3649b-3d37-47a9-97c2-d25ea5fa28f5" 
containerID="bc8114ed90ed20407484d8549b04518eb623672874beeb24819fe9ed37dfcf84" exitCode=2 Apr 16 21:01:40.216049 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.216033 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ff64d865d-nh545" event={"ID":"71c3649b-3d37-47a9-97c2-d25ea5fa28f5","Type":"ContainerDied","Data":"bc8114ed90ed20407484d8549b04518eb623672874beeb24819fe9ed37dfcf84"} Apr 16 21:01:40.216116 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.216050 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ff64d865d-nh545" Apr 16 21:01:40.216116 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.216063 2579 scope.go:117] "RemoveContainer" containerID="bc8114ed90ed20407484d8549b04518eb623672874beeb24819fe9ed37dfcf84" Apr 16 21:01:40.216255 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.216054 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ff64d865d-nh545" event={"ID":"71c3649b-3d37-47a9-97c2-d25ea5fa28f5","Type":"ContainerDied","Data":"d90bdbe425650964e663530724e7481809aa92e928f79a319cc08232ba8270a3"} Apr 16 21:01:40.224268 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.224218 2579 scope.go:117] "RemoveContainer" containerID="bc8114ed90ed20407484d8549b04518eb623672874beeb24819fe9ed37dfcf84" Apr 16 21:01:40.224532 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:01:40.224511 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8114ed90ed20407484d8549b04518eb623672874beeb24819fe9ed37dfcf84\": container with ID starting with bc8114ed90ed20407484d8549b04518eb623672874beeb24819fe9ed37dfcf84 not found: ID does not exist" containerID="bc8114ed90ed20407484d8549b04518eb623672874beeb24819fe9ed37dfcf84" Apr 16 21:01:40.224603 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.224541 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8114ed90ed20407484d8549b04518eb623672874beeb24819fe9ed37dfcf84"} err="failed to get container status \"bc8114ed90ed20407484d8549b04518eb623672874beeb24819fe9ed37dfcf84\": rpc error: code = NotFound desc = could not find container \"bc8114ed90ed20407484d8549b04518eb623672874beeb24819fe9ed37dfcf84\": container with ID starting with bc8114ed90ed20407484d8549b04518eb623672874beeb24819fe9ed37dfcf84 not found: ID does not exist" Apr 16 21:01:40.247226 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.247204 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ff64d865d-nh545"] Apr 16 21:01:40.252627 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:40.252606 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-ff64d865d-nh545"] Apr 16 21:01:41.220484 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:41.220450 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fvhtx" event={"ID":"5a07ff8c-d53e-4fcd-93b5-f85c40522a10","Type":"ContainerStarted","Data":"4b65538f7b72a55331c36808a70a83bdb5d8e31bdec85fb07bfb158c4b32de99"} Apr 16 21:01:41.566351 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:41.566265 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c3649b-3d37-47a9-97c2-d25ea5fa28f5" path="/var/lib/kubelet/pods/71c3649b-3d37-47a9-97c2-d25ea5fa28f5/volumes" Apr 16 21:01:48.029418 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:48.029389 2579 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:48.033488 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:48.033466 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-665555d85-gk67l" Apr 16 21:01:50.081438 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.081370 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b5cdf7dfd-wsszn" podUID="1d4ba783-d587-4d20-b3e3-33e1b55302c3" containerName="console" containerID="cri-o://680595bcb374814d748c9bb41ebc5186a97e9180c719aef02d131b042f149aeb" gracePeriod=15 Apr 16 21:01:50.247709 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.247682 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b5cdf7dfd-wsszn_1d4ba783-d587-4d20-b3e3-33e1b55302c3/console/0.log" Apr 16 21:01:50.247841 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.247719 2579 generic.go:358] "Generic (PLEG): container finished" podID="1d4ba783-d587-4d20-b3e3-33e1b55302c3" containerID="680595bcb374814d748c9bb41ebc5186a97e9180c719aef02d131b042f149aeb" exitCode=2 Apr 16 21:01:50.247841 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.247813 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b5cdf7dfd-wsszn" event={"ID":"1d4ba783-d587-4d20-b3e3-33e1b55302c3","Type":"ContainerDied","Data":"680595bcb374814d748c9bb41ebc5186a97e9180c719aef02d131b042f149aeb"} Apr 16 21:01:50.342255 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.342234 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b5cdf7dfd-wsszn_1d4ba783-d587-4d20-b3e3-33e1b55302c3/console/0.log" Apr 16 21:01:50.342377 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.342296 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:01:50.398104 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.398072 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-serving-cert\") pod \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " Apr 16 21:01:50.398254 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.398156 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-service-ca\") pod \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " Apr 16 21:01:50.398303 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.398264 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd2f8\" (UniqueName: \"kubernetes.io/projected/1d4ba783-d587-4d20-b3e3-33e1b55302c3-kube-api-access-xd2f8\") pod \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " Apr 16 21:01:50.398374 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.398309 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-oauth-serving-cert\") pod \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " Apr 16 21:01:50.398374 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.398338 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-config\") pod \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " Apr 16 21:01:50.398374 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.398370 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-oauth-config\") pod \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\" (UID: \"1d4ba783-d587-4d20-b3e3-33e1b55302c3\") " Apr 16 21:01:50.398524 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.398495 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-service-ca" (OuterVolumeSpecName: "service-ca") pod "1d4ba783-d587-4d20-b3e3-33e1b55302c3" (UID: "1d4ba783-d587-4d20-b3e3-33e1b55302c3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:01:50.398693 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.398671 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-config" (OuterVolumeSpecName: "console-config") pod "1d4ba783-d587-4d20-b3e3-33e1b55302c3" (UID: "1d4ba783-d587-4d20-b3e3-33e1b55302c3"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:01:50.398788 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.398692 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1d4ba783-d587-4d20-b3e3-33e1b55302c3" (UID: "1d4ba783-d587-4d20-b3e3-33e1b55302c3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:01:50.398788 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.398729 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-service-ca\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:01:50.400472 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.400449 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1d4ba783-d587-4d20-b3e3-33e1b55302c3" (UID: "1d4ba783-d587-4d20-b3e3-33e1b55302c3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 21:01:50.400574 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.400556 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1d4ba783-d587-4d20-b3e3-33e1b55302c3" (UID: "1d4ba783-d587-4d20-b3e3-33e1b55302c3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 21:01:50.400614 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.400580 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d4ba783-d587-4d20-b3e3-33e1b55302c3-kube-api-access-xd2f8" (OuterVolumeSpecName: "kube-api-access-xd2f8") pod "1d4ba783-d587-4d20-b3e3-33e1b55302c3" (UID: "1d4ba783-d587-4d20-b3e3-33e1b55302c3"). InnerVolumeSpecName "kube-api-access-xd2f8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:01:50.499389 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.499348 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xd2f8\" (UniqueName: \"kubernetes.io/projected/1d4ba783-d587-4d20-b3e3-33e1b55302c3-kube-api-access-xd2f8\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:01:50.499389 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.499384 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-oauth-serving-cert\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:01:50.499563 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.499398 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-config\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:01:50.499563 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.499412 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-oauth-config\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:01:50.499563 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:50.499426 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d4ba783-d587-4d20-b3e3-33e1b55302c3-console-serving-cert\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:01:51.251187 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:51.251159 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b5cdf7dfd-wsszn_1d4ba783-d587-4d20-b3e3-33e1b55302c3/console/0.log" Apr 16 21:01:51.251591 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:51.251218 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b5cdf7dfd-wsszn" event={"ID":"1d4ba783-d587-4d20-b3e3-33e1b55302c3","Type":"ContainerDied","Data":"9bd97b143bd552ec7e59d38dc92865bcbed48dfb067d57bfab12257dd22796e9"} Apr 16 21:01:51.251591 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:51.251247 2579 scope.go:117] "RemoveContainer" containerID="680595bcb374814d748c9bb41ebc5186a97e9180c719aef02d131b042f149aeb" Apr 16 21:01:51.251591 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:51.251287 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b5cdf7dfd-wsszn" Apr 16 21:01:51.282274 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:51.282248 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b5cdf7dfd-wsszn"] Apr 16 21:01:51.284664 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:51.284642 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b5cdf7dfd-wsszn"] Apr 16 21:01:51.566534 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:51.566456 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d4ba783-d587-4d20-b3e3-33e1b55302c3" path="/var/lib/kubelet/pods/1d4ba783-d587-4d20-b3e3-33e1b55302c3/volumes" Apr 16 21:01:52.256046 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:52.256015 2579 generic.go:358] "Generic (PLEG): container finished" podID="acccf14f-1908-45d6-a352-e0a6c7fc6a05" containerID="169fa441efd96669e086a2acf07194982a09e2f9f710eeb4bb109406a26423b4" exitCode=0 Apr 16 21:01:52.256378 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:52.256058 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" event={"ID":"acccf14f-1908-45d6-a352-e0a6c7fc6a05","Type":"ContainerDied","Data":"169fa441efd96669e086a2acf07194982a09e2f9f710eeb4bb109406a26423b4"} Apr 16 21:01:52.256378 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:52.256373 2579 scope.go:117] "RemoveContainer" containerID="169fa441efd96669e086a2acf07194982a09e2f9f710eeb4bb109406a26423b4" Apr 16 21:01:53.260745 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:53.260714 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-p9hsx" event={"ID":"acccf14f-1908-45d6-a352-e0a6c7fc6a05","Type":"ContainerStarted","Data":"14316f9c9b31e3e6d1f7deb99c2c41a0d727df05e8ad6d3cd6ccfe80b6c885bf"} Apr 16 21:01:57.272765 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:57.272735 2579 generic.go:358] "Generic (PLEG): container finished" podID="b425b7f9-0015-4de7-81d2-02cd12eb338a" containerID="f026dd099b6d39005a50670b2bfdcbe547635a39f259c8c184058cd40eb4a931" exitCode=0 Apr 16 21:01:57.273181 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:57.272809 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-ntt68" event={"ID":"b425b7f9-0015-4de7-81d2-02cd12eb338a","Type":"ContainerDied","Data":"f026dd099b6d39005a50670b2bfdcbe547635a39f259c8c184058cd40eb4a931"} Apr 16 21:01:57.273232 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:57.273219 2579 scope.go:117] "RemoveContainer" containerID="f026dd099b6d39005a50670b2bfdcbe547635a39f259c8c184058cd40eb4a931" Apr 16 21:01:58.277508 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:01:58.277478 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-ntt68" event={"ID":"b425b7f9-0015-4de7-81d2-02cd12eb338a","Type":"ContainerStarted","Data":"5280b553fde79a50f3d366746e46d550826caa7aadf24be76963759228a1030b"} Apr 16 21:02:20.759640 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.759611 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68447f845-77mw9"] Apr 16 21:02:20.760055 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.760020 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71c3649b-3d37-47a9-97c2-d25ea5fa28f5" containerName="console" 
Apr 16 21:02:20.760055 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.760038 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c3649b-3d37-47a9-97c2-d25ea5fa28f5" containerName="console" Apr 16 21:02:20.760126 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.760065 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d4ba783-d587-4d20-b3e3-33e1b55302c3" containerName="console" Apr 16 21:02:20.760126 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.760074 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d4ba783-d587-4d20-b3e3-33e1b55302c3" containerName="console" Apr 16 21:02:20.760185 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.760167 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d4ba783-d587-4d20-b3e3-33e1b55302c3" containerName="console" Apr 16 21:02:20.760185 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.760182 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="71c3649b-3d37-47a9-97c2-d25ea5fa28f5" containerName="console" Apr 16 21:02:20.763484 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.763465 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.766895 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.766874 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 21:02:20.767038 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.766900 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 21:02:20.767038 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.766948 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 21:02:20.768143 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.768123 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 21:02:20.768143 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.768127 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 21:02:20.768341 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.768281 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-vpj6c\"" Apr 16 21:02:20.772468 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.772447 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 21:02:20.778317 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.778297 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68447f845-77mw9"] Apr 16 21:02:20.847160 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.847131 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-serving-cert\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.847334 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.847189 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-oauth-config\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.847334 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.847240 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-oauth-serving-cert\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.847334 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.847305 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76ldg\" (UniqueName: \"kubernetes.io/projected/6c59c9ca-1833-4c75-8988-ce2a2827e44b-kube-api-access-76ldg\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.847444 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.847333 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-config\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.847444 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.847409 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-trusted-ca-bundle\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.847517 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.847442 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-service-ca\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.948784 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.948746 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76ldg\" (UniqueName: \"kubernetes.io/projected/6c59c9ca-1833-4c75-8988-ce2a2827e44b-kube-api-access-76ldg\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.948784 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.948786 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-config\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.949033 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.948810 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-trusted-ca-bundle\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " 
pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.949033 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.948831 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-service-ca\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.949033 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.948865 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-serving-cert\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.949033 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.948935 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-oauth-config\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.949033 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.948958 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-oauth-serving-cert\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.949643 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.949612 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-config\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.949704 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.949692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-oauth-serving-cert\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.949740 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.949728 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-trusted-ca-bundle\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.949865 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.949837 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-service-ca\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.951617 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.951588 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-oauth-config\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.951758 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.951739 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-serving-cert\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:20.963061 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:20.963039 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76ldg\" (UniqueName: \"kubernetes.io/projected/6c59c9ca-1833-4c75-8988-ce2a2827e44b-kube-api-access-76ldg\") pod \"console-68447f845-77mw9\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:21.073164 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:21.073073 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:21.202888 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:21.202854 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68447f845-77mw9"] Apr 16 21:02:21.206557 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:02:21.206531 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c59c9ca_1833_4c75_8988_ce2a2827e44b.slice/crio-45d971d245ba5ecc03a6a070b56a4621d8bdd15c72d7483df0e6bf12910500b4 WatchSource:0}: Error finding container 45d971d245ba5ecc03a6a070b56a4621d8bdd15c72d7483df0e6bf12910500b4: Status 404 returned error can't find the container with id 45d971d245ba5ecc03a6a070b56a4621d8bdd15c72d7483df0e6bf12910500b4 Apr 16 21:02:21.350416 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:21.350337 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68447f845-77mw9" event={"ID":"6c59c9ca-1833-4c75-8988-ce2a2827e44b","Type":"ContainerStarted","Data":"cd11d2c2b08a622ef23c2ed01c7e2d1533bd8cd13754a41b3c1dd965e1c89166"} Apr 16 21:02:21.350416 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:21.350372 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68447f845-77mw9" event={"ID":"6c59c9ca-1833-4c75-8988-ce2a2827e44b","Type":"ContainerStarted","Data":"45d971d245ba5ecc03a6a070b56a4621d8bdd15c72d7483df0e6bf12910500b4"} Apr 16 21:02:21.379610 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:21.379562 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68447f845-77mw9" podStartSLOduration=1.379548125 podStartE2EDuration="1.379548125s" podCreationTimestamp="2026-04-16 21:02:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:02:21.378927096 +0000 UTC m=+244.397237944" watchObservedRunningTime="2026-04-16 21:02:21.379548125 +0000 UTC m=+244.397858972" Apr 16 21:02:29.320696 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:29.320655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs\") pod \"network-metrics-daemon-rgzx9\" (UID: 
\"24ed89ef-c93c-40fd-a75f-2f3fd7582359\") " pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 21:02:29.323072 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:29.323051 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24ed89ef-c93c-40fd-a75f-2f3fd7582359-metrics-certs\") pod \"network-metrics-daemon-rgzx9\" (UID: \"24ed89ef-c93c-40fd-a75f-2f3fd7582359\") " pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 21:02:29.467491 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:29.467456 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lvzr6\"" Apr 16 21:02:29.475144 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:29.475124 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgzx9" Apr 16 21:02:29.596686 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:29.596663 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rgzx9"] Apr 16 21:02:29.598809 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:02:29.598780 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24ed89ef_c93c_40fd_a75f_2f3fd7582359.slice/crio-d040de5f435f6e069ff7659a808949180b4f57252817db070fd10713a4395cbe WatchSource:0}: Error finding container d040de5f435f6e069ff7659a808949180b4f57252817db070fd10713a4395cbe: Status 404 returned error can't find the container with id d040de5f435f6e069ff7659a808949180b4f57252817db070fd10713a4395cbe Apr 16 21:02:30.376512 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:30.376468 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rgzx9" event={"ID":"24ed89ef-c93c-40fd-a75f-2f3fd7582359","Type":"ContainerStarted","Data":"d040de5f435f6e069ff7659a808949180b4f57252817db070fd10713a4395cbe"} Apr 16 21:02:31.073312 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:31.073232 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:31.073312 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:31.073275 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:31.077974 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:31.077951 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:31.380866 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:31.380768 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rgzx9" event={"ID":"24ed89ef-c93c-40fd-a75f-2f3fd7582359","Type":"ContainerStarted","Data":"b0d4738b0da781293fac733b950aa05b1eabf8a89030f812bbc1f91b90070198"} Apr 16 21:02:31.380866 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:31.380818 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rgzx9" event={"ID":"24ed89ef-c93c-40fd-a75f-2f3fd7582359","Type":"ContainerStarted","Data":"d611adab2ae19eeab6364f89ea089309094acfb49ef3ad1bb849a18d4b614e44"} Apr 16 21:02:31.384870 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:02:31.384850 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68447f845-77mw9" Apr 16 21:02:31.400514 ip-10-0-139-17 
kubenswrapper[2579]: I0416 21:02:31.400468 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rgzx9" podStartSLOduration=253.23345463 podStartE2EDuration="4m14.40045371s" podCreationTimestamp="2026-04-16 20:58:17 +0000 UTC" firstStartedPulling="2026-04-16 21:02:29.600615487 +0000 UTC m=+252.618926311" lastFinishedPulling="2026-04-16 21:02:30.767614557 +0000 UTC m=+253.785925391" observedRunningTime="2026-04-16 21:02:31.399871462 +0000 UTC m=+254.418182310" watchObservedRunningTime="2026-04-16 21:02:31.40045371 +0000 UTC m=+254.418764558" Apr 16 21:03:11.951056 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:11.951021 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-ww9g9"] Apr 16 21:03:11.953223 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:11.953205 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ww9g9" Apr 16 21:03:11.956727 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:11.956705 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 21:03:11.969436 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:11.969416 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ww9g9"] Apr 16 21:03:12.057014 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:12.056964 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75834fc1-5f34-4f79-be84-f3349cdd5efd-dbus\") pod \"global-pull-secret-syncer-ww9g9\" (UID: \"75834fc1-5f34-4f79-be84-f3349cdd5efd\") " pod="kube-system/global-pull-secret-syncer-ww9g9" Apr 16 21:03:12.057145 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:12.057019 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75834fc1-5f34-4f79-be84-f3349cdd5efd-kubelet-config\") pod \"global-pull-secret-syncer-ww9g9\" (UID: \"75834fc1-5f34-4f79-be84-f3349cdd5efd\") " pod="kube-system/global-pull-secret-syncer-ww9g9" Apr 16 21:03:12.057145 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:12.057077 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75834fc1-5f34-4f79-be84-f3349cdd5efd-original-pull-secret\") pod \"global-pull-secret-syncer-ww9g9\" (UID: \"75834fc1-5f34-4f79-be84-f3349cdd5efd\") " pod="kube-system/global-pull-secret-syncer-ww9g9" Apr 16 21:03:12.157612 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:12.157590 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75834fc1-5f34-4f79-be84-f3349cdd5efd-original-pull-secret\") pod \"global-pull-secret-syncer-ww9g9\" (UID: \"75834fc1-5f34-4f79-be84-f3349cdd5efd\") " pod="kube-system/global-pull-secret-syncer-ww9g9" Apr 16 21:03:12.157745 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:12.157667 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75834fc1-5f34-4f79-be84-f3349cdd5efd-dbus\") pod \"global-pull-secret-syncer-ww9g9\" (UID: \"75834fc1-5f34-4f79-be84-f3349cdd5efd\") " pod="kube-system/global-pull-secret-syncer-ww9g9" Apr 16 21:03:12.157745 ip-10-0-139-17 
kubenswrapper[2579]: I0416 21:03:12.157695 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75834fc1-5f34-4f79-be84-f3349cdd5efd-kubelet-config\") pod \"global-pull-secret-syncer-ww9g9\" (UID: \"75834fc1-5f34-4f79-be84-f3349cdd5efd\") " pod="kube-system/global-pull-secret-syncer-ww9g9" Apr 16 21:03:12.157851 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:12.157787 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75834fc1-5f34-4f79-be84-f3349cdd5efd-kubelet-config\") pod \"global-pull-secret-syncer-ww9g9\" (UID: \"75834fc1-5f34-4f79-be84-f3349cdd5efd\") " pod="kube-system/global-pull-secret-syncer-ww9g9" Apr 16 21:03:12.157851 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:12.157831 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75834fc1-5f34-4f79-be84-f3349cdd5efd-dbus\") pod \"global-pull-secret-syncer-ww9g9\" (UID: \"75834fc1-5f34-4f79-be84-f3349cdd5efd\") " pod="kube-system/global-pull-secret-syncer-ww9g9" Apr 16 21:03:12.159976 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:12.159956 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75834fc1-5f34-4f79-be84-f3349cdd5efd-original-pull-secret\") pod \"global-pull-secret-syncer-ww9g9\" (UID: \"75834fc1-5f34-4f79-be84-f3349cdd5efd\") " pod="kube-system/global-pull-secret-syncer-ww9g9" Apr 16 21:03:12.262150 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:12.262118 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ww9g9" Apr 16 21:03:12.383388 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:12.383364 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ww9g9"] Apr 16 21:03:12.385817 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:03:12.385789 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75834fc1_5f34_4f79_be84_f3349cdd5efd.slice/crio-5b45f83fdf26c4cbadea1ce94d8dae65e38eb5d6f64a0e5a1d80180e71c893d7 WatchSource:0}: Error finding container 5b45f83fdf26c4cbadea1ce94d8dae65e38eb5d6f64a0e5a1d80180e71c893d7: Status 404 returned error can't find the container with id 5b45f83fdf26c4cbadea1ce94d8dae65e38eb5d6f64a0e5a1d80180e71c893d7 Apr 16 21:03:12.502464 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:12.502429 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ww9g9" event={"ID":"75834fc1-5f34-4f79-be84-f3349cdd5efd","Type":"ContainerStarted","Data":"5b45f83fdf26c4cbadea1ce94d8dae65e38eb5d6f64a0e5a1d80180e71c893d7"} Apr 16 21:03:17.464191 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:17.464165 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 21:03:17.464888 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:17.464864 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 21:03:17.467766 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:17.467745 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 21:03:17.518749 ip-10-0-139-17 
kubenswrapper[2579]: I0416 21:03:17.518624 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ww9g9" event={"ID":"75834fc1-5f34-4f79-be84-f3349cdd5efd","Type":"ContainerStarted","Data":"09dc3a48317fa80ad342ebc3ecb50f673e56b944cbdbaf5963787baafae8a1a1"} Apr 16 21:03:17.537089 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:17.537048 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-ww9g9" podStartSLOduration=1.886714382 podStartE2EDuration="6.537037113s" podCreationTimestamp="2026-04-16 21:03:11 +0000 UTC" firstStartedPulling="2026-04-16 21:03:12.387404523 +0000 UTC m=+295.405715361" lastFinishedPulling="2026-04-16 21:03:17.037727259 +0000 UTC m=+300.056038092" observedRunningTime="2026-04-16 21:03:17.535751015 +0000 UTC m=+300.554061863" watchObservedRunningTime="2026-04-16 21:03:17.537037113 +0000 UTC m=+300.555347971" Apr 16 21:03:23.706392 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:23.706357 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4"] Apr 16 21:03:23.708616 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:23.708600 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" Apr 16 21:03:23.711408 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:23.711380 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 21:03:23.711533 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:23.711465 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-qvf78\"" Apr 16 21:03:23.712805 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:23.712787 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 21:03:23.721604 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:23.721584 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4"] Apr 16 21:03:23.751958 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:23.751933 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsrbb\" (UniqueName: \"kubernetes.io/projected/27a09a7a-b398-4891-a309-5aadec642aba-kube-api-access-hsrbb\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4\" (UID: \"27a09a7a-b398-4891-a309-5aadec642aba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" Apr 16 21:03:23.752087 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:23.752010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27a09a7a-b398-4891-a309-5aadec642aba-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4\" (UID: \"27a09a7a-b398-4891-a309-5aadec642aba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" Apr 16 21:03:23.752087 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:23.752047 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/27a09a7a-b398-4891-a309-5aadec642aba-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4\" (UID: \"27a09a7a-b398-4891-a309-5aadec642aba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" Apr 16 21:03:23.852918 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:23.852885 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27a09a7a-b398-4891-a309-5aadec642aba-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4\" (UID: \"27a09a7a-b398-4891-a309-5aadec642aba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" Apr 16 21:03:23.853058 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:23.852927 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27a09a7a-b398-4891-a309-5aadec642aba-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4\" (UID: \"27a09a7a-b398-4891-a309-5aadec642aba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" Apr 16 21:03:23.853058 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:23.852975 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsrbb\" (UniqueName: \"kubernetes.io/projected/27a09a7a-b398-4891-a309-5aadec642aba-kube-api-access-hsrbb\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4\" (UID: \"27a09a7a-b398-4891-a309-5aadec642aba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" Apr 16 21:03:23.853272 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:23.853256 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27a09a7a-b398-4891-a309-5aadec642aba-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4\" (UID: \"27a09a7a-b398-4891-a309-5aadec642aba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" Apr 16 21:03:23.853348 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:23.853316 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27a09a7a-b398-4891-a309-5aadec642aba-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4\" (UID: \"27a09a7a-b398-4891-a309-5aadec642aba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" Apr 16 21:03:23.861447 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:23.861426 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsrbb\" (UniqueName: \"kubernetes.io/projected/27a09a7a-b398-4891-a309-5aadec642aba-kube-api-access-hsrbb\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4\" (UID: \"27a09a7a-b398-4891-a309-5aadec642aba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" Apr 16 21:03:24.018255 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:24.018221 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" Apr 16 21:03:24.137177 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:24.137156 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4"] Apr 16 21:03:24.139633 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:03:24.139605 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27a09a7a_b398_4891_a309_5aadec642aba.slice/crio-c049e5c8739e21f7a8160527a54317b0a1b9be020e6b3eefd96891da1dbb2850 WatchSource:0}: Error finding container c049e5c8739e21f7a8160527a54317b0a1b9be020e6b3eefd96891da1dbb2850: Status 404 returned error can't find the container with id c049e5c8739e21f7a8160527a54317b0a1b9be020e6b3eefd96891da1dbb2850 Apr 16 21:03:24.141370 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:24.141353 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 21:03:24.540417 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:24.540386 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" event={"ID":"27a09a7a-b398-4891-a309-5aadec642aba","Type":"ContainerStarted","Data":"c049e5c8739e21f7a8160527a54317b0a1b9be020e6b3eefd96891da1dbb2850"} Apr 16 21:03:29.558062 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:29.558026 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" event={"ID":"27a09a7a-b398-4891-a309-5aadec642aba","Type":"ContainerStarted","Data":"be5aa815993b2dcdb7f17bb40c6434c2a662f7fe849d3486d3dc4493fd53a502"} Apr 16 21:03:30.562981 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:30.562943 2579 generic.go:358] "Generic (PLEG): container finished" podID="27a09a7a-b398-4891-a309-5aadec642aba" containerID="be5aa815993b2dcdb7f17bb40c6434c2a662f7fe849d3486d3dc4493fd53a502" exitCode=0 Apr 16 21:03:30.563389 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:30.563018 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" event={"ID":"27a09a7a-b398-4891-a309-5aadec642aba","Type":"ContainerDied","Data":"be5aa815993b2dcdb7f17bb40c6434c2a662f7fe849d3486d3dc4493fd53a502"} Apr 16 21:03:33.574147 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:33.574113 2579 generic.go:358] "Generic (PLEG): container finished" podID="27a09a7a-b398-4891-a309-5aadec642aba" containerID="6e45aa14c45143c34cfc1adb5de964c391cecf185bb90cfbae2affb46d09c736" exitCode=0 Apr 16 21:03:33.574539 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:33.574201 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" event={"ID":"27a09a7a-b398-4891-a309-5aadec642aba","Type":"ContainerDied","Data":"6e45aa14c45143c34cfc1adb5de964c391cecf185bb90cfbae2affb46d09c736"} Apr 16 21:03:40.597798 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:40.597766 2579 generic.go:358] "Generic (PLEG): container finished" podID="27a09a7a-b398-4891-a309-5aadec642aba" containerID="02c25c2ee6bc3cacf9f142d7e6cc3a6c51ea0efe38f59c38046b6fa1d2151e76" exitCode=0 Apr 16 21:03:40.598175 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:40.597856 2579 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" event={"ID":"27a09a7a-b398-4891-a309-5aadec642aba","Type":"ContainerDied","Data":"02c25c2ee6bc3cacf9f142d7e6cc3a6c51ea0efe38f59c38046b6fa1d2151e76"} Apr 16 21:03:41.719533 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:41.719503 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" Apr 16 21:03:41.799143 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:41.799112 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27a09a7a-b398-4891-a309-5aadec642aba-bundle\") pod \"27a09a7a-b398-4891-a309-5aadec642aba\" (UID: \"27a09a7a-b398-4891-a309-5aadec642aba\") " Apr 16 21:03:41.799304 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:41.799173 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27a09a7a-b398-4891-a309-5aadec642aba-util\") pod \"27a09a7a-b398-4891-a309-5aadec642aba\" (UID: \"27a09a7a-b398-4891-a309-5aadec642aba\") " Apr 16 21:03:41.799304 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:41.799209 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsrbb\" (UniqueName: \"kubernetes.io/projected/27a09a7a-b398-4891-a309-5aadec642aba-kube-api-access-hsrbb\") pod \"27a09a7a-b398-4891-a309-5aadec642aba\" (UID: \"27a09a7a-b398-4891-a309-5aadec642aba\") " Apr 16 21:03:41.799736 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:41.799709 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27a09a7a-b398-4891-a309-5aadec642aba-bundle" (OuterVolumeSpecName: "bundle") pod "27a09a7a-b398-4891-a309-5aadec642aba" (UID: "27a09a7a-b398-4891-a309-5aadec642aba"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:03:41.801540 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:41.801505 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a09a7a-b398-4891-a309-5aadec642aba-kube-api-access-hsrbb" (OuterVolumeSpecName: "kube-api-access-hsrbb") pod "27a09a7a-b398-4891-a309-5aadec642aba" (UID: "27a09a7a-b398-4891-a309-5aadec642aba"). InnerVolumeSpecName "kube-api-access-hsrbb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:03:41.803188 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:41.803161 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27a09a7a-b398-4891-a309-5aadec642aba-util" (OuterVolumeSpecName: "util") pod "27a09a7a-b398-4891-a309-5aadec642aba" (UID: "27a09a7a-b398-4891-a309-5aadec642aba"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:03:41.900478 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:41.900417 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27a09a7a-b398-4891-a309-5aadec642aba-util\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:03:41.900478 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:41.900440 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hsrbb\" (UniqueName: \"kubernetes.io/projected/27a09a7a-b398-4891-a309-5aadec642aba-kube-api-access-hsrbb\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:03:41.900478 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:41.900450 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27a09a7a-b398-4891-a309-5aadec642aba-bundle\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:03:42.605256 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:42.605219 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" event={"ID":"27a09a7a-b398-4891-a309-5aadec642aba","Type":"ContainerDied","Data":"c049e5c8739e21f7a8160527a54317b0a1b9be020e6b3eefd96891da1dbb2850"} Apr 16 21:03:42.605256 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:42.605257 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c049e5c8739e21f7a8160527a54317b0a1b9be020e6b3eefd96891da1dbb2850" Apr 16 21:03:42.605527 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:42.605275 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5smmt4" Apr 16 21:03:46.161116 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.161085 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-6zk2f"] Apr 16 21:03:46.161483 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.161381 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27a09a7a-b398-4891-a309-5aadec642aba" containerName="extract" Apr 16 21:03:46.161483 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.161391 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a09a7a-b398-4891-a309-5aadec642aba" containerName="extract" Apr 16 21:03:46.161483 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.161403 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27a09a7a-b398-4891-a309-5aadec642aba" containerName="pull" Apr 16 21:03:46.161483 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.161408 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a09a7a-b398-4891-a309-5aadec642aba" containerName="pull" Apr 16 21:03:46.161483 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.161418 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27a09a7a-b398-4891-a309-5aadec642aba" containerName="util" Apr 16 21:03:46.161483 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.161423 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a09a7a-b398-4891-a309-5aadec642aba" containerName="util" Apr 16 21:03:46.161483 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.161475 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="27a09a7a-b398-4891-a309-5aadec642aba" containerName="extract" Apr 16 
21:03:46.193850 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.193819 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-6zk2f"] Apr 16 21:03:46.194022 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.193928 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-6zk2f" Apr 16 21:03:46.197084 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.197065 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 16 21:03:46.197192 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.197064 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-6krwj\"" Apr 16 21:03:46.197192 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.197064 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 16 21:03:46.237362 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.237325 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8ee52be0-3bff-4f44-8875-35e1d7e2fce0-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-6zk2f\" (UID: \"8ee52be0-3bff-4f44-8875-35e1d7e2fce0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-6zk2f" Apr 16 21:03:46.237362 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.237373 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf487\" (UniqueName: \"kubernetes.io/projected/8ee52be0-3bff-4f44-8875-35e1d7e2fce0-kube-api-access-rf487\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-6zk2f\" (UID: \"8ee52be0-3bff-4f44-8875-35e1d7e2fce0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-6zk2f" Apr 16 21:03:46.338643 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.338611 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8ee52be0-3bff-4f44-8875-35e1d7e2fce0-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-6zk2f\" (UID: \"8ee52be0-3bff-4f44-8875-35e1d7e2fce0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-6zk2f" Apr 16 21:03:46.338782 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.338655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rf487\" (UniqueName: \"kubernetes.io/projected/8ee52be0-3bff-4f44-8875-35e1d7e2fce0-kube-api-access-rf487\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-6zk2f\" (UID: \"8ee52be0-3bff-4f44-8875-35e1d7e2fce0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-6zk2f" Apr 16 21:03:46.339025 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.338978 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8ee52be0-3bff-4f44-8875-35e1d7e2fce0-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-6zk2f\" (UID: \"8ee52be0-3bff-4f44-8875-35e1d7e2fce0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-6zk2f" Apr 16 21:03:46.348110 ip-10-0-139-17 
kubenswrapper[2579]: I0416 21:03:46.348087 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf487\" (UniqueName: \"kubernetes.io/projected/8ee52be0-3bff-4f44-8875-35e1d7e2fce0-kube-api-access-rf487\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-6zk2f\" (UID: \"8ee52be0-3bff-4f44-8875-35e1d7e2fce0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-6zk2f" Apr 16 21:03:46.503020 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.502969 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-6zk2f" Apr 16 21:03:46.626258 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:46.626224 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-6zk2f"] Apr 16 21:03:46.630551 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:03:46.630507 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ee52be0_3bff_4f44_8875_35e1d7e2fce0.slice/crio-09efc803c0c67e82a5c3c4f848760517f908f4d7ccd1f1a9ea807699bb61ebb6 WatchSource:0}: Error finding container 09efc803c0c67e82a5c3c4f848760517f908f4d7ccd1f1a9ea807699bb61ebb6: Status 404 returned error can't find the container with id 09efc803c0c67e82a5c3c4f848760517f908f4d7ccd1f1a9ea807699bb61ebb6 Apr 16 21:03:47.620632 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:47.620600 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-6zk2f" event={"ID":"8ee52be0-3bff-4f44-8875-35e1d7e2fce0","Type":"ContainerStarted","Data":"09efc803c0c67e82a5c3c4f848760517f908f4d7ccd1f1a9ea807699bb61ebb6"} Apr 16 21:03:49.628912 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:49.628832 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-6zk2f" event={"ID":"8ee52be0-3bff-4f44-8875-35e1d7e2fce0","Type":"ContainerStarted","Data":"73b4000916e0cb2e0338ab1cf77f1df145a2fe1c9669bf168a43e415bce98eb2"} Apr 16 21:03:49.650550 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:49.650499 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-6zk2f" podStartSLOduration=1.001656913 podStartE2EDuration="3.650481627s" podCreationTimestamp="2026-04-16 21:03:46 +0000 UTC" firstStartedPulling="2026-04-16 21:03:46.632424263 +0000 UTC m=+329.650735089" lastFinishedPulling="2026-04-16 21:03:49.281248975 +0000 UTC m=+332.299559803" observedRunningTime="2026-04-16 21:03:49.648379871 +0000 UTC m=+332.666690716" watchObservedRunningTime="2026-04-16 21:03:49.650481627 +0000 UTC m=+332.668792475" Apr 16 21:03:50.605910 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.605878 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5"] Apr 16 21:03:50.609729 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.609712 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" Apr 16 21:03:50.612345 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.612317 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 21:03:50.613674 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.613651 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 21:03:50.613768 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.613725 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-qvf78\"" Apr 16 21:03:50.619189 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.619167 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5"] Apr 16 21:03:50.679122 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.679085 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7beabb93-d3a3-4fb6-81e4-e576506fac19-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5\" (UID: \"7beabb93-d3a3-4fb6-81e4-e576506fac19\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" Apr 16 21:03:50.679514 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.679161 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p59mw\" (UniqueName: \"kubernetes.io/projected/7beabb93-d3a3-4fb6-81e4-e576506fac19-kube-api-access-p59mw\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5\" (UID: \"7beabb93-d3a3-4fb6-81e4-e576506fac19\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" Apr 16 21:03:50.679514 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.679205 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7beabb93-d3a3-4fb6-81e4-e576506fac19-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5\" (UID: \"7beabb93-d3a3-4fb6-81e4-e576506fac19\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" Apr 16 21:03:50.780359 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.780311 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7beabb93-d3a3-4fb6-81e4-e576506fac19-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5\" (UID: \"7beabb93-d3a3-4fb6-81e4-e576506fac19\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" Apr 16 21:03:50.780527 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.780380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p59mw\" (UniqueName: \"kubernetes.io/projected/7beabb93-d3a3-4fb6-81e4-e576506fac19-kube-api-access-p59mw\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5\" (UID: \"7beabb93-d3a3-4fb6-81e4-e576506fac19\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" Apr 16 21:03:50.780527 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.780420 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7beabb93-d3a3-4fb6-81e4-e576506fac19-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5\" (UID: \"7beabb93-d3a3-4fb6-81e4-e576506fac19\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" Apr 16 21:03:50.780687 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.780665 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7beabb93-d3a3-4fb6-81e4-e576506fac19-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5\" (UID: \"7beabb93-d3a3-4fb6-81e4-e576506fac19\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" Apr 16 21:03:50.780795 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.780776 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7beabb93-d3a3-4fb6-81e4-e576506fac19-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5\" (UID: \"7beabb93-d3a3-4fb6-81e4-e576506fac19\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" Apr 16 21:03:50.789292 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.789268 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p59mw\" (UniqueName: \"kubernetes.io/projected/7beabb93-d3a3-4fb6-81e4-e576506fac19-kube-api-access-p59mw\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5\" (UID: \"7beabb93-d3a3-4fb6-81e4-e576506fac19\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" Apr 16 21:03:50.920361 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:50.920280 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" Apr 16 21:03:51.095398 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:51.095373 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5"] Apr 16 21:03:51.097957 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:03:51.097930 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7beabb93_d3a3_4fb6_81e4_e576506fac19.slice/crio-1778f2ce5196b74991e3a6b97a5cfdea956ec12e972b68e66537455cbab6096b WatchSource:0}: Error finding container 1778f2ce5196b74991e3a6b97a5cfdea956ec12e972b68e66537455cbab6096b: Status 404 returned error can't find the container with id 1778f2ce5196b74991e3a6b97a5cfdea956ec12e972b68e66537455cbab6096b Apr 16 21:03:51.636579 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:51.636543 2579 generic.go:358] "Generic (PLEG): container finished" podID="7beabb93-d3a3-4fb6-81e4-e576506fac19" containerID="dd5cef67be7d46d74791a97ae6691e2e3c6e037d716ef9023fbea5a1debd10f5" exitCode=0 Apr 16 21:03:51.636759 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:51.636638 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" event={"ID":"7beabb93-d3a3-4fb6-81e4-e576506fac19","Type":"ContainerDied","Data":"dd5cef67be7d46d74791a97ae6691e2e3c6e037d716ef9023fbea5a1debd10f5"} Apr 16 21:03:51.636759 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:51.636675 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" event={"ID":"7beabb93-d3a3-4fb6-81e4-e576506fac19","Type":"ContainerStarted","Data":"1778f2ce5196b74991e3a6b97a5cfdea956ec12e972b68e66537455cbab6096b"} Apr 16 21:03:52.628719 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:52.628679 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-2v8rr"] Apr 16 21:03:52.632303 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:52.632272 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-2v8rr" Apr 16 21:03:52.635185 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:52.635139 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 21:03:52.635318 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:52.635165 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 21:03:52.636283 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:52.636263 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-bd8g9\"" Apr 16 21:03:52.641797 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:52.641768 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-2v8rr"] Apr 16 21:03:52.697458 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:52.697423 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnxfb\" (UniqueName: \"kubernetes.io/projected/1893ec80-b46a-4960-b294-f331eef4529b-kube-api-access-mnxfb\") pod \"cert-manager-webhook-597b96b99b-2v8rr\" (UID: \"1893ec80-b46a-4960-b294-f331eef4529b\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2v8rr" Apr 16 21:03:52.697621 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:52.697482 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1893ec80-b46a-4960-b294-f331eef4529b-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-2v8rr\" (UID: \"1893ec80-b46a-4960-b294-f331eef4529b\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2v8rr" Apr 16 21:03:52.798839 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:52.798807 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1893ec80-b46a-4960-b294-f331eef4529b-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-2v8rr\" (UID: \"1893ec80-b46a-4960-b294-f331eef4529b\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2v8rr" Apr 16 21:03:52.798982 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:52.798905 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxfb\" (UniqueName: \"kubernetes.io/projected/1893ec80-b46a-4960-b294-f331eef4529b-kube-api-access-mnxfb\") pod \"cert-manager-webhook-597b96b99b-2v8rr\" (UID: \"1893ec80-b46a-4960-b294-f331eef4529b\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2v8rr" Apr 16 21:03:52.807777 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:52.807748 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1893ec80-b46a-4960-b294-f331eef4529b-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-2v8rr\" (UID: \"1893ec80-b46a-4960-b294-f331eef4529b\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2v8rr" Apr 16 21:03:52.808060 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:52.808039 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnxfb\" (UniqueName: \"kubernetes.io/projected/1893ec80-b46a-4960-b294-f331eef4529b-kube-api-access-mnxfb\") pod \"cert-manager-webhook-597b96b99b-2v8rr\" (UID: \"1893ec80-b46a-4960-b294-f331eef4529b\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2v8rr" Apr 16 
21:03:52.952634 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:52.952552 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-2v8rr" Apr 16 21:03:53.725910 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:53.725733 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-2v8rr"] Apr 16 21:03:53.728833 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:03:53.728800 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1893ec80_b46a_4960_b294_f331eef4529b.slice/crio-527237a577f624edbb8a88286dae62b9115de105aa33850c2337875ab2fb0303 WatchSource:0}: Error finding container 527237a577f624edbb8a88286dae62b9115de105aa33850c2337875ab2fb0303: Status 404 returned error can't find the container with id 527237a577f624edbb8a88286dae62b9115de105aa33850c2337875ab2fb0303 Apr 16 21:03:54.649110 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:54.649014 2579 generic.go:358] "Generic (PLEG): container finished" podID="7beabb93-d3a3-4fb6-81e4-e576506fac19" containerID="38233490bfb59b53a9733f2af6a5a69bbc39b2bfe735bbf725fc66a974a44bb5" exitCode=0 Apr 16 21:03:54.649275 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:54.649094 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" event={"ID":"7beabb93-d3a3-4fb6-81e4-e576506fac19","Type":"ContainerDied","Data":"38233490bfb59b53a9733f2af6a5a69bbc39b2bfe735bbf725fc66a974a44bb5"} Apr 16 21:03:54.650387 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:54.650359 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-2v8rr" event={"ID":"1893ec80-b46a-4960-b294-f331eef4529b","Type":"ContainerStarted","Data":"527237a577f624edbb8a88286dae62b9115de105aa33850c2337875ab2fb0303"} Apr 16 21:03:55.659128 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:55.659090 2579 generic.go:358] "Generic (PLEG): container finished" podID="7beabb93-d3a3-4fb6-81e4-e576506fac19" containerID="19e3213f1d859c0c316e6253e3e0e8b76503f50ea4b8b8b64a9dfeb6e67912f1" exitCode=0 Apr 16 21:03:55.659570 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:55.659150 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" event={"ID":"7beabb93-d3a3-4fb6-81e4-e576506fac19","Type":"ContainerDied","Data":"19e3213f1d859c0c316e6253e3e0e8b76503f50ea4b8b8b64a9dfeb6e67912f1"} Apr 16 21:03:56.346542 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.346507 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-wzwml"] Apr 16 21:03:56.350040 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.350013 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-wzwml" Apr 16 21:03:56.353351 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.353329 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-j4cfs\"" Apr 16 21:03:56.373149 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.373124 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-wzwml"] Apr 16 21:03:56.434078 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.434038 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c64f581-e242-4a1b-b479-b0b2c8f3b2bf-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-wzwml\" (UID: \"1c64f581-e242-4a1b-b479-b0b2c8f3b2bf\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wzwml" Apr 16 21:03:56.434244 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.434119 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7xwc\" (UniqueName: \"kubernetes.io/projected/1c64f581-e242-4a1b-b479-b0b2c8f3b2bf-kube-api-access-t7xwc\") pod \"cert-manager-cainjector-8966b78d4-wzwml\" (UID: \"1c64f581-e242-4a1b-b479-b0b2c8f3b2bf\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wzwml" Apr 16 21:03:56.535404 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.535369 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c64f581-e242-4a1b-b479-b0b2c8f3b2bf-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-wzwml\" (UID: \"1c64f581-e242-4a1b-b479-b0b2c8f3b2bf\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wzwml" Apr 16 21:03:56.535592 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.535444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7xwc\" (UniqueName: \"kubernetes.io/projected/1c64f581-e242-4a1b-b479-b0b2c8f3b2bf-kube-api-access-t7xwc\") pod \"cert-manager-cainjector-8966b78d4-wzwml\" (UID: \"1c64f581-e242-4a1b-b479-b0b2c8f3b2bf\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wzwml" Apr 16 21:03:56.546288 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.546253 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c64f581-e242-4a1b-b479-b0b2c8f3b2bf-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-wzwml\" (UID: \"1c64f581-e242-4a1b-b479-b0b2c8f3b2bf\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wzwml" Apr 16 21:03:56.546457 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.546392 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7xwc\" (UniqueName: \"kubernetes.io/projected/1c64f581-e242-4a1b-b479-b0b2c8f3b2bf-kube-api-access-t7xwc\") pod \"cert-manager-cainjector-8966b78d4-wzwml\" (UID: \"1c64f581-e242-4a1b-b479-b0b2c8f3b2bf\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wzwml" Apr 16 21:03:56.659761 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.659670 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-wzwml" Apr 16 21:03:56.799414 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.799392 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" Apr 16 21:03:56.817527 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.817503 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-wzwml"] Apr 16 21:03:56.820912 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:03:56.820873 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c64f581_e242_4a1b_b479_b0b2c8f3b2bf.slice/crio-92bdb7519022b49ae48e689bc0aa10c9cc895d9961bdc261d84e038392f10a12 WatchSource:0}: Error finding container 92bdb7519022b49ae48e689bc0aa10c9cc895d9961bdc261d84e038392f10a12: Status 404 returned error can't find the container with id 92bdb7519022b49ae48e689bc0aa10c9cc895d9961bdc261d84e038392f10a12 Apr 16 21:03:56.939844 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.939779 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7beabb93-d3a3-4fb6-81e4-e576506fac19-util\") pod \"7beabb93-d3a3-4fb6-81e4-e576506fac19\" (UID: \"7beabb93-d3a3-4fb6-81e4-e576506fac19\") " Apr 16 21:03:56.939844 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.939833 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p59mw\" (UniqueName: \"kubernetes.io/projected/7beabb93-d3a3-4fb6-81e4-e576506fac19-kube-api-access-p59mw\") pod \"7beabb93-d3a3-4fb6-81e4-e576506fac19\" (UID: \"7beabb93-d3a3-4fb6-81e4-e576506fac19\") " Apr 16 21:03:56.940030 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.939893 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7beabb93-d3a3-4fb6-81e4-e576506fac19-bundle\") pod \"7beabb93-d3a3-4fb6-81e4-e576506fac19\" (UID: \"7beabb93-d3a3-4fb6-81e4-e576506fac19\") " Apr 16 21:03:56.940323 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.940299 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7beabb93-d3a3-4fb6-81e4-e576506fac19-bundle" (OuterVolumeSpecName: "bundle") pod "7beabb93-d3a3-4fb6-81e4-e576506fac19" (UID: "7beabb93-d3a3-4fb6-81e4-e576506fac19"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:03:56.941980 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.941962 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7beabb93-d3a3-4fb6-81e4-e576506fac19-kube-api-access-p59mw" (OuterVolumeSpecName: "kube-api-access-p59mw") pod "7beabb93-d3a3-4fb6-81e4-e576506fac19" (UID: "7beabb93-d3a3-4fb6-81e4-e576506fac19"). InnerVolumeSpecName "kube-api-access-p59mw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:03:56.943679 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:56.943659 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7beabb93-d3a3-4fb6-81e4-e576506fac19-util" (OuterVolumeSpecName: "util") pod "7beabb93-d3a3-4fb6-81e4-e576506fac19" (UID: "7beabb93-d3a3-4fb6-81e4-e576506fac19"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:03:57.040731 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:57.040694 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p59mw\" (UniqueName: \"kubernetes.io/projected/7beabb93-d3a3-4fb6-81e4-e576506fac19-kube-api-access-p59mw\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:03:57.040731 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:57.040723 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7beabb93-d3a3-4fb6-81e4-e576506fac19-bundle\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:03:57.040731 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:57.040734 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7beabb93-d3a3-4fb6-81e4-e576506fac19-util\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:03:57.666800 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:57.666759 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" event={"ID":"7beabb93-d3a3-4fb6-81e4-e576506fac19","Type":"ContainerDied","Data":"1778f2ce5196b74991e3a6b97a5cfdea956ec12e972b68e66537455cbab6096b"} Apr 16 21:03:57.666800 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:57.666789 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5rnm5" Apr 16 21:03:57.666800 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:57.666800 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1778f2ce5196b74991e3a6b97a5cfdea956ec12e972b68e66537455cbab6096b" Apr 16 21:03:57.668242 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:57.668210 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-2v8rr" event={"ID":"1893ec80-b46a-4960-b294-f331eef4529b","Type":"ContainerStarted","Data":"bc9e10e2222983c1c226bdb30cbae265607f484d34457854d6c14ed81a8ef1af"} Apr 16 21:03:57.668374 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:57.668342 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-2v8rr" Apr 16 21:03:57.669681 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:57.669658 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-wzwml" event={"ID":"1c64f581-e242-4a1b-b479-b0b2c8f3b2bf","Type":"ContainerStarted","Data":"d1d5b90c780ff0faf20effcb32dac3b95d4c4dca80a19254f5c7423f7f39e018"} Apr 16 21:03:57.669681 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:57.669683 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-wzwml" event={"ID":"1c64f581-e242-4a1b-b479-b0b2c8f3b2bf","Type":"ContainerStarted","Data":"92bdb7519022b49ae48e689bc0aa10c9cc895d9961bdc261d84e038392f10a12"} Apr 16 21:03:57.687515 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:57.687471 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-2v8rr" podStartSLOduration=2.691916587 podStartE2EDuration="5.687458949s" podCreationTimestamp="2026-04-16 21:03:52 +0000 UTC" firstStartedPulling="2026-04-16 21:03:53.731169037 +0000 UTC m=+336.749479862" lastFinishedPulling="2026-04-16 21:03:56.726711374 +0000 
UTC m=+339.745022224" observedRunningTime="2026-04-16 21:03:57.686760984 +0000 UTC m=+340.705071832" watchObservedRunningTime="2026-04-16 21:03:57.687458949 +0000 UTC m=+340.705769795" Apr 16 21:03:57.707354 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:03:57.707313 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-wzwml" podStartSLOduration=1.707303192 podStartE2EDuration="1.707303192s" podCreationTimestamp="2026-04-16 21:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:03:57.706525564 +0000 UTC m=+340.724836412" watchObservedRunningTime="2026-04-16 21:03:57.707303192 +0000 UTC m=+340.725614039" Apr 16 21:04:03.675283 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:03.675209 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-2v8rr" Apr 16 21:04:11.101905 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.101872 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj"] Apr 16 21:04:11.102343 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.102205 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7beabb93-d3a3-4fb6-81e4-e576506fac19" containerName="util" Apr 16 21:04:11.102343 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.102216 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="7beabb93-d3a3-4fb6-81e4-e576506fac19" containerName="util" Apr 16 21:04:11.102343 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.102228 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7beabb93-d3a3-4fb6-81e4-e576506fac19" containerName="extract" Apr 16 21:04:11.102343 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.102234 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="7beabb93-d3a3-4fb6-81e4-e576506fac19" containerName="extract" Apr 16 21:04:11.102343 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.102255 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7beabb93-d3a3-4fb6-81e4-e576506fac19" containerName="pull" Apr 16 21:04:11.102343 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.102262 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="7beabb93-d3a3-4fb6-81e4-e576506fac19" containerName="pull" Apr 16 21:04:11.102343 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.102326 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="7beabb93-d3a3-4fb6-81e4-e576506fac19" containerName="extract" Apr 16 21:04:11.131957 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.131930 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj"] Apr 16 21:04:11.132120 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.132057 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" Apr 16 21:04:11.135488 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.135461 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 21:04:11.135619 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.135464 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-qvf78\"" Apr 16 21:04:11.136399 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.136382 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 21:04:11.256557 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.256528 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5389f687-3be4-4d0d-ad0c-34a2b091400f-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj\" (UID: \"5389f687-3be4-4d0d-ad0c-34a2b091400f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" Apr 16 21:04:11.256697 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.256577 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g54ml\" (UniqueName: \"kubernetes.io/projected/5389f687-3be4-4d0d-ad0c-34a2b091400f-kube-api-access-g54ml\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj\" (UID: \"5389f687-3be4-4d0d-ad0c-34a2b091400f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" Apr 16 21:04:11.256697 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.256599 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5389f687-3be4-4d0d-ad0c-34a2b091400f-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj\" (UID: \"5389f687-3be4-4d0d-ad0c-34a2b091400f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" Apr 16 21:04:11.357475 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.357390 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5389f687-3be4-4d0d-ad0c-34a2b091400f-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj\" (UID: \"5389f687-3be4-4d0d-ad0c-34a2b091400f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" Apr 16 21:04:11.357475 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.357444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g54ml\" (UniqueName: \"kubernetes.io/projected/5389f687-3be4-4d0d-ad0c-34a2b091400f-kube-api-access-g54ml\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj\" (UID: \"5389f687-3be4-4d0d-ad0c-34a2b091400f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" Apr 16 21:04:11.357475 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.357467 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5389f687-3be4-4d0d-ad0c-34a2b091400f-util\") pod 
\"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj\" (UID: \"5389f687-3be4-4d0d-ad0c-34a2b091400f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" Apr 16 21:04:11.357775 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.357753 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5389f687-3be4-4d0d-ad0c-34a2b091400f-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj\" (UID: \"5389f687-3be4-4d0d-ad0c-34a2b091400f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" Apr 16 21:04:11.357841 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.357782 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5389f687-3be4-4d0d-ad0c-34a2b091400f-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj\" (UID: \"5389f687-3be4-4d0d-ad0c-34a2b091400f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" Apr 16 21:04:11.366480 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.366448 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g54ml\" (UniqueName: \"kubernetes.io/projected/5389f687-3be4-4d0d-ad0c-34a2b091400f-kube-api-access-g54ml\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj\" (UID: \"5389f687-3be4-4d0d-ad0c-34a2b091400f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" Apr 16 21:04:11.442525 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.442501 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" Apr 16 21:04:11.566563 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:04:11.566521 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5389f687_3be4_4d0d_ad0c_34a2b091400f.slice/crio-ac0c9d6f3e08f1a3b9baa6466e9c21b805e669e4e7a201c62312caff39759415 WatchSource:0}: Error finding container ac0c9d6f3e08f1a3b9baa6466e9c21b805e669e4e7a201c62312caff39759415: Status 404 returned error can't find the container with id ac0c9d6f3e08f1a3b9baa6466e9c21b805e669e4e7a201c62312caff39759415 Apr 16 21:04:11.567532 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.567509 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj"] Apr 16 21:04:11.716545 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.716514 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" event={"ID":"5389f687-3be4-4d0d-ad0c-34a2b091400f","Type":"ContainerStarted","Data":"00d5bcbe992e13cb4b1566f83191362e7178d85346a9b2a14ff4a55b9c7bb238"} Apr 16 21:04:11.716656 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:11.716551 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" event={"ID":"5389f687-3be4-4d0d-ad0c-34a2b091400f","Type":"ContainerStarted","Data":"ac0c9d6f3e08f1a3b9baa6466e9c21b805e669e4e7a201c62312caff39759415"} Apr 16 21:04:12.721724 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:12.721694 2579 generic.go:358] "Generic (PLEG): container 
finished" podID="5389f687-3be4-4d0d-ad0c-34a2b091400f" containerID="00d5bcbe992e13cb4b1566f83191362e7178d85346a9b2a14ff4a55b9c7bb238" exitCode=0 Apr 16 21:04:12.722183 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:12.721789 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" event={"ID":"5389f687-3be4-4d0d-ad0c-34a2b091400f","Type":"ContainerDied","Data":"00d5bcbe992e13cb4b1566f83191362e7178d85346a9b2a14ff4a55b9c7bb238"} Apr 16 21:04:13.730894 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:13.730861 2579 generic.go:358] "Generic (PLEG): container finished" podID="5389f687-3be4-4d0d-ad0c-34a2b091400f" containerID="276b6412ff715baf7b368c903deb51e9d1d1df9f628fc8bdfa13e5b5072b8bbf" exitCode=0 Apr 16 21:04:13.731408 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:13.730917 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" event={"ID":"5389f687-3be4-4d0d-ad0c-34a2b091400f","Type":"ContainerDied","Data":"276b6412ff715baf7b368c903deb51e9d1d1df9f628fc8bdfa13e5b5072b8bbf"} Apr 16 21:04:14.736237 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:14.736201 2579 generic.go:358] "Generic (PLEG): container finished" podID="5389f687-3be4-4d0d-ad0c-34a2b091400f" containerID="f1ba2697daf92124b3dcea4ff2c7bedc700c5b616e29933ed016813fa8b66471" exitCode=0 Apr 16 21:04:14.736604 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:14.736285 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" event={"ID":"5389f687-3be4-4d0d-ad0c-34a2b091400f","Type":"ContainerDied","Data":"f1ba2697daf92124b3dcea4ff2c7bedc700c5b616e29933ed016813fa8b66471"} Apr 16 21:04:15.866765 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:15.866744 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" Apr 16 21:04:15.995953 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:15.995873 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g54ml\" (UniqueName: \"kubernetes.io/projected/5389f687-3be4-4d0d-ad0c-34a2b091400f-kube-api-access-g54ml\") pod \"5389f687-3be4-4d0d-ad0c-34a2b091400f\" (UID: \"5389f687-3be4-4d0d-ad0c-34a2b091400f\") " Apr 16 21:04:15.996143 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:15.995971 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5389f687-3be4-4d0d-ad0c-34a2b091400f-bundle\") pod \"5389f687-3be4-4d0d-ad0c-34a2b091400f\" (UID: \"5389f687-3be4-4d0d-ad0c-34a2b091400f\") " Apr 16 21:04:15.996143 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:15.996015 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5389f687-3be4-4d0d-ad0c-34a2b091400f-util\") pod \"5389f687-3be4-4d0d-ad0c-34a2b091400f\" (UID: \"5389f687-3be4-4d0d-ad0c-34a2b091400f\") " Apr 16 21:04:15.996641 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:15.996611 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5389f687-3be4-4d0d-ad0c-34a2b091400f-bundle" (OuterVolumeSpecName: "bundle") pod "5389f687-3be4-4d0d-ad0c-34a2b091400f" (UID: "5389f687-3be4-4d0d-ad0c-34a2b091400f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:04:15.998160 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:15.998136 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5389f687-3be4-4d0d-ad0c-34a2b091400f-kube-api-access-g54ml" (OuterVolumeSpecName: "kube-api-access-g54ml") pod "5389f687-3be4-4d0d-ad0c-34a2b091400f" (UID: "5389f687-3be4-4d0d-ad0c-34a2b091400f"). InnerVolumeSpecName "kube-api-access-g54ml". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:04:16.001273 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:16.001247 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5389f687-3be4-4d0d-ad0c-34a2b091400f-util" (OuterVolumeSpecName: "util") pod "5389f687-3be4-4d0d-ad0c-34a2b091400f" (UID: "5389f687-3be4-4d0d-ad0c-34a2b091400f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:04:16.096762 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:16.096735 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5389f687-3be4-4d0d-ad0c-34a2b091400f-bundle\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:04:16.096762 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:16.096758 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5389f687-3be4-4d0d-ad0c-34a2b091400f-util\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:04:16.096896 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:16.096770 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g54ml\" (UniqueName: \"kubernetes.io/projected/5389f687-3be4-4d0d-ad0c-34a2b091400f-kube-api-access-g54ml\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:04:16.744273 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:16.744238 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" event={"ID":"5389f687-3be4-4d0d-ad0c-34a2b091400f","Type":"ContainerDied","Data":"ac0c9d6f3e08f1a3b9baa6466e9c21b805e669e4e7a201c62312caff39759415"} Apr 16 21:04:16.744273 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:16.744271 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac0c9d6f3e08f1a3b9baa6466e9c21b805e669e4e7a201c62312caff39759415" Apr 16 21:04:16.744471 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:16.744302 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5xdjtj" Apr 16 21:04:20.505623 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.505585 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797"] Apr 16 21:04:20.505975 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.505878 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5389f687-3be4-4d0d-ad0c-34a2b091400f" containerName="util" Apr 16 21:04:20.505975 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.505891 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5389f687-3be4-4d0d-ad0c-34a2b091400f" containerName="util" Apr 16 21:04:20.505975 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.505907 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5389f687-3be4-4d0d-ad0c-34a2b091400f" containerName="pull" Apr 16 21:04:20.505975 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.505912 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5389f687-3be4-4d0d-ad0c-34a2b091400f" containerName="pull" Apr 16 21:04:20.505975 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.505925 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5389f687-3be4-4d0d-ad0c-34a2b091400f" containerName="extract" Apr 16 21:04:20.505975 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.505931 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5389f687-3be4-4d0d-ad0c-34a2b091400f" containerName="extract" Apr 16 21:04:20.506221 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.505979 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5389f687-3be4-4d0d-ad0c-34a2b091400f" containerName="extract" Apr 16 21:04:20.511844 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.511828 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" Apr 16 21:04:20.514685 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.514663 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 21:04:20.515942 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.515925 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 21:04:20.516044 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.515943 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-qvf78\"" Apr 16 21:04:20.522331 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.522302 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797"] Apr 16 21:04:20.632384 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.632341 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d60dc12-6e26-40b9-af32-ebada3457030-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797\" (UID: \"8d60dc12-6e26-40b9-af32-ebada3457030\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" Apr 16 21:04:20.632574 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.632401 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdd9r\" (UniqueName: \"kubernetes.io/projected/8d60dc12-6e26-40b9-af32-ebada3457030-kube-api-access-zdd9r\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797\" (UID: \"8d60dc12-6e26-40b9-af32-ebada3457030\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" Apr 16 21:04:20.632574 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.632489 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d60dc12-6e26-40b9-af32-ebada3457030-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797\" (UID: \"8d60dc12-6e26-40b9-af32-ebada3457030\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" Apr 16 21:04:20.733662 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.733611 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdd9r\" (UniqueName: \"kubernetes.io/projected/8d60dc12-6e26-40b9-af32-ebada3457030-kube-api-access-zdd9r\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797\" (UID: \"8d60dc12-6e26-40b9-af32-ebada3457030\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" Apr 16 21:04:20.733823 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.733686 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d60dc12-6e26-40b9-af32-ebada3457030-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797\" (UID: \"8d60dc12-6e26-40b9-af32-ebada3457030\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" Apr 16 21:04:20.733823 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.733756 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d60dc12-6e26-40b9-af32-ebada3457030-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797\" (UID: \"8d60dc12-6e26-40b9-af32-ebada3457030\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" Apr 16 21:04:20.734123 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.734099 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d60dc12-6e26-40b9-af32-ebada3457030-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797\" (UID: \"8d60dc12-6e26-40b9-af32-ebada3457030\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" Apr 16 21:04:20.734191 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.734119 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d60dc12-6e26-40b9-af32-ebada3457030-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797\" (UID: \"8d60dc12-6e26-40b9-af32-ebada3457030\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" Apr 16 21:04:20.742920 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.742898 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdd9r\" (UniqueName: \"kubernetes.io/projected/8d60dc12-6e26-40b9-af32-ebada3457030-kube-api-access-zdd9r\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797\" (UID: \"8d60dc12-6e26-40b9-af32-ebada3457030\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" Apr 16 21:04:20.821975 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.821904 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" Apr 16 21:04:20.958024 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:20.957923 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797"] Apr 16 21:04:20.960138 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:04:20.960111 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d60dc12_6e26_40b9_af32_ebada3457030.slice/crio-f4240802a5ee7911d1cdced0e77075a8ebecb4eb6a2df818fea76bb5717b7738 WatchSource:0}: Error finding container f4240802a5ee7911d1cdced0e77075a8ebecb4eb6a2df818fea76bb5717b7738: Status 404 returned error can't find the container with id f4240802a5ee7911d1cdced0e77075a8ebecb4eb6a2df818fea76bb5717b7738 Apr 16 21:04:21.762488 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:21.762448 2579 generic.go:358] "Generic (PLEG): container finished" podID="8d60dc12-6e26-40b9-af32-ebada3457030" containerID="0b0573e2f3cb987d9ccbd6806750acc22daab362e1867e85d5bbc21276ad85ce" exitCode=0 Apr 16 21:04:21.762877 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:21.762536 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" event={"ID":"8d60dc12-6e26-40b9-af32-ebada3457030","Type":"ContainerDied","Data":"0b0573e2f3cb987d9ccbd6806750acc22daab362e1867e85d5bbc21276ad85ce"} Apr 16 21:04:21.762877 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:21.762574 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" event={"ID":"8d60dc12-6e26-40b9-af32-ebada3457030","Type":"ContainerStarted","Data":"f4240802a5ee7911d1cdced0e77075a8ebecb4eb6a2df818fea76bb5717b7738"} Apr 16 21:04:22.700179 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.700141 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5"] Apr 16 21:04:22.703460 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.703436 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" Apr 16 21:04:22.706524 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.706498 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 21:04:22.706652 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.706569 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 21:04:22.706652 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.706623 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jjnvs\"" Apr 16 21:04:22.706769 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.706671 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 21:04:22.706919 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.706905 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 21:04:22.723786 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.723759 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5"] Apr 16 21:04:22.767569 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.767480 2579 generic.go:358] "Generic (PLEG): container finished" podID="8d60dc12-6e26-40b9-af32-ebada3457030" containerID="1c7cb8a1e964b090d3c7419f76cc725f06d4e682163b03a7ef375a04c73bc8a8" exitCode=0 Apr 16 21:04:22.767569 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.767528 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" event={"ID":"8d60dc12-6e26-40b9-af32-ebada3457030","Type":"ContainerDied","Data":"1c7cb8a1e964b090d3c7419f76cc725f06d4e682163b03a7ef375a04c73bc8a8"} Apr 16 21:04:22.851766 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.851739 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d088b51-5307-4c1c-9ca5-b8e3a0e90470-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-bt9s5\" (UID: \"9d088b51-5307-4c1c-9ca5-b8e3a0e90470\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" Apr 16 21:04:22.851914 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.851802 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d088b51-5307-4c1c-9ca5-b8e3a0e90470-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-bt9s5\" (UID: \"9d088b51-5307-4c1c-9ca5-b8e3a0e90470\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" Apr 16 21:04:22.851914 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.851885 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhwf9\" (UniqueName: \"kubernetes.io/projected/9d088b51-5307-4c1c-9ca5-b8e3a0e90470-kube-api-access-qhwf9\") pod \"opendatahub-operator-controller-manager-5f94c666bb-bt9s5\" (UID: \"9d088b51-5307-4c1c-9ca5-b8e3a0e90470\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" Apr 16 21:04:22.953192 
ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.953166 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d088b51-5307-4c1c-9ca5-b8e3a0e90470-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-bt9s5\" (UID: \"9d088b51-5307-4c1c-9ca5-b8e3a0e90470\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" Apr 16 21:04:22.953359 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.953337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d088b51-5307-4c1c-9ca5-b8e3a0e90470-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-bt9s5\" (UID: \"9d088b51-5307-4c1c-9ca5-b8e3a0e90470\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" Apr 16 21:04:22.953428 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.953404 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhwf9\" (UniqueName: \"kubernetes.io/projected/9d088b51-5307-4c1c-9ca5-b8e3a0e90470-kube-api-access-qhwf9\") pod \"opendatahub-operator-controller-manager-5f94c666bb-bt9s5\" (UID: \"9d088b51-5307-4c1c-9ca5-b8e3a0e90470\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" Apr 16 21:04:22.955658 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.955639 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d088b51-5307-4c1c-9ca5-b8e3a0e90470-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-bt9s5\" (UID: \"9d088b51-5307-4c1c-9ca5-b8e3a0e90470\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" Apr 16 21:04:22.955754 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.955686 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d088b51-5307-4c1c-9ca5-b8e3a0e90470-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-bt9s5\" (UID: \"9d088b51-5307-4c1c-9ca5-b8e3a0e90470\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" Apr 16 21:04:22.962388 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:22.962368 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhwf9\" (UniqueName: \"kubernetes.io/projected/9d088b51-5307-4c1c-9ca5-b8e3a0e90470-kube-api-access-qhwf9\") pod \"opendatahub-operator-controller-manager-5f94c666bb-bt9s5\" (UID: \"9d088b51-5307-4c1c-9ca5-b8e3a0e90470\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" Apr 16 21:04:23.014268 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:23.014227 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" Apr 16 21:04:23.155033 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:23.155010 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5"] Apr 16 21:04:23.157574 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:04:23.157546 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d088b51_5307_4c1c_9ca5_b8e3a0e90470.slice/crio-7afbae3a84b2dbd9484b3853cb127581e8d369f505cb06f42d594ffe7d000fac WatchSource:0}: Error finding container 7afbae3a84b2dbd9484b3853cb127581e8d369f505cb06f42d594ffe7d000fac: Status 404 returned error can't find the container with id 7afbae3a84b2dbd9484b3853cb127581e8d369f505cb06f42d594ffe7d000fac Apr 16 21:04:23.777756 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:23.777694 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" event={"ID":"9d088b51-5307-4c1c-9ca5-b8e3a0e90470","Type":"ContainerStarted","Data":"7afbae3a84b2dbd9484b3853cb127581e8d369f505cb06f42d594ffe7d000fac"} Apr 16 21:04:23.781247 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:23.781215 2579 generic.go:358] "Generic (PLEG): container finished" podID="8d60dc12-6e26-40b9-af32-ebada3457030" containerID="0f42217f9e01f8fa981c4ea34fc9c5bacce7f7e6b796d85867925031c8747a15" exitCode=0 Apr 16 21:04:23.781391 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:23.781279 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" event={"ID":"8d60dc12-6e26-40b9-af32-ebada3457030","Type":"ContainerDied","Data":"0f42217f9e01f8fa981c4ea34fc9c5bacce7f7e6b796d85867925031c8747a15"} Apr 16 21:04:25.720937 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:25.720915 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" Apr 16 21:04:25.773896 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:25.773871 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d60dc12-6e26-40b9-af32-ebada3457030-util\") pod \"8d60dc12-6e26-40b9-af32-ebada3457030\" (UID: \"8d60dc12-6e26-40b9-af32-ebada3457030\") " Apr 16 21:04:25.774018 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:25.773912 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d60dc12-6e26-40b9-af32-ebada3457030-bundle\") pod \"8d60dc12-6e26-40b9-af32-ebada3457030\" (UID: \"8d60dc12-6e26-40b9-af32-ebada3457030\") " Apr 16 21:04:25.774018 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:25.773959 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdd9r\" (UniqueName: \"kubernetes.io/projected/8d60dc12-6e26-40b9-af32-ebada3457030-kube-api-access-zdd9r\") pod \"8d60dc12-6e26-40b9-af32-ebada3457030\" (UID: \"8d60dc12-6e26-40b9-af32-ebada3457030\") " Apr 16 21:04:25.774745 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:25.774700 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d60dc12-6e26-40b9-af32-ebada3457030-bundle" (OuterVolumeSpecName: "bundle") pod "8d60dc12-6e26-40b9-af32-ebada3457030" (UID: "8d60dc12-6e26-40b9-af32-ebada3457030"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:04:25.776145 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:25.776125 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d60dc12-6e26-40b9-af32-ebada3457030-kube-api-access-zdd9r" (OuterVolumeSpecName: "kube-api-access-zdd9r") pod "8d60dc12-6e26-40b9-af32-ebada3457030" (UID: "8d60dc12-6e26-40b9-af32-ebada3457030"). InnerVolumeSpecName "kube-api-access-zdd9r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:04:25.779407 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:25.779376 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d60dc12-6e26-40b9-af32-ebada3457030-util" (OuterVolumeSpecName: "util") pod "8d60dc12-6e26-40b9-af32-ebada3457030" (UID: "8d60dc12-6e26-40b9-af32-ebada3457030"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:04:25.790431 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:25.790414 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" Apr 16 21:04:25.790431 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:25.790423 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cl797" event={"ID":"8d60dc12-6e26-40b9-af32-ebada3457030","Type":"ContainerDied","Data":"f4240802a5ee7911d1cdced0e77075a8ebecb4eb6a2df818fea76bb5717b7738"} Apr 16 21:04:25.790551 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:25.790448 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4240802a5ee7911d1cdced0e77075a8ebecb4eb6a2df818fea76bb5717b7738" Apr 16 21:04:25.874713 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:25.874685 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d60dc12-6e26-40b9-af32-ebada3457030-util\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:04:25.874713 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:25.874710 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d60dc12-6e26-40b9-af32-ebada3457030-bundle\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:04:25.874713 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:25.874719 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zdd9r\" (UniqueName: \"kubernetes.io/projected/8d60dc12-6e26-40b9-af32-ebada3457030-kube-api-access-zdd9r\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:04:26.794847 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:26.794813 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" event={"ID":"9d088b51-5307-4c1c-9ca5-b8e3a0e90470","Type":"ContainerStarted","Data":"7d0a235522c9cf57b7f8e6a78580924f175d10999c942388c4e4962650f38408"} Apr 16 21:04:26.795221 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:26.794878 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" Apr 16 21:04:37.800505 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:37.800477 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" Apr 16 21:04:37.835301 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:37.835252 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-bt9s5" podStartSLOduration=13.242122616 podStartE2EDuration="15.835236998s" podCreationTimestamp="2026-04-16 21:04:22 +0000 UTC" firstStartedPulling="2026-04-16 21:04:23.159461143 +0000 UTC m=+366.177771967" lastFinishedPulling="2026-04-16 21:04:25.752575523 +0000 UTC m=+368.770886349" observedRunningTime="2026-04-16 21:04:26.827090261 +0000 UTC m=+369.845401110" watchObservedRunningTime="2026-04-16 21:04:37.835236998 +0000 UTC m=+380.853547845" Apr 16 21:04:38.382100 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.382069 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps"] Apr 16 21:04:38.382378 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.382366 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d60dc12-6e26-40b9-af32-ebada3457030" 
containerName="pull" Apr 16 21:04:38.382422 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.382380 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d60dc12-6e26-40b9-af32-ebada3457030" containerName="pull" Apr 16 21:04:38.382422 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.382393 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d60dc12-6e26-40b9-af32-ebada3457030" containerName="util" Apr 16 21:04:38.382422 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.382398 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d60dc12-6e26-40b9-af32-ebada3457030" containerName="util" Apr 16 21:04:38.382422 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.382407 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d60dc12-6e26-40b9-af32-ebada3457030" containerName="extract" Apr 16 21:04:38.382422 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.382415 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d60dc12-6e26-40b9-af32-ebada3457030" containerName="extract" Apr 16 21:04:38.382560 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.382480 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d60dc12-6e26-40b9-af32-ebada3457030" containerName="extract" Apr 16 21:04:38.385431 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.385408 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:38.391577 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.391545 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 21:04:38.392960 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.392936 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 21:04:38.392960 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.392949 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 21:04:38.393128 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.392942 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-hdldg\"" Apr 16 21:04:38.393207 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.392947 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 21:04:38.393289 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.393212 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 21:04:38.396253 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.396230 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps"] Apr 16 21:04:38.471849 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.471812 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73699555-7225-43bd-bf69-321bd6ebb6fd-cert\") pod \"lws-controller-manager-c5b769f8c-fm6ps\" (UID: \"73699555-7225-43bd-bf69-321bd6ebb6fd\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:38.472037 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.471865 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/73699555-7225-43bd-bf69-321bd6ebb6fd-metrics-cert\") pod \"lws-controller-manager-c5b769f8c-fm6ps\" (UID: \"73699555-7225-43bd-bf69-321bd6ebb6fd\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:38.472037 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.471925 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h28jj\" (UniqueName: \"kubernetes.io/projected/73699555-7225-43bd-bf69-321bd6ebb6fd-kube-api-access-h28jj\") pod \"lws-controller-manager-c5b769f8c-fm6ps\" (UID: \"73699555-7225-43bd-bf69-321bd6ebb6fd\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:38.472123 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.472031 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/73699555-7225-43bd-bf69-321bd6ebb6fd-manager-config\") pod \"lws-controller-manager-c5b769f8c-fm6ps\" (UID: \"73699555-7225-43bd-bf69-321bd6ebb6fd\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:38.572856 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.572825 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/73699555-7225-43bd-bf69-321bd6ebb6fd-manager-config\") pod \"lws-controller-manager-c5b769f8c-fm6ps\" (UID: \"73699555-7225-43bd-bf69-321bd6ebb6fd\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:38.573036 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.572864 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73699555-7225-43bd-bf69-321bd6ebb6fd-cert\") pod \"lws-controller-manager-c5b769f8c-fm6ps\" (UID: \"73699555-7225-43bd-bf69-321bd6ebb6fd\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:38.573036 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.572904 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/73699555-7225-43bd-bf69-321bd6ebb6fd-metrics-cert\") pod \"lws-controller-manager-c5b769f8c-fm6ps\" (UID: \"73699555-7225-43bd-bf69-321bd6ebb6fd\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:38.573036 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.572945 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h28jj\" (UniqueName: \"kubernetes.io/projected/73699555-7225-43bd-bf69-321bd6ebb6fd-kube-api-access-h28jj\") pod \"lws-controller-manager-c5b769f8c-fm6ps\" (UID: \"73699555-7225-43bd-bf69-321bd6ebb6fd\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:38.573556 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.573535 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/73699555-7225-43bd-bf69-321bd6ebb6fd-manager-config\") pod \"lws-controller-manager-c5b769f8c-fm6ps\" (UID: \"73699555-7225-43bd-bf69-321bd6ebb6fd\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:38.575415 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.575387 
2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73699555-7225-43bd-bf69-321bd6ebb6fd-cert\") pod \"lws-controller-manager-c5b769f8c-fm6ps\" (UID: \"73699555-7225-43bd-bf69-321bd6ebb6fd\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:38.575564 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.575543 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/73699555-7225-43bd-bf69-321bd6ebb6fd-metrics-cert\") pod \"lws-controller-manager-c5b769f8c-fm6ps\" (UID: \"73699555-7225-43bd-bf69-321bd6ebb6fd\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:38.588813 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.588794 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h28jj\" (UniqueName: \"kubernetes.io/projected/73699555-7225-43bd-bf69-321bd6ebb6fd-kube-api-access-h28jj\") pod \"lws-controller-manager-c5b769f8c-fm6ps\" (UID: \"73699555-7225-43bd-bf69-321bd6ebb6fd\") " pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:38.701104 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.701041 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:38.837631 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:38.837608 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps"] Apr 16 21:04:38.839222 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:04:38.839194 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73699555_7225_43bd_bf69_321bd6ebb6fd.slice/crio-3aa4ce0edc9e2151e6b9e1f092184fc1e129d82b0f01ad3dfe9eb82445b37065 WatchSource:0}: Error finding container 3aa4ce0edc9e2151e6b9e1f092184fc1e129d82b0f01ad3dfe9eb82445b37065: Status 404 returned error can't find the container with id 3aa4ce0edc9e2151e6b9e1f092184fc1e129d82b0f01ad3dfe9eb82445b37065 Apr 16 21:04:39.840242 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:39.840200 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" event={"ID":"73699555-7225-43bd-bf69-321bd6ebb6fd","Type":"ContainerStarted","Data":"3aa4ce0edc9e2151e6b9e1f092184fc1e129d82b0f01ad3dfe9eb82445b37065"} Apr 16 21:04:40.845604 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:40.845565 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" event={"ID":"73699555-7225-43bd-bf69-321bd6ebb6fd","Type":"ContainerStarted","Data":"8f16fdb5a894623042108bfb9cf93787e5d87c521ba6478bc361a37952cbee91"} Apr 16 21:04:40.846091 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:40.845626 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:40.892964 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:40.892933 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh"] Apr 16 21:04:40.896442 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:40.896418 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" Apr 16 21:04:40.899954 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:40.899934 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 21:04:40.900454 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:40.900438 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-qvf78\"" Apr 16 21:04:40.901285 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:40.901266 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 21:04:40.906478 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:40.906438 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" podStartSLOduration=1.376937765 podStartE2EDuration="2.9064269s" podCreationTimestamp="2026-04-16 21:04:38 +0000 UTC" firstStartedPulling="2026-04-16 21:04:38.840929388 +0000 UTC m=+381.859240214" lastFinishedPulling="2026-04-16 21:04:40.370418521 +0000 UTC m=+383.388729349" observedRunningTime="2026-04-16 21:04:40.895814907 +0000 UTC m=+383.914125755" watchObservedRunningTime="2026-04-16 21:04:40.9064269 +0000 UTC m=+383.924737748" Apr 16 21:04:40.937276 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:40.937247 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh"] Apr 16 21:04:40.993145 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:40.993114 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6f8cf75-a18c-45cb-a670-23651767b1bc-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh\" (UID: \"f6f8cf75-a18c-45cb-a670-23651767b1bc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" Apr 16 21:04:40.993315 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:40.993157 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2m67\" (UniqueName: \"kubernetes.io/projected/f6f8cf75-a18c-45cb-a670-23651767b1bc-kube-api-access-g2m67\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh\" (UID: \"f6f8cf75-a18c-45cb-a670-23651767b1bc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" Apr 16 21:04:40.993315 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:40.993186 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6f8cf75-a18c-45cb-a670-23651767b1bc-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh\" (UID: \"f6f8cf75-a18c-45cb-a670-23651767b1bc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" Apr 16 21:04:41.094339 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:41.094312 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6f8cf75-a18c-45cb-a670-23651767b1bc-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh\" (UID: \"f6f8cf75-a18c-45cb-a670-23651767b1bc\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" Apr 16 21:04:41.094508 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:41.094354 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2m67\" (UniqueName: \"kubernetes.io/projected/f6f8cf75-a18c-45cb-a670-23651767b1bc-kube-api-access-g2m67\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh\" (UID: \"f6f8cf75-a18c-45cb-a670-23651767b1bc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" Apr 16 21:04:41.094508 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:41.094388 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6f8cf75-a18c-45cb-a670-23651767b1bc-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh\" (UID: \"f6f8cf75-a18c-45cb-a670-23651767b1bc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" Apr 16 21:04:41.094683 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:41.094667 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6f8cf75-a18c-45cb-a670-23651767b1bc-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh\" (UID: \"f6f8cf75-a18c-45cb-a670-23651767b1bc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" Apr 16 21:04:41.094756 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:41.094738 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6f8cf75-a18c-45cb-a670-23651767b1bc-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh\" (UID: \"f6f8cf75-a18c-45cb-a670-23651767b1bc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" Apr 16 21:04:41.120611 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:41.120546 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2m67\" (UniqueName: \"kubernetes.io/projected/f6f8cf75-a18c-45cb-a670-23651767b1bc-kube-api-access-g2m67\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh\" (UID: \"f6f8cf75-a18c-45cb-a670-23651767b1bc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" Apr 16 21:04:41.206090 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:41.206053 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" Apr 16 21:04:41.337274 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:41.337250 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh"] Apr 16 21:04:41.339453 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:04:41.339424 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f8cf75_a18c_45cb_a670_23651767b1bc.slice/crio-ebd6216f738654da698e1f3cfdab94f269df33341112aaade2188f236c13004b WatchSource:0}: Error finding container ebd6216f738654da698e1f3cfdab94f269df33341112aaade2188f236c13004b: Status 404 returned error can't find the container with id ebd6216f738654da698e1f3cfdab94f269df33341112aaade2188f236c13004b Apr 16 21:04:41.850826 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:41.850788 2579 generic.go:358] "Generic (PLEG): container finished" podID="f6f8cf75-a18c-45cb-a670-23651767b1bc" containerID="35fcbf40d482fc3839d9184eca99e2b5abeaad649370b37ee44b79d5cb93e2c0" exitCode=0 Apr 16 21:04:41.851323 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:41.850877 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" event={"ID":"f6f8cf75-a18c-45cb-a670-23651767b1bc","Type":"ContainerDied","Data":"35fcbf40d482fc3839d9184eca99e2b5abeaad649370b37ee44b79d5cb93e2c0"} Apr 16 21:04:41.851323 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:41.850914 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" event={"ID":"f6f8cf75-a18c-45cb-a670-23651767b1bc","Type":"ContainerStarted","Data":"ebd6216f738654da698e1f3cfdab94f269df33341112aaade2188f236c13004b"} Apr 16 21:04:42.856763 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:42.856674 2579 generic.go:358] "Generic (PLEG): container finished" podID="f6f8cf75-a18c-45cb-a670-23651767b1bc" containerID="6bff676760bb8c31d1ff6332ed55b197932af124c124edbae78f4c7883beb89d" exitCode=0 Apr 16 21:04:42.856763 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:42.856730 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" event={"ID":"f6f8cf75-a18c-45cb-a670-23651767b1bc","Type":"ContainerDied","Data":"6bff676760bb8c31d1ff6332ed55b197932af124c124edbae78f4c7883beb89d"} Apr 16 21:04:43.861886 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:43.861847 2579 generic.go:358] "Generic (PLEG): container finished" podID="f6f8cf75-a18c-45cb-a670-23651767b1bc" containerID="b1d1c6ad8e118f9140251a06c86f88c5e887ad9fbae1896a2c6e0529e2768469" exitCode=0 Apr 16 21:04:43.862281 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:43.861934 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" event={"ID":"f6f8cf75-a18c-45cb-a670-23651767b1bc","Type":"ContainerDied","Data":"b1d1c6ad8e118f9140251a06c86f88c5e887ad9fbae1896a2c6e0529e2768469"} Apr 16 21:04:44.990725 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:44.990704 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" Apr 16 21:04:45.027960 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:45.027938 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2m67\" (UniqueName: \"kubernetes.io/projected/f6f8cf75-a18c-45cb-a670-23651767b1bc-kube-api-access-g2m67\") pod \"f6f8cf75-a18c-45cb-a670-23651767b1bc\" (UID: \"f6f8cf75-a18c-45cb-a670-23651767b1bc\") " Apr 16 21:04:45.028095 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:45.028028 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6f8cf75-a18c-45cb-a670-23651767b1bc-bundle\") pod \"f6f8cf75-a18c-45cb-a670-23651767b1bc\" (UID: \"f6f8cf75-a18c-45cb-a670-23651767b1bc\") " Apr 16 21:04:45.028095 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:45.028063 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6f8cf75-a18c-45cb-a670-23651767b1bc-util\") pod \"f6f8cf75-a18c-45cb-a670-23651767b1bc\" (UID: \"f6f8cf75-a18c-45cb-a670-23651767b1bc\") " Apr 16 21:04:45.028871 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:45.028848 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6f8cf75-a18c-45cb-a670-23651767b1bc-bundle" (OuterVolumeSpecName: "bundle") pod "f6f8cf75-a18c-45cb-a670-23651767b1bc" (UID: "f6f8cf75-a18c-45cb-a670-23651767b1bc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:04:45.030128 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:45.030104 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f8cf75-a18c-45cb-a670-23651767b1bc-kube-api-access-g2m67" (OuterVolumeSpecName: "kube-api-access-g2m67") pod "f6f8cf75-a18c-45cb-a670-23651767b1bc" (UID: "f6f8cf75-a18c-45cb-a670-23651767b1bc"). InnerVolumeSpecName "kube-api-access-g2m67". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:04:45.037102 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:45.037072 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6f8cf75-a18c-45cb-a670-23651767b1bc-util" (OuterVolumeSpecName: "util") pod "f6f8cf75-a18c-45cb-a670-23651767b1bc" (UID: "f6f8cf75-a18c-45cb-a670-23651767b1bc"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:04:45.129099 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:45.129044 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g2m67\" (UniqueName: \"kubernetes.io/projected/f6f8cf75-a18c-45cb-a670-23651767b1bc-kube-api-access-g2m67\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:04:45.129099 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:45.129066 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6f8cf75-a18c-45cb-a670-23651767b1bc-bundle\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:04:45.129099 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:45.129077 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6f8cf75-a18c-45cb-a670-23651767b1bc-util\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:04:45.871520 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:45.871432 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" event={"ID":"f6f8cf75-a18c-45cb-a670-23651767b1bc","Type":"ContainerDied","Data":"ebd6216f738654da698e1f3cfdab94f269df33341112aaade2188f236c13004b"} Apr 16 21:04:45.871520 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:45.871466 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebd6216f738654da698e1f3cfdab94f269df33341112aaade2188f236c13004b" Apr 16 21:04:45.871520 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:45.871481 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cq6lh" Apr 16 21:04:50.262422 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.262383 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6"] Apr 16 21:04:50.263403 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.263371 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6f8cf75-a18c-45cb-a670-23651767b1bc" containerName="util" Apr 16 21:04:50.263403 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.263399 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f8cf75-a18c-45cb-a670-23651767b1bc" containerName="util" Apr 16 21:04:50.263564 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.263420 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6f8cf75-a18c-45cb-a670-23651767b1bc" containerName="extract" Apr 16 21:04:50.263564 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.263430 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f8cf75-a18c-45cb-a670-23651767b1bc" containerName="extract" Apr 16 21:04:50.263564 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.263450 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6f8cf75-a18c-45cb-a670-23651767b1bc" containerName="pull" Apr 16 21:04:50.263564 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.263458 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f8cf75-a18c-45cb-a670-23651767b1bc" containerName="pull" Apr 16 21:04:50.263771 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.263566 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6f8cf75-a18c-45cb-a670-23651767b1bc" containerName="extract" 
Apr 16 21:04:50.268515 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.268494 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" Apr 16 21:04:50.272676 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.272643 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 21:04:50.272794 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.272723 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 21:04:50.273823 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.273806 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-qvf78\"" Apr 16 21:04:50.291917 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.291895 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6"] Apr 16 21:04:50.372713 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.372680 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6\" (UID: \"aab1919b-ddda-451f-ad13-7a0d7f30d8fd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" Apr 16 21:04:50.372878 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.372732 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6\" (UID: \"aab1919b-ddda-451f-ad13-7a0d7f30d8fd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" Apr 16 21:04:50.372878 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.372826 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tldsv\" (UniqueName: \"kubernetes.io/projected/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-kube-api-access-tldsv\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6\" (UID: \"aab1919b-ddda-451f-ad13-7a0d7f30d8fd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" Apr 16 21:04:50.473958 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.473923 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6\" (UID: \"aab1919b-ddda-451f-ad13-7a0d7f30d8fd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" Apr 16 21:04:50.474124 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.474015 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tldsv\" (UniqueName: \"kubernetes.io/projected/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-kube-api-access-tldsv\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6\" (UID: \"aab1919b-ddda-451f-ad13-7a0d7f30d8fd\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" Apr 16 21:04:50.474124 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.474042 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6\" (UID: \"aab1919b-ddda-451f-ad13-7a0d7f30d8fd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" Apr 16 21:04:50.474336 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.474316 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6\" (UID: \"aab1919b-ddda-451f-ad13-7a0d7f30d8fd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" Apr 16 21:04:50.474374 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.474348 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6\" (UID: \"aab1919b-ddda-451f-ad13-7a0d7f30d8fd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" Apr 16 21:04:50.512939 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.512874 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tldsv\" (UniqueName: \"kubernetes.io/projected/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-kube-api-access-tldsv\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6\" (UID: \"aab1919b-ddda-451f-ad13-7a0d7f30d8fd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" Apr 16 21:04:50.578472 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.578437 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" Apr 16 21:04:50.729757 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.729733 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6"] Apr 16 21:04:50.732522 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:04:50.732479 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaab1919b_ddda_451f_ad13_7a0d7f30d8fd.slice/crio-16ca8a6f5f507f5e03a5d4d04f9ad02f393f106e37849569ccce083d4a2ea59a WatchSource:0}: Error finding container 16ca8a6f5f507f5e03a5d4d04f9ad02f393f106e37849569ccce083d4a2ea59a: Status 404 returned error can't find the container with id 16ca8a6f5f507f5e03a5d4d04f9ad02f393f106e37849569ccce083d4a2ea59a Apr 16 21:04:50.889020 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.888974 2579 generic.go:358] "Generic (PLEG): container finished" podID="aab1919b-ddda-451f-ad13-7a0d7f30d8fd" containerID="b7d41ce9ff11aede0756d509cb983dba3ef236779665381321156f913ed41101" exitCode=0 Apr 16 21:04:50.889150 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.889057 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" event={"ID":"aab1919b-ddda-451f-ad13-7a0d7f30d8fd","Type":"ContainerDied","Data":"b7d41ce9ff11aede0756d509cb983dba3ef236779665381321156f913ed41101"} Apr 16 21:04:50.889150 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:50.889091 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" event={"ID":"aab1919b-ddda-451f-ad13-7a0d7f30d8fd","Type":"ContainerStarted","Data":"16ca8a6f5f507f5e03a5d4d04f9ad02f393f106e37849569ccce083d4a2ea59a"} Apr 16 21:04:51.853737 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:51.853653 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-c5b769f8c-fm6ps" Apr 16 21:04:51.895337 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:51.895303 2579 generic.go:358] "Generic (PLEG): container finished" podID="aab1919b-ddda-451f-ad13-7a0d7f30d8fd" containerID="c960a1183251c1d4697d34617a2d63d5930d587531f7fa195b72d9e25457b610" exitCode=0 Apr 16 21:04:51.895481 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:51.895352 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" event={"ID":"aab1919b-ddda-451f-ad13-7a0d7f30d8fd","Type":"ContainerDied","Data":"c960a1183251c1d4697d34617a2d63d5930d587531f7fa195b72d9e25457b610"} Apr 16 21:04:52.901114 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:52.901083 2579 generic.go:358] "Generic (PLEG): container finished" podID="aab1919b-ddda-451f-ad13-7a0d7f30d8fd" containerID="967f84371c200af835babb2ced3f97c4e9f1bdb4e1624056f03d128297f461b2" exitCode=0 Apr 16 21:04:52.901473 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:52.901201 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" event={"ID":"aab1919b-ddda-451f-ad13-7a0d7f30d8fd","Type":"ContainerDied","Data":"967f84371c200af835babb2ced3f97c4e9f1bdb4e1624056f03d128297f461b2"} Apr 16 21:04:54.023531 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:54.023508 2579 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" Apr 16 21:04:54.105202 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:54.105173 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-util\") pod \"aab1919b-ddda-451f-ad13-7a0d7f30d8fd\" (UID: \"aab1919b-ddda-451f-ad13-7a0d7f30d8fd\") " Apr 16 21:04:54.105354 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:54.105214 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-bundle\") pod \"aab1919b-ddda-451f-ad13-7a0d7f30d8fd\" (UID: \"aab1919b-ddda-451f-ad13-7a0d7f30d8fd\") " Apr 16 21:04:54.105354 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:54.105255 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tldsv\" (UniqueName: \"kubernetes.io/projected/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-kube-api-access-tldsv\") pod \"aab1919b-ddda-451f-ad13-7a0d7f30d8fd\" (UID: \"aab1919b-ddda-451f-ad13-7a0d7f30d8fd\") " Apr 16 21:04:54.106177 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:54.106150 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-bundle" (OuterVolumeSpecName: "bundle") pod "aab1919b-ddda-451f-ad13-7a0d7f30d8fd" (UID: "aab1919b-ddda-451f-ad13-7a0d7f30d8fd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:04:54.107572 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:54.107543 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-kube-api-access-tldsv" (OuterVolumeSpecName: "kube-api-access-tldsv") pod "aab1919b-ddda-451f-ad13-7a0d7f30d8fd" (UID: "aab1919b-ddda-451f-ad13-7a0d7f30d8fd"). InnerVolumeSpecName "kube-api-access-tldsv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:04:54.110612 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:54.110578 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-util" (OuterVolumeSpecName: "util") pod "aab1919b-ddda-451f-ad13-7a0d7f30d8fd" (UID: "aab1919b-ddda-451f-ad13-7a0d7f30d8fd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:04:54.206789 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:54.206702 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-util\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:04:54.206789 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:54.206732 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-bundle\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:04:54.206789 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:54.206745 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tldsv\" (UniqueName: \"kubernetes.io/projected/aab1919b-ddda-451f-ad13-7a0d7f30d8fd-kube-api-access-tldsv\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:04:54.910175 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:54.910136 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" event={"ID":"aab1919b-ddda-451f-ad13-7a0d7f30d8fd","Type":"ContainerDied","Data":"16ca8a6f5f507f5e03a5d4d04f9ad02f393f106e37849569ccce083d4a2ea59a"} Apr 16 21:04:54.910175 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:54.910179 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16ca8a6f5f507f5e03a5d4d04f9ad02f393f106e37849569ccce083d4a2ea59a" Apr 16 21:04:54.910387 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:04:54.910157 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2v89n6" Apr 16 21:05:22.660403 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.660368 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n"] Apr 16 21:05:22.660853 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.660829 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aab1919b-ddda-451f-ad13-7a0d7f30d8fd" containerName="pull" Apr 16 21:05:22.660853 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.660844 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab1919b-ddda-451f-ad13-7a0d7f30d8fd" containerName="pull" Apr 16 21:05:22.660924 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.660856 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aab1919b-ddda-451f-ad13-7a0d7f30d8fd" containerName="extract" Apr 16 21:05:22.660924 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.660864 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab1919b-ddda-451f-ad13-7a0d7f30d8fd" containerName="extract" Apr 16 21:05:22.660924 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.660898 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aab1919b-ddda-451f-ad13-7a0d7f30d8fd" containerName="util" Apr 16 21:05:22.660924 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.660907 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab1919b-ddda-451f-ad13-7a0d7f30d8fd" containerName="util" Apr 16 21:05:22.661088 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.661012 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="aab1919b-ddda-451f-ad13-7a0d7f30d8fd" containerName="extract" Apr 16 
21:05:22.663928 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.663909 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.666861 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.666837 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-gvf44\"" Apr 16 21:05:22.667008 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.666896 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 21:05:22.675848 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.675824 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n"] Apr 16 21:05:22.749346 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.749316 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.749487 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.749363 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.749487 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.749381 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.749487 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.749400 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.749487 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.749416 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.749487 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.749433 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8fjh\" (UniqueName: \"kubernetes.io/projected/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-kube-api-access-t8fjh\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.749648 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.749522 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.749648 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.749554 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.749648 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.749576 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.850313 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.850277 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.850313 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.850316 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.850525 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.850340 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.850525 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.850380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.850525 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.850434 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.850525 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.850458 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.850525 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.850487 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.850525 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.850512 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.850831 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.850539 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8fjh\" (UniqueName: \"kubernetes.io/projected/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-kube-api-access-t8fjh\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.850831 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.850684 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.850831 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.850750 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-workload-socket\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.851084 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.851062 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.851158 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.851142 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.851286 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.851267 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.852932 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.852903 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.853205 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.853185 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.858897 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.858871 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8fjh\" (UniqueName: \"kubernetes.io/projected/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-kube-api-access-t8fjh\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:22.858978 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.858933 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n\" (UID: \"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 
21:05:22.977136 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:22.977113 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:23.106088 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:23.106059 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n"] Apr 16 21:05:23.108058 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:05:23.108032 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63910eb9_4466_41f0_b9ee_6fe0d7ebfb0b.slice/crio-a6e5f71f4c76e730c19dc46982029e231fbc72c2bdd8244f6a908029baf9abdd WatchSource:0}: Error finding container a6e5f71f4c76e730c19dc46982029e231fbc72c2bdd8244f6a908029baf9abdd: Status 404 returned error can't find the container with id a6e5f71f4c76e730c19dc46982029e231fbc72c2bdd8244f6a908029baf9abdd Apr 16 21:05:24.010457 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:24.010416 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" event={"ID":"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b","Type":"ContainerStarted","Data":"a6e5f71f4c76e730c19dc46982029e231fbc72c2bdd8244f6a908029baf9abdd"} Apr 16 21:05:25.938542 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:25.938494 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 21:05:25.938878 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:25.938583 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 21:05:25.938878 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:25.938614 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 21:05:26.019925 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:26.019889 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" event={"ID":"63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b","Type":"ContainerStarted","Data":"702b316537661d6bdba30eb140f83206976980f9c674ad953ee1098a63f073eb"} Apr 16 21:05:26.045476 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:26.045412 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" podStartSLOduration=1.217125857 podStartE2EDuration="4.045393117s" podCreationTimestamp="2026-04-16 21:05:22 +0000 UTC" firstStartedPulling="2026-04-16 21:05:23.109919339 +0000 UTC m=+426.128230164" lastFinishedPulling="2026-04-16 21:05:25.938186583 +0000 UTC m=+428.956497424" observedRunningTime="2026-04-16 21:05:26.040870389 +0000 UTC m=+429.059181237" watchObservedRunningTime="2026-04-16 21:05:26.045393117 +0000 UTC m=+429.063703965" Apr 16 21:05:26.977327 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:26.977288 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:26.981866 ip-10-0-139-17 kubenswrapper[2579]: I0416 
21:05:26.981841 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:27.023400 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:27.023368 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:27.024109 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:27.024092 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n" Apr 16 21:05:41.596459 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:41.596380 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-qr96j"] Apr 16 21:05:41.599798 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:41.599778 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-qr96j" Apr 16 21:05:41.602623 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:41.602600 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 21:05:41.603875 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:41.603856 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-ls9n5\"" Apr 16 21:05:41.603936 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:41.603856 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 21:05:41.608458 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:41.608370 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-qr96j"] Apr 16 21:05:41.711831 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:41.711807 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w84fv\" (UniqueName: \"kubernetes.io/projected/7f4714d1-1250-427f-89aa-332f9ecbbfff-kube-api-access-w84fv\") pod \"kuadrant-operator-catalog-qr96j\" (UID: \"7f4714d1-1250-427f-89aa-332f9ecbbfff\") " pod="kuadrant-system/kuadrant-operator-catalog-qr96j" Apr 16 21:05:41.813188 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:41.813155 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w84fv\" (UniqueName: \"kubernetes.io/projected/7f4714d1-1250-427f-89aa-332f9ecbbfff-kube-api-access-w84fv\") pod \"kuadrant-operator-catalog-qr96j\" (UID: \"7f4714d1-1250-427f-89aa-332f9ecbbfff\") " pod="kuadrant-system/kuadrant-operator-catalog-qr96j" Apr 16 21:05:41.821856 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:41.821826 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w84fv\" (UniqueName: \"kubernetes.io/projected/7f4714d1-1250-427f-89aa-332f9ecbbfff-kube-api-access-w84fv\") pod \"kuadrant-operator-catalog-qr96j\" (UID: \"7f4714d1-1250-427f-89aa-332f9ecbbfff\") " pod="kuadrant-system/kuadrant-operator-catalog-qr96j" Apr 16 21:05:41.910139 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:41.910062 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-qr96j" Apr 16 21:05:41.960829 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:41.960799 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-qr96j"] Apr 16 21:05:42.034753 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:42.034632 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-qr96j"] Apr 16 21:05:42.037413 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:05:42.037389 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f4714d1_1250_427f_89aa_332f9ecbbfff.slice/crio-31fa7e7fa0ffb98b2496db8ef7b11dfee24eb2c4a3f9a7375a062c7fd5211a07 WatchSource:0}: Error finding container 31fa7e7fa0ffb98b2496db8ef7b11dfee24eb2c4a3f9a7375a062c7fd5211a07: Status 404 returned error can't find the container with id 31fa7e7fa0ffb98b2496db8ef7b11dfee24eb2c4a3f9a7375a062c7fd5211a07 Apr 16 21:05:42.078801 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:42.078774 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-qr96j" event={"ID":"7f4714d1-1250-427f-89aa-332f9ecbbfff","Type":"ContainerStarted","Data":"31fa7e7fa0ffb98b2496db8ef7b11dfee24eb2c4a3f9a7375a062c7fd5211a07"} Apr 16 21:05:42.169130 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:42.169054 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-67xvg"] Apr 16 21:05:42.173112 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:42.173094 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-67xvg" Apr 16 21:05:42.184167 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:42.184011 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-67xvg"] Apr 16 21:05:42.217239 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:42.217206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hb55\" (UniqueName: \"kubernetes.io/projected/26c1940d-abea-402d-87f0-74283dd29012-kube-api-access-2hb55\") pod \"kuadrant-operator-catalog-67xvg\" (UID: \"26c1940d-abea-402d-87f0-74283dd29012\") " pod="kuadrant-system/kuadrant-operator-catalog-67xvg" Apr 16 21:05:42.318485 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:42.318456 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hb55\" (UniqueName: \"kubernetes.io/projected/26c1940d-abea-402d-87f0-74283dd29012-kube-api-access-2hb55\") pod \"kuadrant-operator-catalog-67xvg\" (UID: \"26c1940d-abea-402d-87f0-74283dd29012\") " pod="kuadrant-system/kuadrant-operator-catalog-67xvg" Apr 16 21:05:42.327454 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:42.327424 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hb55\" (UniqueName: \"kubernetes.io/projected/26c1940d-abea-402d-87f0-74283dd29012-kube-api-access-2hb55\") pod \"kuadrant-operator-catalog-67xvg\" (UID: \"26c1940d-abea-402d-87f0-74283dd29012\") " pod="kuadrant-system/kuadrant-operator-catalog-67xvg" Apr 16 21:05:42.488154 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:42.488111 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-67xvg" Apr 16 21:05:42.609926 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:42.609900 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-67xvg"] Apr 16 21:05:42.611664 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:05:42.611637 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26c1940d_abea_402d_87f0_74283dd29012.slice/crio-6fc047e88432565db8fd9ea4be04774a4f9d79d992826f01633e865850ab0390 WatchSource:0}: Error finding container 6fc047e88432565db8fd9ea4be04774a4f9d79d992826f01633e865850ab0390: Status 404 returned error can't find the container with id 6fc047e88432565db8fd9ea4be04774a4f9d79d992826f01633e865850ab0390 Apr 16 21:05:43.083800 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:43.083766 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-67xvg" event={"ID":"26c1940d-abea-402d-87f0-74283dd29012","Type":"ContainerStarted","Data":"6fc047e88432565db8fd9ea4be04774a4f9d79d992826f01633e865850ab0390"} Apr 16 21:05:45.093068 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:45.093023 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-qr96j" event={"ID":"7f4714d1-1250-427f-89aa-332f9ecbbfff","Type":"ContainerStarted","Data":"e7312c00f05797387d12d595dda916dc717ed22cb57fdac7e762bf82e6c4860c"} Apr 16 21:05:45.093537 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:45.093106 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-qr96j" podUID="7f4714d1-1250-427f-89aa-332f9ecbbfff" containerName="registry-server" containerID="cri-o://e7312c00f05797387d12d595dda916dc717ed22cb57fdac7e762bf82e6c4860c" gracePeriod=2 Apr 16 21:05:45.094657 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:45.094631 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-67xvg" event={"ID":"26c1940d-abea-402d-87f0-74283dd29012","Type":"ContainerStarted","Data":"091e92f0291f5f888496a5ba43c03e392d18c9889472fa1a879992eb518b9bc2"} Apr 16 21:05:45.109858 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:45.109821 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-qr96j" podStartSLOduration=1.926785808 podStartE2EDuration="4.109809552s" podCreationTimestamp="2026-04-16 21:05:41 +0000 UTC" firstStartedPulling="2026-04-16 21:05:42.038954239 +0000 UTC m=+445.057265079" lastFinishedPulling="2026-04-16 21:05:44.221977992 +0000 UTC m=+447.240288823" observedRunningTime="2026-04-16 21:05:45.108477598 +0000 UTC m=+448.126788445" watchObservedRunningTime="2026-04-16 21:05:45.109809552 +0000 UTC m=+448.128120398" Apr 16 21:05:45.124950 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:45.124912 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-67xvg" podStartSLOduration=1.513250866 podStartE2EDuration="3.124896317s" podCreationTimestamp="2026-04-16 21:05:42 +0000 UTC" firstStartedPulling="2026-04-16 21:05:42.613023697 +0000 UTC m=+445.631334522" lastFinishedPulling="2026-04-16 21:05:44.224669135 +0000 UTC m=+447.242979973" observedRunningTime="2026-04-16 21:05:45.122656036 +0000 UTC m=+448.140966907" watchObservedRunningTime="2026-04-16 21:05:45.124896317 +0000 UTC m=+448.143207164" Apr 16 
21:05:45.325382 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:45.325357 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-qr96j" Apr 16 21:05:45.445852 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:45.445750 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w84fv\" (UniqueName: \"kubernetes.io/projected/7f4714d1-1250-427f-89aa-332f9ecbbfff-kube-api-access-w84fv\") pod \"7f4714d1-1250-427f-89aa-332f9ecbbfff\" (UID: \"7f4714d1-1250-427f-89aa-332f9ecbbfff\") " Apr 16 21:05:45.448328 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:45.448290 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f4714d1-1250-427f-89aa-332f9ecbbfff-kube-api-access-w84fv" (OuterVolumeSpecName: "kube-api-access-w84fv") pod "7f4714d1-1250-427f-89aa-332f9ecbbfff" (UID: "7f4714d1-1250-427f-89aa-332f9ecbbfff"). InnerVolumeSpecName "kube-api-access-w84fv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:05:45.546848 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:45.546818 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w84fv\" (UniqueName: \"kubernetes.io/projected/7f4714d1-1250-427f-89aa-332f9ecbbfff-kube-api-access-w84fv\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:05:46.099712 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:46.099679 2579 generic.go:358] "Generic (PLEG): container finished" podID="7f4714d1-1250-427f-89aa-332f9ecbbfff" containerID="e7312c00f05797387d12d595dda916dc717ed22cb57fdac7e762bf82e6c4860c" exitCode=0 Apr 16 21:05:46.100183 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:46.099735 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-qr96j" Apr 16 21:05:46.100183 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:46.099765 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-qr96j" event={"ID":"7f4714d1-1250-427f-89aa-332f9ecbbfff","Type":"ContainerDied","Data":"e7312c00f05797387d12d595dda916dc717ed22cb57fdac7e762bf82e6c4860c"} Apr 16 21:05:46.100183 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:46.099805 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-qr96j" event={"ID":"7f4714d1-1250-427f-89aa-332f9ecbbfff","Type":"ContainerDied","Data":"31fa7e7fa0ffb98b2496db8ef7b11dfee24eb2c4a3f9a7375a062c7fd5211a07"} Apr 16 21:05:46.100183 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:46.099826 2579 scope.go:117] "RemoveContainer" containerID="e7312c00f05797387d12d595dda916dc717ed22cb57fdac7e762bf82e6c4860c" Apr 16 21:05:46.108964 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:46.108955 2579 scope.go:117] "RemoveContainer" containerID="e7312c00f05797387d12d595dda916dc717ed22cb57fdac7e762bf82e6c4860c" Apr 16 21:05:46.109254 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:05:46.109234 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7312c00f05797387d12d595dda916dc717ed22cb57fdac7e762bf82e6c4860c\": container with ID starting with e7312c00f05797387d12d595dda916dc717ed22cb57fdac7e762bf82e6c4860c not found: ID does not exist" containerID="e7312c00f05797387d12d595dda916dc717ed22cb57fdac7e762bf82e6c4860c" Apr 16 21:05:46.109303 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:46.109261 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7312c00f05797387d12d595dda916dc717ed22cb57fdac7e762bf82e6c4860c"} err="failed to get container status \"e7312c00f05797387d12d595dda916dc717ed22cb57fdac7e762bf82e6c4860c\": rpc error: code = NotFound desc = could not find container \"e7312c00f05797387d12d595dda916dc717ed22cb57fdac7e762bf82e6c4860c\": container with ID starting with e7312c00f05797387d12d595dda916dc717ed22cb57fdac7e762bf82e6c4860c not found: ID does not exist" Apr 16 21:05:46.117280 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:46.117256 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-qr96j"] Apr 16 21:05:46.120895 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:46.120875 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-qr96j"] Apr 16 21:05:47.571879 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:47.571846 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f4714d1-1250-427f-89aa-332f9ecbbfff" path="/var/lib/kubelet/pods/7f4714d1-1250-427f-89aa-332f9ecbbfff/volumes" Apr 16 21:05:52.488658 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:52.488616 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-67xvg" Apr 16 21:05:52.488658 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:52.488667 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-67xvg" Apr 16 21:05:52.510123 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:52.510099 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-67xvg" Apr 16 21:05:53.146240 
ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:53.146215 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-67xvg" Apr 16 21:05:56.942299 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:56.942263 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt"] Apr 16 21:05:56.942710 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:56.942592 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f4714d1-1250-427f-89aa-332f9ecbbfff" containerName="registry-server" Apr 16 21:05:56.942710 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:56.942603 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4714d1-1250-427f-89aa-332f9ecbbfff" containerName="registry-server" Apr 16 21:05:56.942710 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:56.942668 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f4714d1-1250-427f-89aa-332f9ecbbfff" containerName="registry-server" Apr 16 21:05:56.947271 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:56.947253 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" Apr 16 21:05:56.950173 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:56.950154 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-ljsfg\"" Apr 16 21:05:56.955581 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:56.955550 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt"] Apr 16 21:05:57.040839 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.040800 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhwqv\" (UniqueName: \"kubernetes.io/projected/928ad48f-db3c-450a-952c-5ab71bd07fd5-kube-api-access-bhwqv\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt\" (UID: \"928ad48f-db3c-450a-952c-5ab71bd07fd5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" Apr 16 21:05:57.040839 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.040846 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/928ad48f-db3c-450a-952c-5ab71bd07fd5-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt\" (UID: \"928ad48f-db3c-450a-952c-5ab71bd07fd5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" Apr 16 21:05:57.041082 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.040887 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/928ad48f-db3c-450a-952c-5ab71bd07fd5-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt\" (UID: \"928ad48f-db3c-450a-952c-5ab71bd07fd5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" Apr 16 21:05:57.141503 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.141471 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhwqv\" (UniqueName: \"kubernetes.io/projected/928ad48f-db3c-450a-952c-5ab71bd07fd5-kube-api-access-bhwqv\") pod 
\"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt\" (UID: \"928ad48f-db3c-450a-952c-5ab71bd07fd5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" Apr 16 21:05:57.141503 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.141508 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/928ad48f-db3c-450a-952c-5ab71bd07fd5-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt\" (UID: \"928ad48f-db3c-450a-952c-5ab71bd07fd5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" Apr 16 21:05:57.141707 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.141636 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/928ad48f-db3c-450a-952c-5ab71bd07fd5-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt\" (UID: \"928ad48f-db3c-450a-952c-5ab71bd07fd5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" Apr 16 21:05:57.141923 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.141903 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/928ad48f-db3c-450a-952c-5ab71bd07fd5-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt\" (UID: \"928ad48f-db3c-450a-952c-5ab71bd07fd5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" Apr 16 21:05:57.142033 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.142019 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/928ad48f-db3c-450a-952c-5ab71bd07fd5-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt\" (UID: \"928ad48f-db3c-450a-952c-5ab71bd07fd5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" Apr 16 21:05:57.152479 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.152453 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhwqv\" (UniqueName: \"kubernetes.io/projected/928ad48f-db3c-450a-952c-5ab71bd07fd5-kube-api-access-bhwqv\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt\" (UID: \"928ad48f-db3c-450a-952c-5ab71bd07fd5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" Apr 16 21:05:57.258224 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.258200 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" Apr 16 21:05:57.405941 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:05:57.405904 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod928ad48f_db3c_450a_952c_5ab71bd07fd5.slice/crio-69d371cd903a6b1c73dcf567bfa551fddd5630877edb5f0ffb22fe48c03569d7 WatchSource:0}: Error finding container 69d371cd903a6b1c73dcf567bfa551fddd5630877edb5f0ffb22fe48c03569d7: Status 404 returned error can't find the container with id 69d371cd903a6b1c73dcf567bfa551fddd5630877edb5f0ffb22fe48c03569d7 Apr 16 21:05:57.408794 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.408759 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt"] Apr 16 21:05:57.546313 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.546285 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2"] Apr 16 21:05:57.549694 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.549679 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" Apr 16 21:05:57.559090 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.559051 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2"] Apr 16 21:05:57.645780 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.645745 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w27nq\" (UniqueName: \"kubernetes.io/projected/fff5c780-0c3c-4f84-ab04-1411873d94a8-kube-api-access-w27nq\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2\" (UID: \"fff5c780-0c3c-4f84-ab04-1411873d94a8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" Apr 16 21:05:57.645970 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.645786 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fff5c780-0c3c-4f84-ab04-1411873d94a8-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2\" (UID: \"fff5c780-0c3c-4f84-ab04-1411873d94a8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" Apr 16 21:05:57.645970 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.645812 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fff5c780-0c3c-4f84-ab04-1411873d94a8-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2\" (UID: \"fff5c780-0c3c-4f84-ab04-1411873d94a8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" Apr 16 21:05:57.746509 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.746481 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w27nq\" (UniqueName: \"kubernetes.io/projected/fff5c780-0c3c-4f84-ab04-1411873d94a8-kube-api-access-w27nq\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2\" (UID: \"fff5c780-0c3c-4f84-ab04-1411873d94a8\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" Apr 16 21:05:57.746642 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.746516 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fff5c780-0c3c-4f84-ab04-1411873d94a8-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2\" (UID: \"fff5c780-0c3c-4f84-ab04-1411873d94a8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" Apr 16 21:05:57.746642 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.746532 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fff5c780-0c3c-4f84-ab04-1411873d94a8-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2\" (UID: \"fff5c780-0c3c-4f84-ab04-1411873d94a8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" Apr 16 21:05:57.746886 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.746868 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fff5c780-0c3c-4f84-ab04-1411873d94a8-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2\" (UID: \"fff5c780-0c3c-4f84-ab04-1411873d94a8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" Apr 16 21:05:57.746886 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.746881 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fff5c780-0c3c-4f84-ab04-1411873d94a8-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2\" (UID: \"fff5c780-0c3c-4f84-ab04-1411873d94a8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" Apr 16 21:05:57.755772 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.755742 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w27nq\" (UniqueName: \"kubernetes.io/projected/fff5c780-0c3c-4f84-ab04-1411873d94a8-kube-api-access-w27nq\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2\" (UID: \"fff5c780-0c3c-4f84-ab04-1411873d94a8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" Apr 16 21:05:57.875092 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.875001 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" Apr 16 21:05:57.997488 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:57.997460 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2"] Apr 16 21:05:57.999121 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:05:57.999096 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfff5c780_0c3c_4f84_ab04_1411873d94a8.slice/crio-ea7639b598d962ae74fd39518400b3e5419d13fd9ca18b1cda8c9eab85324d77 WatchSource:0}: Error finding container ea7639b598d962ae74fd39518400b3e5419d13fd9ca18b1cda8c9eab85324d77: Status 404 returned error can't find the container with id ea7639b598d962ae74fd39518400b3e5419d13fd9ca18b1cda8c9eab85324d77 Apr 16 21:05:58.143379 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.143298 2579 generic.go:358] "Generic (PLEG): container finished" podID="928ad48f-db3c-450a-952c-5ab71bd07fd5" containerID="6d666c2f76852e6c9f145a8044d8c604f4c299c8282a71cd39e866bc305f39e7" exitCode=0 Apr 16 21:05:58.143519 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.143392 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" event={"ID":"928ad48f-db3c-450a-952c-5ab71bd07fd5","Type":"ContainerDied","Data":"6d666c2f76852e6c9f145a8044d8c604f4c299c8282a71cd39e866bc305f39e7"} Apr 16 21:05:58.143519 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.143418 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" event={"ID":"928ad48f-db3c-450a-952c-5ab71bd07fd5","Type":"ContainerStarted","Data":"69d371cd903a6b1c73dcf567bfa551fddd5630877edb5f0ffb22fe48c03569d7"} Apr 16 21:05:58.151113 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.151065 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n"] Apr 16 21:05:58.152522 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.152491 2579 generic.go:358] "Generic (PLEG): container finished" podID="fff5c780-0c3c-4f84-ab04-1411873d94a8" containerID="df6c5de8d64bb37c8ecef82662041f280c9f7ccc99cebd89253dc34351db50c6" exitCode=0 Apr 16 21:05:58.154943 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.154923 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" event={"ID":"fff5c780-0c3c-4f84-ab04-1411873d94a8","Type":"ContainerDied","Data":"df6c5de8d64bb37c8ecef82662041f280c9f7ccc99cebd89253dc34351db50c6"} Apr 16 21:05:58.155026 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.154948 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" event={"ID":"fff5c780-0c3c-4f84-ab04-1411873d94a8","Type":"ContainerStarted","Data":"ea7639b598d962ae74fd39518400b3e5419d13fd9ca18b1cda8c9eab85324d77"} Apr 16 21:05:58.155078 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.155048 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" Apr 16 21:05:58.160357 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.160337 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n"] Apr 16 21:05:58.250293 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.250262 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfd6eb27-4070-4d8b-a5f5-2785164f315d-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n\" (UID: \"cfd6eb27-4070-4d8b-a5f5-2785164f315d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" Apr 16 21:05:58.250505 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.250490 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfd6eb27-4070-4d8b-a5f5-2785164f315d-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n\" (UID: \"cfd6eb27-4070-4d8b-a5f5-2785164f315d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" Apr 16 21:05:58.250579 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.250539 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5bms\" (UniqueName: \"kubernetes.io/projected/cfd6eb27-4070-4d8b-a5f5-2785164f315d-kube-api-access-d5bms\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n\" (UID: \"cfd6eb27-4070-4d8b-a5f5-2785164f315d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" Apr 16 21:05:58.351268 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.351241 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfd6eb27-4070-4d8b-a5f5-2785164f315d-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n\" (UID: \"cfd6eb27-4070-4d8b-a5f5-2785164f315d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" Apr 16 21:05:58.351371 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.351309 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfd6eb27-4070-4d8b-a5f5-2785164f315d-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n\" (UID: \"cfd6eb27-4070-4d8b-a5f5-2785164f315d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" Apr 16 21:05:58.351371 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.351337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5bms\" (UniqueName: \"kubernetes.io/projected/cfd6eb27-4070-4d8b-a5f5-2785164f315d-kube-api-access-d5bms\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n\" (UID: \"cfd6eb27-4070-4d8b-a5f5-2785164f315d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" Apr 16 21:05:58.351577 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.351561 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfd6eb27-4070-4d8b-a5f5-2785164f315d-bundle\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n\" (UID: \"cfd6eb27-4070-4d8b-a5f5-2785164f315d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" Apr 16 21:05:58.351647 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.351629 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfd6eb27-4070-4d8b-a5f5-2785164f315d-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n\" (UID: \"cfd6eb27-4070-4d8b-a5f5-2785164f315d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" Apr 16 21:05:58.360464 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.360439 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5bms\" (UniqueName: \"kubernetes.io/projected/cfd6eb27-4070-4d8b-a5f5-2785164f315d-kube-api-access-d5bms\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n\" (UID: \"cfd6eb27-4070-4d8b-a5f5-2785164f315d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" Apr 16 21:05:58.475485 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.475459 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" Apr 16 21:05:58.553701 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.551627 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp"] Apr 16 21:05:58.557333 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.557311 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" Apr 16 21:05:58.564420 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.564400 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp"] Apr 16 21:05:58.602060 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.602034 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n"] Apr 16 21:05:58.603334 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:05:58.603312 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfd6eb27_4070_4d8b_a5f5_2785164f315d.slice/crio-149dc67ea7740309ca44cccbf470c39d964883e9b9839e9d39aea589330201e2 WatchSource:0}: Error finding container 149dc67ea7740309ca44cccbf470c39d964883e9b9839e9d39aea589330201e2: Status 404 returned error can't find the container with id 149dc67ea7740309ca44cccbf470c39d964883e9b9839e9d39aea589330201e2 Apr 16 21:05:58.654788 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.654761 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a9997b85-3b6f-4d24-a986-35661e550af2-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp\" (UID: \"a9997b85-3b6f-4d24-a986-35661e550af2\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" Apr 16 21:05:58.654903 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.654816 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6fqn\" 
(UniqueName: \"kubernetes.io/projected/a9997b85-3b6f-4d24-a986-35661e550af2-kube-api-access-s6fqn\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp\" (UID: \"a9997b85-3b6f-4d24-a986-35661e550af2\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" Apr 16 21:05:58.654967 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.654907 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a9997b85-3b6f-4d24-a986-35661e550af2-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp\" (UID: \"a9997b85-3b6f-4d24-a986-35661e550af2\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" Apr 16 21:05:58.755584 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.755499 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a9997b85-3b6f-4d24-a986-35661e550af2-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp\" (UID: \"a9997b85-3b6f-4d24-a986-35661e550af2\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" Apr 16 21:05:58.755755 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.755632 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a9997b85-3b6f-4d24-a986-35661e550af2-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp\" (UID: \"a9997b85-3b6f-4d24-a986-35661e550af2\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" Apr 16 21:05:58.755755 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.755678 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6fqn\" (UniqueName: \"kubernetes.io/projected/a9997b85-3b6f-4d24-a986-35661e550af2-kube-api-access-s6fqn\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp\" (UID: \"a9997b85-3b6f-4d24-a986-35661e550af2\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" Apr 16 21:05:58.755890 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.755866 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a9997b85-3b6f-4d24-a986-35661e550af2-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp\" (UID: \"a9997b85-3b6f-4d24-a986-35661e550af2\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" Apr 16 21:05:58.755956 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.755905 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a9997b85-3b6f-4d24-a986-35661e550af2-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp\" (UID: \"a9997b85-3b6f-4d24-a986-35661e550af2\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" Apr 16 21:05:58.765677 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.765655 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6fqn\" (UniqueName: \"kubernetes.io/projected/a9997b85-3b6f-4d24-a986-35661e550af2-kube-api-access-s6fqn\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp\" (UID: \"a9997b85-3b6f-4d24-a986-35661e550af2\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" Apr 16 21:05:58.869453 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:58.869421 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" Apr 16 21:05:59.061925 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:59.061900 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp"] Apr 16 21:05:59.084227 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:05:59.084199 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9997b85_3b6f_4d24_a986_35661e550af2.slice/crio-969bb310b32224ded28180786cb7c1ebf2515413180c06b7039427231b645720 WatchSource:0}: Error finding container 969bb310b32224ded28180786cb7c1ebf2515413180c06b7039427231b645720: Status 404 returned error can't find the container with id 969bb310b32224ded28180786cb7c1ebf2515413180c06b7039427231b645720 Apr 16 21:05:59.157824 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:59.157799 2579 generic.go:358] "Generic (PLEG): container finished" podID="fff5c780-0c3c-4f84-ab04-1411873d94a8" containerID="80d48985cd1ee6e2de632b6e366ef152676448b10ad42e8d74a4f01a432e43b4" exitCode=0 Apr 16 21:05:59.157941 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:59.157892 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" event={"ID":"fff5c780-0c3c-4f84-ab04-1411873d94a8","Type":"ContainerDied","Data":"80d48985cd1ee6e2de632b6e366ef152676448b10ad42e8d74a4f01a432e43b4"} Apr 16 21:05:59.159253 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:59.159227 2579 generic.go:358] "Generic (PLEG): container finished" podID="cfd6eb27-4070-4d8b-a5f5-2785164f315d" containerID="5b5b3baf1025343bfa210867972c97c19100a4915a476691f9fda9f5152ec2ed" exitCode=0 Apr 16 21:05:59.159358 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:59.159308 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" event={"ID":"cfd6eb27-4070-4d8b-a5f5-2785164f315d","Type":"ContainerDied","Data":"5b5b3baf1025343bfa210867972c97c19100a4915a476691f9fda9f5152ec2ed"} Apr 16 21:05:59.159358 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:59.159342 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" event={"ID":"cfd6eb27-4070-4d8b-a5f5-2785164f315d","Type":"ContainerStarted","Data":"149dc67ea7740309ca44cccbf470c39d964883e9b9839e9d39aea589330201e2"} Apr 16 21:05:59.160527 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:05:59.160495 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" event={"ID":"a9997b85-3b6f-4d24-a986-35661e550af2","Type":"ContainerStarted","Data":"969bb310b32224ded28180786cb7c1ebf2515413180c06b7039427231b645720"} Apr 16 21:06:00.173022 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:00.172918 2579 generic.go:358] "Generic (PLEG): container finished" podID="cfd6eb27-4070-4d8b-a5f5-2785164f315d" containerID="9e7a2c0648f606cd59f60c2fca4ef47a124035a936686ff0b2e7fbb95d07e96c" exitCode=0 Apr 16 21:06:00.173380 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:00.173021 2579 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" event={"ID":"cfd6eb27-4070-4d8b-a5f5-2785164f315d","Type":"ContainerDied","Data":"9e7a2c0648f606cd59f60c2fca4ef47a124035a936686ff0b2e7fbb95d07e96c"} Apr 16 21:06:00.174413 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:00.174385 2579 generic.go:358] "Generic (PLEG): container finished" podID="a9997b85-3b6f-4d24-a986-35661e550af2" containerID="28280bb680531acc591065599b662df382daef5ef1b4a372d77147a40181cc63" exitCode=0 Apr 16 21:06:00.174513 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:00.174463 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" event={"ID":"a9997b85-3b6f-4d24-a986-35661e550af2","Type":"ContainerDied","Data":"28280bb680531acc591065599b662df382daef5ef1b4a372d77147a40181cc63"} Apr 16 21:06:00.176252 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:00.176229 2579 generic.go:358] "Generic (PLEG): container finished" podID="928ad48f-db3c-450a-952c-5ab71bd07fd5" containerID="e5a61a815b498f5185bb4ebb6c269fd1ff79ffef7057d9d3e6c67c3e3f8698fd" exitCode=0 Apr 16 21:06:00.176333 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:00.176272 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" event={"ID":"928ad48f-db3c-450a-952c-5ab71bd07fd5","Type":"ContainerDied","Data":"e5a61a815b498f5185bb4ebb6c269fd1ff79ffef7057d9d3e6c67c3e3f8698fd"} Apr 16 21:06:00.178494 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:00.178474 2579 generic.go:358] "Generic (PLEG): container finished" podID="fff5c780-0c3c-4f84-ab04-1411873d94a8" containerID="9e9299c71e99bd61cdd8a8443e1e7f60cc3fc48badeab0a08f8cd21acd2d30db" exitCode=0 Apr 16 21:06:00.178580 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:00.178503 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" event={"ID":"fff5c780-0c3c-4f84-ab04-1411873d94a8","Type":"ContainerDied","Data":"9e9299c71e99bd61cdd8a8443e1e7f60cc3fc48badeab0a08f8cd21acd2d30db"} Apr 16 21:06:01.183911 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.183820 2579 generic.go:358] "Generic (PLEG): container finished" podID="a9997b85-3b6f-4d24-a986-35661e550af2" containerID="305916fd54d9ac10d895a8426b59628f2b538e8ea025a33764cbc5b5d0345346" exitCode=0 Apr 16 21:06:01.184357 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.183956 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" event={"ID":"a9997b85-3b6f-4d24-a986-35661e550af2","Type":"ContainerDied","Data":"305916fd54d9ac10d895a8426b59628f2b538e8ea025a33764cbc5b5d0345346"} Apr 16 21:06:01.185904 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.185882 2579 generic.go:358] "Generic (PLEG): container finished" podID="928ad48f-db3c-450a-952c-5ab71bd07fd5" containerID="eef43e2c2e1bc80cbf6e4d8215876d62294720836f6923d261375f70d4852619" exitCode=0 Apr 16 21:06:01.186005 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.185968 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" event={"ID":"928ad48f-db3c-450a-952c-5ab71bd07fd5","Type":"ContainerDied","Data":"eef43e2c2e1bc80cbf6e4d8215876d62294720836f6923d261375f70d4852619"} Apr 16 21:06:01.187700 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.187680 2579 
generic.go:358] "Generic (PLEG): container finished" podID="cfd6eb27-4070-4d8b-a5f5-2785164f315d" containerID="70510756153e4d744649be9419338d32a5a975d9a9ac41d456d8c4955f39ca7b" exitCode=0 Apr 16 21:06:01.187795 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.187739 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" event={"ID":"cfd6eb27-4070-4d8b-a5f5-2785164f315d","Type":"ContainerDied","Data":"70510756153e4d744649be9419338d32a5a975d9a9ac41d456d8c4955f39ca7b"} Apr 16 21:06:01.318504 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.318475 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" Apr 16 21:06:01.481901 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.481878 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fff5c780-0c3c-4f84-ab04-1411873d94a8-util\") pod \"fff5c780-0c3c-4f84-ab04-1411873d94a8\" (UID: \"fff5c780-0c3c-4f84-ab04-1411873d94a8\") " Apr 16 21:06:01.482117 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.481916 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fff5c780-0c3c-4f84-ab04-1411873d94a8-bundle\") pod \"fff5c780-0c3c-4f84-ab04-1411873d94a8\" (UID: \"fff5c780-0c3c-4f84-ab04-1411873d94a8\") " Apr 16 21:06:01.482117 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.482022 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w27nq\" (UniqueName: \"kubernetes.io/projected/fff5c780-0c3c-4f84-ab04-1411873d94a8-kube-api-access-w27nq\") pod \"fff5c780-0c3c-4f84-ab04-1411873d94a8\" (UID: \"fff5c780-0c3c-4f84-ab04-1411873d94a8\") " Apr 16 21:06:01.482430 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.482408 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fff5c780-0c3c-4f84-ab04-1411873d94a8-bundle" (OuterVolumeSpecName: "bundle") pod "fff5c780-0c3c-4f84-ab04-1411873d94a8" (UID: "fff5c780-0c3c-4f84-ab04-1411873d94a8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:06:01.484153 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.484133 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff5c780-0c3c-4f84-ab04-1411873d94a8-kube-api-access-w27nq" (OuterVolumeSpecName: "kube-api-access-w27nq") pod "fff5c780-0c3c-4f84-ab04-1411873d94a8" (UID: "fff5c780-0c3c-4f84-ab04-1411873d94a8"). InnerVolumeSpecName "kube-api-access-w27nq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:06:01.487213 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.487178 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fff5c780-0c3c-4f84-ab04-1411873d94a8-util" (OuterVolumeSpecName: "util") pod "fff5c780-0c3c-4f84-ab04-1411873d94a8" (UID: "fff5c780-0c3c-4f84-ab04-1411873d94a8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:06:01.583379 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.583356 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fff5c780-0c3c-4f84-ab04-1411873d94a8-util\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:01.583379 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.583379 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fff5c780-0c3c-4f84-ab04-1411873d94a8-bundle\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:01.583531 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:01.583388 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w27nq\" (UniqueName: \"kubernetes.io/projected/fff5c780-0c3c-4f84-ab04-1411873d94a8-kube-api-access-w27nq\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:02.192927 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.192894 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" event={"ID":"fff5c780-0c3c-4f84-ab04-1411873d94a8","Type":"ContainerDied","Data":"ea7639b598d962ae74fd39518400b3e5419d13fd9ca18b1cda8c9eab85324d77"} Apr 16 21:06:02.192927 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.192932 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea7639b598d962ae74fd39518400b3e5419d13fd9ca18b1cda8c9eab85324d77" Apr 16 21:06:02.193352 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.192902 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2" Apr 16 21:06:02.194829 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.194801 2579 generic.go:358] "Generic (PLEG): container finished" podID="a9997b85-3b6f-4d24-a986-35661e550af2" containerID="0d1930662870876887eb0e8828838429164afff43252fa5b2c68545a850c8a35" exitCode=0 Apr 16 21:06:02.194971 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.194932 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" event={"ID":"a9997b85-3b6f-4d24-a986-35661e550af2","Type":"ContainerDied","Data":"0d1930662870876887eb0e8828838429164afff43252fa5b2c68545a850c8a35"} Apr 16 21:06:02.323870 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.323849 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" Apr 16 21:06:02.355350 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.355328 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" Apr 16 21:06:02.490155 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.490127 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhwqv\" (UniqueName: \"kubernetes.io/projected/928ad48f-db3c-450a-952c-5ab71bd07fd5-kube-api-access-bhwqv\") pod \"928ad48f-db3c-450a-952c-5ab71bd07fd5\" (UID: \"928ad48f-db3c-450a-952c-5ab71bd07fd5\") " Apr 16 21:06:02.490316 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.490162 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfd6eb27-4070-4d8b-a5f5-2785164f315d-util\") pod \"cfd6eb27-4070-4d8b-a5f5-2785164f315d\" (UID: \"cfd6eb27-4070-4d8b-a5f5-2785164f315d\") " Apr 16 21:06:02.490316 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.490200 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5bms\" (UniqueName: \"kubernetes.io/projected/cfd6eb27-4070-4d8b-a5f5-2785164f315d-kube-api-access-d5bms\") pod \"cfd6eb27-4070-4d8b-a5f5-2785164f315d\" (UID: \"cfd6eb27-4070-4d8b-a5f5-2785164f315d\") " Apr 16 21:06:02.490316 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.490232 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfd6eb27-4070-4d8b-a5f5-2785164f315d-bundle\") pod \"cfd6eb27-4070-4d8b-a5f5-2785164f315d\" (UID: \"cfd6eb27-4070-4d8b-a5f5-2785164f315d\") " Apr 16 21:06:02.490316 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.490281 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/928ad48f-db3c-450a-952c-5ab71bd07fd5-bundle\") pod \"928ad48f-db3c-450a-952c-5ab71bd07fd5\" (UID: \"928ad48f-db3c-450a-952c-5ab71bd07fd5\") " Apr 16 21:06:02.490524 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.490338 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/928ad48f-db3c-450a-952c-5ab71bd07fd5-util\") pod \"928ad48f-db3c-450a-952c-5ab71bd07fd5\" (UID: \"928ad48f-db3c-450a-952c-5ab71bd07fd5\") " Apr 16 21:06:02.490972 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.490932 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/928ad48f-db3c-450a-952c-5ab71bd07fd5-bundle" (OuterVolumeSpecName: "bundle") pod "928ad48f-db3c-450a-952c-5ab71bd07fd5" (UID: "928ad48f-db3c-450a-952c-5ab71bd07fd5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:06:02.490972 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.490961 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfd6eb27-4070-4d8b-a5f5-2785164f315d-bundle" (OuterVolumeSpecName: "bundle") pod "cfd6eb27-4070-4d8b-a5f5-2785164f315d" (UID: "cfd6eb27-4070-4d8b-a5f5-2785164f315d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:06:02.492743 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.492717 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/928ad48f-db3c-450a-952c-5ab71bd07fd5-kube-api-access-bhwqv" (OuterVolumeSpecName: "kube-api-access-bhwqv") pod "928ad48f-db3c-450a-952c-5ab71bd07fd5" (UID: "928ad48f-db3c-450a-952c-5ab71bd07fd5"). 
InnerVolumeSpecName "kube-api-access-bhwqv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:06:02.492743 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.492726 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfd6eb27-4070-4d8b-a5f5-2785164f315d-kube-api-access-d5bms" (OuterVolumeSpecName: "kube-api-access-d5bms") pod "cfd6eb27-4070-4d8b-a5f5-2785164f315d" (UID: "cfd6eb27-4070-4d8b-a5f5-2785164f315d"). InnerVolumeSpecName "kube-api-access-d5bms". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:06:02.496456 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.496432 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfd6eb27-4070-4d8b-a5f5-2785164f315d-util" (OuterVolumeSpecName: "util") pod "cfd6eb27-4070-4d8b-a5f5-2785164f315d" (UID: "cfd6eb27-4070-4d8b-a5f5-2785164f315d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:06:02.496576 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.496558 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/928ad48f-db3c-450a-952c-5ab71bd07fd5-util" (OuterVolumeSpecName: "util") pod "928ad48f-db3c-450a-952c-5ab71bd07fd5" (UID: "928ad48f-db3c-450a-952c-5ab71bd07fd5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:06:02.591743 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.591713 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/928ad48f-db3c-450a-952c-5ab71bd07fd5-util\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:02.591743 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.591738 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bhwqv\" (UniqueName: \"kubernetes.io/projected/928ad48f-db3c-450a-952c-5ab71bd07fd5-kube-api-access-bhwqv\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:02.591743 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.591747 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfd6eb27-4070-4d8b-a5f5-2785164f315d-util\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:02.591970 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.591759 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d5bms\" (UniqueName: \"kubernetes.io/projected/cfd6eb27-4070-4d8b-a5f5-2785164f315d-kube-api-access-d5bms\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:02.591970 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.591767 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfd6eb27-4070-4d8b-a5f5-2785164f315d-bundle\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:02.591970 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:02.591775 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/928ad48f-db3c-450a-952c-5ab71bd07fd5-bundle\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:03.208472 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.208431 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" 
event={"ID":"cfd6eb27-4070-4d8b-a5f5-2785164f315d","Type":"ContainerDied","Data":"149dc67ea7740309ca44cccbf470c39d964883e9b9839e9d39aea589330201e2"} Apr 16 21:06:03.208472 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.208471 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="149dc67ea7740309ca44cccbf470c39d964883e9b9839e9d39aea589330201e2" Apr 16 21:06:03.208877 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.208473 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n" Apr 16 21:06:03.210286 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.210250 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" event={"ID":"928ad48f-db3c-450a-952c-5ab71bd07fd5","Type":"ContainerDied","Data":"69d371cd903a6b1c73dcf567bfa551fddd5630877edb5f0ffb22fe48c03569d7"} Apr 16 21:06:03.210400 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.210290 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69d371cd903a6b1c73dcf567bfa551fddd5630877edb5f0ffb22fe48c03569d7" Apr 16 21:06:03.210443 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.210425 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt" Apr 16 21:06:03.333060 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.333038 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" Apr 16 21:06:03.497698 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.497670 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a9997b85-3b6f-4d24-a986-35661e550af2-util\") pod \"a9997b85-3b6f-4d24-a986-35661e550af2\" (UID: \"a9997b85-3b6f-4d24-a986-35661e550af2\") " Apr 16 21:06:03.497698 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.497700 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a9997b85-3b6f-4d24-a986-35661e550af2-bundle\") pod \"a9997b85-3b6f-4d24-a986-35661e550af2\" (UID: \"a9997b85-3b6f-4d24-a986-35661e550af2\") " Apr 16 21:06:03.497916 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.497754 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6fqn\" (UniqueName: \"kubernetes.io/projected/a9997b85-3b6f-4d24-a986-35661e550af2-kube-api-access-s6fqn\") pod \"a9997b85-3b6f-4d24-a986-35661e550af2\" (UID: \"a9997b85-3b6f-4d24-a986-35661e550af2\") " Apr 16 21:06:03.498365 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.498338 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9997b85-3b6f-4d24-a986-35661e550af2-bundle" (OuterVolumeSpecName: "bundle") pod "a9997b85-3b6f-4d24-a986-35661e550af2" (UID: "a9997b85-3b6f-4d24-a986-35661e550af2"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:06:03.500128 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.500104 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9997b85-3b6f-4d24-a986-35661e550af2-kube-api-access-s6fqn" (OuterVolumeSpecName: "kube-api-access-s6fqn") pod "a9997b85-3b6f-4d24-a986-35661e550af2" (UID: "a9997b85-3b6f-4d24-a986-35661e550af2"). InnerVolumeSpecName "kube-api-access-s6fqn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:06:03.502544 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.502512 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9997b85-3b6f-4d24-a986-35661e550af2-util" (OuterVolumeSpecName: "util") pod "a9997b85-3b6f-4d24-a986-35661e550af2" (UID: "a9997b85-3b6f-4d24-a986-35661e550af2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:06:03.598875 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.598844 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a9997b85-3b6f-4d24-a986-35661e550af2-util\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:03.598875 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.598871 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a9997b85-3b6f-4d24-a986-35661e550af2-bundle\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:03.599078 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:03.598881 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s6fqn\" (UniqueName: \"kubernetes.io/projected/a9997b85-3b6f-4d24-a986-35661e550af2-kube-api-access-s6fqn\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:04.215475 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:04.215438 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" Apr 16 21:06:04.215856 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:04.215440 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp" event={"ID":"a9997b85-3b6f-4d24-a986-35661e550af2","Type":"ContainerDied","Data":"969bb310b32224ded28180786cb7c1ebf2515413180c06b7039427231b645720"} Apr 16 21:06:04.215856 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:04.215560 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="969bb310b32224ded28180786cb7c1ebf2515413180c06b7039427231b645720" Apr 16 21:06:13.178837 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.178806 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9stmf"] Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179149 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfd6eb27-4070-4d8b-a5f5-2785164f315d" containerName="pull" Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179162 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd6eb27-4070-4d8b-a5f5-2785164f315d" containerName="pull" Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179173 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fff5c780-0c3c-4f84-ab04-1411873d94a8" containerName="util" Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179179 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff5c780-0c3c-4f84-ab04-1411873d94a8" containerName="util" Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179187 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfd6eb27-4070-4d8b-a5f5-2785164f315d" containerName="extract" Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179193 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd6eb27-4070-4d8b-a5f5-2785164f315d" containerName="extract" Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179205 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="928ad48f-db3c-450a-952c-5ab71bd07fd5" containerName="pull" Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179212 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="928ad48f-db3c-450a-952c-5ab71bd07fd5" containerName="pull" Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179218 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfd6eb27-4070-4d8b-a5f5-2785164f315d" containerName="util" Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179223 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd6eb27-4070-4d8b-a5f5-2785164f315d" containerName="util" Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179228 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fff5c780-0c3c-4f84-ab04-1411873d94a8" containerName="pull" Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179233 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff5c780-0c3c-4f84-ab04-1411873d94a8" containerName="pull" Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 
21:06:13.179239 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9997b85-3b6f-4d24-a986-35661e550af2" containerName="util" Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179244 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9997b85-3b6f-4d24-a986-35661e550af2" containerName="util" Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179251 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9997b85-3b6f-4d24-a986-35661e550af2" containerName="extract" Apr 16 21:06:13.179248 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179256 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9997b85-3b6f-4d24-a986-35661e550af2" containerName="extract" Apr 16 21:06:13.179686 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179266 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="928ad48f-db3c-450a-952c-5ab71bd07fd5" containerName="util" Apr 16 21:06:13.179686 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179272 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="928ad48f-db3c-450a-952c-5ab71bd07fd5" containerName="util" Apr 16 21:06:13.179686 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179278 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fff5c780-0c3c-4f84-ab04-1411873d94a8" containerName="extract" Apr 16 21:06:13.179686 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179282 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff5c780-0c3c-4f84-ab04-1411873d94a8" containerName="extract" Apr 16 21:06:13.179686 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179290 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="928ad48f-db3c-450a-952c-5ab71bd07fd5" containerName="extract" Apr 16 21:06:13.179686 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179295 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="928ad48f-db3c-450a-952c-5ab71bd07fd5" containerName="extract" Apr 16 21:06:13.179686 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179300 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9997b85-3b6f-4d24-a986-35661e550af2" containerName="pull" Apr 16 21:06:13.179686 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179304 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9997b85-3b6f-4d24-a986-35661e550af2" containerName="pull" Apr 16 21:06:13.179686 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179353 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="fff5c780-0c3c-4f84-ab04-1411873d94a8" containerName="extract" Apr 16 21:06:13.179686 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179361 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9997b85-3b6f-4d24-a986-35661e550af2" containerName="extract" Apr 16 21:06:13.179686 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179368 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfd6eb27-4070-4d8b-a5f5-2785164f315d" containerName="extract" Apr 16 21:06:13.179686 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.179377 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="928ad48f-db3c-450a-952c-5ab71bd07fd5" containerName="extract" Apr 16 21:06:13.183817 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.183799 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9stmf" Apr 16 21:06:13.187180 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.187154 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 21:06:13.187261 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.187245 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-mf2dx\"" Apr 16 21:06:13.195230 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.195208 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9stmf"] Apr 16 21:06:13.269029 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.268982 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7dmv\" (UniqueName: \"kubernetes.io/projected/3cccc6dd-4f27-4402-9422-626b6c4e0443-kube-api-access-w7dmv\") pod \"dns-operator-controller-manager-648d5c98bc-9stmf\" (UID: \"3cccc6dd-4f27-4402-9422-626b6c4e0443\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9stmf" Apr 16 21:06:13.369646 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.369605 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7dmv\" (UniqueName: \"kubernetes.io/projected/3cccc6dd-4f27-4402-9422-626b6c4e0443-kube-api-access-w7dmv\") pod \"dns-operator-controller-manager-648d5c98bc-9stmf\" (UID: \"3cccc6dd-4f27-4402-9422-626b6c4e0443\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9stmf" Apr 16 21:06:13.382537 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.382511 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7dmv\" (UniqueName: \"kubernetes.io/projected/3cccc6dd-4f27-4402-9422-626b6c4e0443-kube-api-access-w7dmv\") pod \"dns-operator-controller-manager-648d5c98bc-9stmf\" (UID: \"3cccc6dd-4f27-4402-9422-626b6c4e0443\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9stmf" Apr 16 21:06:13.495130 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.495103 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9stmf" Apr 16 21:06:13.656931 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:13.656898 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9stmf"] Apr 16 21:06:13.657504 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:06:13.657474 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cccc6dd_4f27_4402_9422_626b6c4e0443.slice/crio-ea3a9d1505936e59938fedeee7de3be0cdb4c909f135f0c7fe8e6dc79b2261d8 WatchSource:0}: Error finding container ea3a9d1505936e59938fedeee7de3be0cdb4c909f135f0c7fe8e6dc79b2261d8: Status 404 returned error can't find the container with id ea3a9d1505936e59938fedeee7de3be0cdb4c909f135f0c7fe8e6dc79b2261d8 Apr 16 21:06:14.253646 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:14.253614 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9stmf" event={"ID":"3cccc6dd-4f27-4402-9422-626b6c4e0443","Type":"ContainerStarted","Data":"ea3a9d1505936e59938fedeee7de3be0cdb4c909f135f0c7fe8e6dc79b2261d8"} Apr 16 21:06:16.157967 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.157944 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b444578d9-bfcww"] Apr 16 21:06:16.161254 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.161233 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.175364 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.175342 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b444578d9-bfcww"] Apr 16 21:06:16.195130 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.195100 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/609a24b6-a299-4512-85fa-7e821fa695db-console-config\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.195221 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.195131 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/609a24b6-a299-4512-85fa-7e821fa695db-trusted-ca-bundle\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.195221 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.195157 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/609a24b6-a299-4512-85fa-7e821fa695db-console-serving-cert\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.195290 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.195221 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/609a24b6-a299-4512-85fa-7e821fa695db-service-ca\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.195290 ip-10-0-139-17 kubenswrapper[2579]: 
I0416 21:06:16.195256 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c28kr\" (UniqueName: \"kubernetes.io/projected/609a24b6-a299-4512-85fa-7e821fa695db-kube-api-access-c28kr\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.195290 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.195281 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/609a24b6-a299-4512-85fa-7e821fa695db-console-oauth-config\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.195379 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.195311 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/609a24b6-a299-4512-85fa-7e821fa695db-oauth-serving-cert\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.262877 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.262842 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9stmf" event={"ID":"3cccc6dd-4f27-4402-9422-626b6c4e0443","Type":"ContainerStarted","Data":"dc02e43d3d36e8144ea41b741de1bd9274e3fddc30cd0d90cc3f39cea3f9d879"} Apr 16 21:06:16.263063 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.262926 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9stmf" Apr 16 21:06:16.282648 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.282604 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9stmf" podStartSLOduration=0.847242152 podStartE2EDuration="3.282591847s" podCreationTimestamp="2026-04-16 21:06:13 +0000 UTC" firstStartedPulling="2026-04-16 21:06:13.659606209 +0000 UTC m=+476.677917041" lastFinishedPulling="2026-04-16 21:06:16.094955897 +0000 UTC m=+479.113266736" observedRunningTime="2026-04-16 21:06:16.280961812 +0000 UTC m=+479.299272650" watchObservedRunningTime="2026-04-16 21:06:16.282591847 +0000 UTC m=+479.300902754" Apr 16 21:06:16.296013 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.295970 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/609a24b6-a299-4512-85fa-7e821fa695db-oauth-serving-cert\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.296102 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.296062 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/609a24b6-a299-4512-85fa-7e821fa695db-console-config\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.296102 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.296078 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/609a24b6-a299-4512-85fa-7e821fa695db-trusted-ca-bundle\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.296207 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.296102 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/609a24b6-a299-4512-85fa-7e821fa695db-console-serving-cert\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.296207 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.296126 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/609a24b6-a299-4512-85fa-7e821fa695db-service-ca\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.296207 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.296149 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c28kr\" (UniqueName: \"kubernetes.io/projected/609a24b6-a299-4512-85fa-7e821fa695db-kube-api-access-c28kr\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.296347 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.296312 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/609a24b6-a299-4512-85fa-7e821fa695db-console-oauth-config\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.296741 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.296708 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/609a24b6-a299-4512-85fa-7e821fa695db-oauth-serving-cert\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.296930 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.296905 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/609a24b6-a299-4512-85fa-7e821fa695db-service-ca\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.297119 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.297096 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/609a24b6-a299-4512-85fa-7e821fa695db-trusted-ca-bundle\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.297166 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.297142 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/609a24b6-a299-4512-85fa-7e821fa695db-console-config\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.298630 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.298610 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/609a24b6-a299-4512-85fa-7e821fa695db-console-oauth-config\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.298769 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.298751 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/609a24b6-a299-4512-85fa-7e821fa695db-console-serving-cert\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.303960 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.303940 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c28kr\" (UniqueName: \"kubernetes.io/projected/609a24b6-a299-4512-85fa-7e821fa695db-kube-api-access-c28kr\") pod \"console-b444578d9-bfcww\" (UID: \"609a24b6-a299-4512-85fa-7e821fa695db\") " pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.470078 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.470040 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:16.595240 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:16.595213 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b444578d9-bfcww"] Apr 16 21:06:16.596330 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:06:16.596308 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod609a24b6_a299_4512_85fa_7e821fa695db.slice/crio-a4ec6dfb189601e31935f8be4f03145fd2a60b7f2343860a78819e48f853ee8b WatchSource:0}: Error finding container a4ec6dfb189601e31935f8be4f03145fd2a60b7f2343860a78819e48f853ee8b: Status 404 returned error can't find the container with id a4ec6dfb189601e31935f8be4f03145fd2a60b7f2343860a78819e48f853ee8b Apr 16 21:06:17.268140 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:17.268098 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b444578d9-bfcww" event={"ID":"609a24b6-a299-4512-85fa-7e821fa695db","Type":"ContainerStarted","Data":"61f071a993f3043804401b86a1761f40074c230f60ab7f0cf9401fe6c41fe0bf"} Apr 16 21:06:17.268140 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:17.268140 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b444578d9-bfcww" event={"ID":"609a24b6-a299-4512-85fa-7e821fa695db","Type":"ContainerStarted","Data":"a4ec6dfb189601e31935f8be4f03145fd2a60b7f2343860a78819e48f853ee8b"} Apr 16 21:06:17.316009 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:17.315950 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b444578d9-bfcww" podStartSLOduration=1.315935156 podStartE2EDuration="1.315935156s" podCreationTimestamp="2026-04-16 21:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:06:17.31343208 +0000 UTC m=+480.331742926" watchObservedRunningTime="2026-04-16 21:06:17.315935156 +0000 UTC m=+480.334246004" Apr 16 21:06:17.319400 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:17.319381 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-z8xrw"] Apr 16 
21:06:17.322768 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:17.322754 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-z8xrw" Apr 16 21:06:17.325698 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:17.325676 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-wndk6\"" Apr 16 21:06:17.337290 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:17.337269 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-z8xrw"] Apr 16 21:06:17.404538 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:17.404497 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp76g\" (UniqueName: \"kubernetes.io/projected/2657dcf5-7dc2-458d-aa1b-b2cbc0fb310e-kube-api-access-bp76g\") pod \"authorino-operator-657f44b778-z8xrw\" (UID: \"2657dcf5-7dc2-458d-aa1b-b2cbc0fb310e\") " pod="kuadrant-system/authorino-operator-657f44b778-z8xrw" Apr 16 21:06:17.505792 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:17.505761 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bp76g\" (UniqueName: \"kubernetes.io/projected/2657dcf5-7dc2-458d-aa1b-b2cbc0fb310e-kube-api-access-bp76g\") pod \"authorino-operator-657f44b778-z8xrw\" (UID: \"2657dcf5-7dc2-458d-aa1b-b2cbc0fb310e\") " pod="kuadrant-system/authorino-operator-657f44b778-z8xrw" Apr 16 21:06:17.521834 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:17.521779 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp76g\" (UniqueName: \"kubernetes.io/projected/2657dcf5-7dc2-458d-aa1b-b2cbc0fb310e-kube-api-access-bp76g\") pod \"authorino-operator-657f44b778-z8xrw\" (UID: \"2657dcf5-7dc2-458d-aa1b-b2cbc0fb310e\") " pod="kuadrant-system/authorino-operator-657f44b778-z8xrw" Apr 16 21:06:17.636443 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:17.636415 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-wndk6\"" Apr 16 21:06:17.643931 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:17.643904 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-z8xrw" Apr 16 21:06:17.771219 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:17.771196 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-z8xrw"] Apr 16 21:06:17.772342 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:06:17.772281 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2657dcf5_7dc2_458d_aa1b_b2cbc0fb310e.slice/crio-b9efa620156781b2ea2d01377b38618a755d635480c07f72f7579814434ea09e WatchSource:0}: Error finding container b9efa620156781b2ea2d01377b38618a755d635480c07f72f7579814434ea09e: Status 404 returned error can't find the container with id b9efa620156781b2ea2d01377b38618a755d635480c07f72f7579814434ea09e Apr 16 21:06:18.273515 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:18.273478 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-z8xrw" event={"ID":"2657dcf5-7dc2-458d-aa1b-b2cbc0fb310e","Type":"ContainerStarted","Data":"b9efa620156781b2ea2d01377b38618a755d635480c07f72f7579814434ea09e"} Apr 16 21:06:20.285818 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:20.285778 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-z8xrw" event={"ID":"2657dcf5-7dc2-458d-aa1b-b2cbc0fb310e","Type":"ContainerStarted","Data":"c0bf9aa43ea818ac33781f4e0a22f901582a4725259d47d59a442e6e78545157"} Apr 16 21:06:20.286234 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:20.285847 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-z8xrw" Apr 16 21:06:20.313914 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:20.313833 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-z8xrw" podStartSLOduration=1.04302122 podStartE2EDuration="3.313819901s" podCreationTimestamp="2026-04-16 21:06:17 +0000 UTC" firstStartedPulling="2026-04-16 21:06:17.774543552 +0000 UTC m=+480.792854395" lastFinishedPulling="2026-04-16 21:06:20.045342249 +0000 UTC m=+483.063653076" observedRunningTime="2026-04-16 21:06:20.311905928 +0000 UTC m=+483.330216776" watchObservedRunningTime="2026-04-16 21:06:20.313819901 +0000 UTC m=+483.332130747" Apr 16 21:06:25.042877 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:25.042841 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt"] Apr 16 21:06:25.046556 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:25.046540 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" Apr 16 21:06:25.049583 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:25.049563 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-x7c9h\"" Apr 16 21:06:25.060433 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:25.060412 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt"] Apr 16 21:06:25.076891 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:25.076871 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmvm5\" (UniqueName: \"kubernetes.io/projected/0c2a0c3d-c9cc-4270-b35f-0eff161d484e-kube-api-access-gmvm5\") pod \"limitador-operator-controller-manager-85c4996f8c-xmvxt\" (UID: \"0c2a0c3d-c9cc-4270-b35f-0eff161d484e\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" Apr 16 21:06:25.177983 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:25.177942 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmvm5\" (UniqueName: \"kubernetes.io/projected/0c2a0c3d-c9cc-4270-b35f-0eff161d484e-kube-api-access-gmvm5\") pod \"limitador-operator-controller-manager-85c4996f8c-xmvxt\" (UID: \"0c2a0c3d-c9cc-4270-b35f-0eff161d484e\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" Apr 16 21:06:25.191602 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:25.191579 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmvm5\" (UniqueName: \"kubernetes.io/projected/0c2a0c3d-c9cc-4270-b35f-0eff161d484e-kube-api-access-gmvm5\") pod \"limitador-operator-controller-manager-85c4996f8c-xmvxt\" (UID: \"0c2a0c3d-c9cc-4270-b35f-0eff161d484e\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" Apr 16 21:06:25.356300 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:25.356219 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" Apr 16 21:06:25.481309 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:25.481284 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt"] Apr 16 21:06:25.482970 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:06:25.482944 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c2a0c3d_c9cc_4270_b35f_0eff161d484e.slice/crio-aae7966925d8bd82e2a537c3117bedfb5ede4f4b7480b4780347b41da94f09d9 WatchSource:0}: Error finding container aae7966925d8bd82e2a537c3117bedfb5ede4f4b7480b4780347b41da94f09d9: Status 404 returned error can't find the container with id aae7966925d8bd82e2a537c3117bedfb5ede4f4b7480b4780347b41da94f09d9 Apr 16 21:06:26.311114 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:26.311036 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" event={"ID":"0c2a0c3d-c9cc-4270-b35f-0eff161d484e","Type":"ContainerStarted","Data":"aae7966925d8bd82e2a537c3117bedfb5ede4f4b7480b4780347b41da94f09d9"} Apr 16 21:06:26.470293 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:26.470250 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:26.470463 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:26.470330 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:26.474828 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:26.474808 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:27.270671 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:27.270650 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9stmf" Apr 16 21:06:27.316625 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:27.316593 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" event={"ID":"0c2a0c3d-c9cc-4270-b35f-0eff161d484e","Type":"ContainerStarted","Data":"22f6ccb08a82f32e7177c15d18b3b7961c00fcca9e8fefa5f07ff1bc34a4f1c2"} Apr 16 21:06:27.320781 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:27.320756 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b444578d9-bfcww" Apr 16 21:06:27.338075 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:27.338023 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" podStartSLOduration=0.604568535 podStartE2EDuration="2.337974092s" podCreationTimestamp="2026-04-16 21:06:25 +0000 UTC" firstStartedPulling="2026-04-16 21:06:25.484864216 +0000 UTC m=+488.503175040" lastFinishedPulling="2026-04-16 21:06:27.218269766 +0000 UTC m=+490.236580597" observedRunningTime="2026-04-16 21:06:27.337231511 +0000 UTC m=+490.355542358" watchObservedRunningTime="2026-04-16 21:06:27.337974092 +0000 UTC m=+490.356284940" Apr 16 21:06:27.388773 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:27.388740 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68447f845-77mw9"] Apr 16 21:06:28.320760 ip-10-0-139-17 
kubenswrapper[2579]: I0416 21:06:28.320729 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" Apr 16 21:06:31.291419 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:31.291391 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-z8xrw" Apr 16 21:06:39.326515 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:39.326483 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" Apr 16 21:06:41.978824 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:41.978790 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt"] Apr 16 21:06:41.979249 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:41.979019 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" podUID="0c2a0c3d-c9cc-4270-b35f-0eff161d484e" containerName="manager" containerID="cri-o://22f6ccb08a82f32e7177c15d18b3b7961c00fcca9e8fefa5f07ff1bc34a4f1c2" gracePeriod=2 Apr 16 21:06:42.011510 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.011483 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt"] Apr 16 21:06:42.021302 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.021281 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sv756"] Apr 16 21:06:42.021626 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.021613 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c2a0c3d-c9cc-4270-b35f-0eff161d484e" containerName="manager" Apr 16 21:06:42.021691 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.021627 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2a0c3d-c9cc-4270-b35f-0eff161d484e" containerName="manager" Apr 16 21:06:42.021745 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.021693 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c2a0c3d-c9cc-4270-b35f-0eff161d484e" containerName="manager" Apr 16 21:06:42.024838 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.024817 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sv756" Apr 16 21:06:42.038187 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.038160 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sv756"] Apr 16 21:06:42.126068 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.126036 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqktp\" (UniqueName: \"kubernetes.io/projected/c62edaed-5da9-40d8-b721-37d1746992e5-kube-api-access-hqktp\") pod \"limitador-operator-controller-manager-85c4996f8c-sv756\" (UID: \"c62edaed-5da9-40d8-b721-37d1746992e5\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sv756" Apr 16 21:06:42.211611 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.211586 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" Apr 16 21:06:42.215385 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.215350 2579 status_manager.go:895] "Failed to get status for pod" podUID="0c2a0c3d-c9cc-4270-b35f-0eff161d484e" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" err="pods \"limitador-operator-controller-manager-85c4996f8c-xmvxt\" is forbidden: User \"system:node:ip-10-0-139-17.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-17.ec2.internal' and this object" Apr 16 21:06:42.226905 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.226882 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqktp\" (UniqueName: \"kubernetes.io/projected/c62edaed-5da9-40d8-b721-37d1746992e5-kube-api-access-hqktp\") pod \"limitador-operator-controller-manager-85c4996f8c-sv756\" (UID: \"c62edaed-5da9-40d8-b721-37d1746992e5\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sv756" Apr 16 21:06:42.239899 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.239838 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqktp\" (UniqueName: \"kubernetes.io/projected/c62edaed-5da9-40d8-b721-37d1746992e5-kube-api-access-hqktp\") pod \"limitador-operator-controller-manager-85c4996f8c-sv756\" (UID: \"c62edaed-5da9-40d8-b721-37d1746992e5\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sv756" Apr 16 21:06:42.327631 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.327599 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmvm5\" (UniqueName: \"kubernetes.io/projected/0c2a0c3d-c9cc-4270-b35f-0eff161d484e-kube-api-access-gmvm5\") pod \"0c2a0c3d-c9cc-4270-b35f-0eff161d484e\" (UID: \"0c2a0c3d-c9cc-4270-b35f-0eff161d484e\") " Apr 16 21:06:42.329900 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.329869 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c2a0c3d-c9cc-4270-b35f-0eff161d484e-kube-api-access-gmvm5" (OuterVolumeSpecName: "kube-api-access-gmvm5") pod "0c2a0c3d-c9cc-4270-b35f-0eff161d484e" (UID: "0c2a0c3d-c9cc-4270-b35f-0eff161d484e"). InnerVolumeSpecName "kube-api-access-gmvm5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:06:42.375448 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.375418 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sv756" Apr 16 21:06:42.381026 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.380973 2579 generic.go:358] "Generic (PLEG): container finished" podID="0c2a0c3d-c9cc-4270-b35f-0eff161d484e" containerID="22f6ccb08a82f32e7177c15d18b3b7961c00fcca9e8fefa5f07ff1bc34a4f1c2" exitCode=0 Apr 16 21:06:42.381146 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.381047 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" Apr 16 21:06:42.381146 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.381081 2579 scope.go:117] "RemoveContainer" containerID="22f6ccb08a82f32e7177c15d18b3b7961c00fcca9e8fefa5f07ff1bc34a4f1c2" Apr 16 21:06:42.383744 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.383718 2579 status_manager.go:895] "Failed to get status for pod" podUID="0c2a0c3d-c9cc-4270-b35f-0eff161d484e" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" err="pods \"limitador-operator-controller-manager-85c4996f8c-xmvxt\" is forbidden: User \"system:node:ip-10-0-139-17.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-17.ec2.internal' and this object" Apr 16 21:06:42.390954 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.390924 2579 scope.go:117] "RemoveContainer" containerID="22f6ccb08a82f32e7177c15d18b3b7961c00fcca9e8fefa5f07ff1bc34a4f1c2" Apr 16 21:06:42.391268 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:06:42.391246 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22f6ccb08a82f32e7177c15d18b3b7961c00fcca9e8fefa5f07ff1bc34a4f1c2\": container with ID starting with 22f6ccb08a82f32e7177c15d18b3b7961c00fcca9e8fefa5f07ff1bc34a4f1c2 not found: ID does not exist" containerID="22f6ccb08a82f32e7177c15d18b3b7961c00fcca9e8fefa5f07ff1bc34a4f1c2" Apr 16 21:06:42.391358 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.391274 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22f6ccb08a82f32e7177c15d18b3b7961c00fcca9e8fefa5f07ff1bc34a4f1c2"} err="failed to get container status \"22f6ccb08a82f32e7177c15d18b3b7961c00fcca9e8fefa5f07ff1bc34a4f1c2\": rpc error: code = NotFound desc = could not find container \"22f6ccb08a82f32e7177c15d18b3b7961c00fcca9e8fefa5f07ff1bc34a4f1c2\": container with ID starting with 22f6ccb08a82f32e7177c15d18b3b7961c00fcca9e8fefa5f07ff1bc34a4f1c2 not found: ID does not exist" Apr 16 21:06:42.391898 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.391879 2579 status_manager.go:895] "Failed to get status for pod" podUID="0c2a0c3d-c9cc-4270-b35f-0eff161d484e" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" err="pods \"limitador-operator-controller-manager-85c4996f8c-xmvxt\" is forbidden: User \"system:node:ip-10-0-139-17.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-17.ec2.internal' and this object" Apr 16 21:06:42.428646 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.428611 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gmvm5\" (UniqueName: \"kubernetes.io/projected/0c2a0c3d-c9cc-4270-b35f-0eff161d484e-kube-api-access-gmvm5\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:42.514458 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:42.514429 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sv756"] Apr 16 21:06:42.515178 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:06:42.515148 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc62edaed_5da9_40d8_b721_37d1746992e5.slice/crio-ef0be38e6130fe02e1bc459631ea1b2e2bd333d9347aeb6ac725f949a647c4cf WatchSource:0}: Error finding container ef0be38e6130fe02e1bc459631ea1b2e2bd333d9347aeb6ac725f949a647c4cf: Status 404 returned error can't find the container with id ef0be38e6130fe02e1bc459631ea1b2e2bd333d9347aeb6ac725f949a647c4cf Apr 16 21:06:43.387805 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:43.387766 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sv756" event={"ID":"c62edaed-5da9-40d8-b721-37d1746992e5","Type":"ContainerStarted","Data":"9ba8d9b9d05b904370f241e328298476413b64dcb9dfa48a30e08f7bf519aecc"} Apr 16 21:06:43.387805 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:43.387807 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sv756" event={"ID":"c62edaed-5da9-40d8-b721-37d1746992e5","Type":"ContainerStarted","Data":"ef0be38e6130fe02e1bc459631ea1b2e2bd333d9347aeb6ac725f949a647c4cf"} Apr 16 21:06:43.388310 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:43.387888 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sv756" Apr 16 21:06:43.390484 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:43.390448 2579 status_manager.go:895] "Failed to get status for pod" podUID="0c2a0c3d-c9cc-4270-b35f-0eff161d484e" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xmvxt" err="pods \"limitador-operator-controller-manager-85c4996f8c-xmvxt\" is forbidden: User \"system:node:ip-10-0-139-17.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-17.ec2.internal' and this object" Apr 16 21:06:43.416456 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:43.416405 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sv756" podStartSLOduration=2.4163939389999998 podStartE2EDuration="2.416393939s" podCreationTimestamp="2026-04-16 21:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:06:43.414712913 +0000 UTC m=+506.433023760" watchObservedRunningTime="2026-04-16 21:06:43.416393939 +0000 UTC m=+506.434704785" Apr 16 21:06:43.568097 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:43.568060 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c2a0c3d-c9cc-4270-b35f-0eff161d484e" path="/var/lib/kubelet/pods/0c2a0c3d-c9cc-4270-b35f-0eff161d484e/volumes" Apr 16 21:06:52.411073 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.411032 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68447f845-77mw9" podUID="6c59c9ca-1833-4c75-8988-ce2a2827e44b" containerName="console" containerID="cri-o://cd11d2c2b08a622ef23c2ed01c7e2d1533bd8cd13754a41b3c1dd965e1c89166" gracePeriod=15 Apr 16 21:06:52.664140 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.664087 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68447f845-77mw9_6c59c9ca-1833-4c75-8988-ce2a2827e44b/console/0.log" Apr 16 21:06:52.664240 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.664149 2579 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68447f845-77mw9" Apr 16 21:06:52.822629 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.822598 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-config\") pod \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " Apr 16 21:06:52.822812 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.822645 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-trusted-ca-bundle\") pod \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " Apr 16 21:06:52.822812 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.822703 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-oauth-config\") pod \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " Apr 16 21:06:52.822812 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.822732 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76ldg\" (UniqueName: \"kubernetes.io/projected/6c59c9ca-1833-4c75-8988-ce2a2827e44b-kube-api-access-76ldg\") pod \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " Apr 16 21:06:52.822812 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.822750 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-service-ca\") pod \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " Apr 16 21:06:52.822812 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.822769 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-serving-cert\") pod \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " Apr 16 21:06:52.822812 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.822804 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-oauth-serving-cert\") pod \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\" (UID: \"6c59c9ca-1833-4c75-8988-ce2a2827e44b\") " Apr 16 21:06:52.823186 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.823039 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-config" (OuterVolumeSpecName: "console-config") pod "6c59c9ca-1833-4c75-8988-ce2a2827e44b" (UID: "6c59c9ca-1833-4c75-8988-ce2a2827e44b"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:06:52.823243 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.823191 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-service-ca" (OuterVolumeSpecName: "service-ca") pod "6c59c9ca-1833-4c75-8988-ce2a2827e44b" (UID: "6c59c9ca-1833-4c75-8988-ce2a2827e44b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:06:52.823243 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.823235 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6c59c9ca-1833-4c75-8988-ce2a2827e44b" (UID: "6c59c9ca-1833-4c75-8988-ce2a2827e44b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:06:52.823439 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.823408 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6c59c9ca-1833-4c75-8988-ce2a2827e44b" (UID: "6c59c9ca-1833-4c75-8988-ce2a2827e44b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:06:52.825319 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.825294 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c59c9ca-1833-4c75-8988-ce2a2827e44b-kube-api-access-76ldg" (OuterVolumeSpecName: "kube-api-access-76ldg") pod "6c59c9ca-1833-4c75-8988-ce2a2827e44b" (UID: "6c59c9ca-1833-4c75-8988-ce2a2827e44b"). InnerVolumeSpecName "kube-api-access-76ldg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:06:52.825319 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.825293 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6c59c9ca-1833-4c75-8988-ce2a2827e44b" (UID: "6c59c9ca-1833-4c75-8988-ce2a2827e44b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 21:06:52.825423 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.825337 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6c59c9ca-1833-4c75-8988-ce2a2827e44b" (UID: "6c59c9ca-1833-4c75-8988-ce2a2827e44b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 21:06:52.924190 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.924103 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-config\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:52.924190 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.924132 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-trusted-ca-bundle\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:52.924190 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.924143 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-oauth-config\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:52.924190 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.924153 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-76ldg\" (UniqueName: \"kubernetes.io/projected/6c59c9ca-1833-4c75-8988-ce2a2827e44b-kube-api-access-76ldg\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:52.924190 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.924162 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-service-ca\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:52.924190 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.924170 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c59c9ca-1833-4c75-8988-ce2a2827e44b-console-serving-cert\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:52.924190 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:52.924181 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c59c9ca-1833-4c75-8988-ce2a2827e44b-oauth-serving-cert\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:06:53.428057 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:53.428027 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68447f845-77mw9_6c59c9ca-1833-4c75-8988-ce2a2827e44b/console/0.log" Apr 16 21:06:53.428451 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:53.428072 2579 generic.go:358] "Generic (PLEG): container finished" podID="6c59c9ca-1833-4c75-8988-ce2a2827e44b" containerID="cd11d2c2b08a622ef23c2ed01c7e2d1533bd8cd13754a41b3c1dd965e1c89166" exitCode=2 Apr 16 21:06:53.428451 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:53.428111 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68447f845-77mw9" event={"ID":"6c59c9ca-1833-4c75-8988-ce2a2827e44b","Type":"ContainerDied","Data":"cd11d2c2b08a622ef23c2ed01c7e2d1533bd8cd13754a41b3c1dd965e1c89166"} Apr 16 21:06:53.428451 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:53.428148 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68447f845-77mw9" Apr 16 21:06:53.428451 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:53.428158 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68447f845-77mw9" event={"ID":"6c59c9ca-1833-4c75-8988-ce2a2827e44b","Type":"ContainerDied","Data":"45d971d245ba5ecc03a6a070b56a4621d8bdd15c72d7483df0e6bf12910500b4"} Apr 16 21:06:53.428451 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:53.428174 2579 scope.go:117] "RemoveContainer" containerID="cd11d2c2b08a622ef23c2ed01c7e2d1533bd8cd13754a41b3c1dd965e1c89166" Apr 16 21:06:53.437485 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:53.437467 2579 scope.go:117] "RemoveContainer" containerID="cd11d2c2b08a622ef23c2ed01c7e2d1533bd8cd13754a41b3c1dd965e1c89166" Apr 16 21:06:53.437727 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:06:53.437709 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd11d2c2b08a622ef23c2ed01c7e2d1533bd8cd13754a41b3c1dd965e1c89166\": container with ID starting with cd11d2c2b08a622ef23c2ed01c7e2d1533bd8cd13754a41b3c1dd965e1c89166 not found: ID does not exist" containerID="cd11d2c2b08a622ef23c2ed01c7e2d1533bd8cd13754a41b3c1dd965e1c89166" Apr 16 21:06:53.437772 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:53.437738 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd11d2c2b08a622ef23c2ed01c7e2d1533bd8cd13754a41b3c1dd965e1c89166"} err="failed to get container status \"cd11d2c2b08a622ef23c2ed01c7e2d1533bd8cd13754a41b3c1dd965e1c89166\": rpc error: code = NotFound desc = could not find container \"cd11d2c2b08a622ef23c2ed01c7e2d1533bd8cd13754a41b3c1dd965e1c89166\": container with ID starting with cd11d2c2b08a622ef23c2ed01c7e2d1533bd8cd13754a41b3c1dd965e1c89166 not found: ID does not exist" Apr 16 21:06:53.453113 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:53.453085 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68447f845-77mw9"] Apr 16 21:06:53.458905 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:53.458882 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68447f845-77mw9"] Apr 16 21:06:53.567127 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:53.567097 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c59c9ca-1833-4c75-8988-ce2a2827e44b" path="/var/lib/kubelet/pods/6c59c9ca-1833-4c75-8988-ce2a2827e44b/volumes" Apr 16 21:06:54.393916 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:06:54.393886 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sv756" Apr 16 21:07:10.674299 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.674085 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9"] Apr 16 21:07:10.674975 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.674952 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c59c9ca-1833-4c75-8988-ce2a2827e44b" containerName="console" Apr 16 21:07:10.675087 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.674979 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c59c9ca-1833-4c75-8988-ce2a2827e44b" containerName="console" Apr 16 21:07:10.675409 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.675393 2579 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="6c59c9ca-1833-4c75-8988-ce2a2827e44b" containerName="console" Apr 16 21:07:10.680641 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.680614 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.683599 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.683575 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-c2v8p\"" Apr 16 21:07:10.692249 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.692202 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9"] Apr 16 21:07:10.780694 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.780653 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/53e43b49-7372-442d-99ef-4171fe78701b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.780889 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.780705 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/53e43b49-7372-442d-99ef-4171fe78701b-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.780889 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.780720 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdxq\" (UniqueName: \"kubernetes.io/projected/53e43b49-7372-442d-99ef-4171fe78701b-kube-api-access-9zdxq\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.780889 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.780739 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/53e43b49-7372-442d-99ef-4171fe78701b-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.780889 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.780762 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/53e43b49-7372-442d-99ef-4171fe78701b-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.780889 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.780798 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/53e43b49-7372-442d-99ef-4171fe78701b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" 
(UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.780889 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.780843 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/53e43b49-7372-442d-99ef-4171fe78701b-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.780889 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.780877 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/53e43b49-7372-442d-99ef-4171fe78701b-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.781172 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.780901 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/53e43b49-7372-442d-99ef-4171fe78701b-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.881490 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.881452 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/53e43b49-7372-442d-99ef-4171fe78701b-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.881660 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.881510 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/53e43b49-7372-442d-99ef-4171fe78701b-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.881660 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.881536 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/53e43b49-7372-442d-99ef-4171fe78701b-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.881660 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.881568 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/53e43b49-7372-442d-99ef-4171fe78701b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.881660 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.881611 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/53e43b49-7372-442d-99ef-4171fe78701b-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.881660 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.881634 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdxq\" (UniqueName: \"kubernetes.io/projected/53e43b49-7372-442d-99ef-4171fe78701b-kube-api-access-9zdxq\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.881918 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.881661 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/53e43b49-7372-442d-99ef-4171fe78701b-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.881918 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.881701 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/53e43b49-7372-442d-99ef-4171fe78701b-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.881918 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.881724 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/53e43b49-7372-442d-99ef-4171fe78701b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.881918 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.881860 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/53e43b49-7372-442d-99ef-4171fe78701b-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.882229 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.882179 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/53e43b49-7372-442d-99ef-4171fe78701b-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.882308 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.882231 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/53e43b49-7372-442d-99ef-4171fe78701b-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.882472 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.882451 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/53e43b49-7372-442d-99ef-4171fe78701b-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.882510 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.882488 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/53e43b49-7372-442d-99ef-4171fe78701b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.883886 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.883867 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/53e43b49-7372-442d-99ef-4171fe78701b-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.884463 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.884446 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/53e43b49-7372-442d-99ef-4171fe78701b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.890194 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.890169 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/53e43b49-7372-442d-99ef-4171fe78701b-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.890628 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.890609 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdxq\" (UniqueName: \"kubernetes.io/projected/53e43b49-7372-442d-99ef-4171fe78701b-kube-api-access-9zdxq\") pod \"maas-default-gateway-openshift-default-58b6f876-dcmj9\" (UID: \"53e43b49-7372-442d-99ef-4171fe78701b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:10.996181 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:10.996159 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:11.127843 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:11.127817 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9"] Apr 16 21:07:11.129715 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:07:11.129675 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53e43b49_7372_442d_99ef_4171fe78701b.slice/crio-69d705d4ce5d56b17b6f577c1c8be135bb8a24e87d3a0d2377e1e8983f236bdc WatchSource:0}: Error finding container 69d705d4ce5d56b17b6f577c1c8be135bb8a24e87d3a0d2377e1e8983f236bdc: Status 404 returned error can't find the container with id 69d705d4ce5d56b17b6f577c1c8be135bb8a24e87d3a0d2377e1e8983f236bdc Apr 16 21:07:11.132095 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:11.132054 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 21:07:11.132171 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:11.132142 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 21:07:11.132209 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:11.132183 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 21:07:11.504660 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:11.504620 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" event={"ID":"53e43b49-7372-442d-99ef-4171fe78701b","Type":"ContainerStarted","Data":"b6c13a1885b461644b17b1af3b7400b7e1260258c4d40742608c6bf6f0eadfb6"} Apr 16 21:07:11.504660 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:11.504665 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" event={"ID":"53e43b49-7372-442d-99ef-4171fe78701b","Type":"ContainerStarted","Data":"69d705d4ce5d56b17b6f577c1c8be135bb8a24e87d3a0d2377e1e8983f236bdc"} Apr 16 21:07:11.526261 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:11.526210 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" podStartSLOduration=1.526196559 podStartE2EDuration="1.526196559s" podCreationTimestamp="2026-04-16 21:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:07:11.52379149 +0000 UTC m=+534.542102338" watchObservedRunningTime="2026-04-16 21:07:11.526196559 +0000 UTC m=+534.544507403" Apr 16 21:07:11.997057 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:11.997026 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:13.001602 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:13.001573 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:13.512543 ip-10-0-139-17 kubenswrapper[2579]: I0416 
21:07:13.512508 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:13.513356 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:13.513339 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-dcmj9" Apr 16 21:07:27.509665 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:27.509626 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-44z8j"] Apr 16 21:07:27.514611 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:27.514587 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-44z8j" Apr 16 21:07:27.517452 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:27.517423 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-vstnh\"" Apr 16 21:07:27.520425 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:27.520399 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-44z8j"] Apr 16 21:07:27.622532 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:27.622485 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97mth\" (UniqueName: \"kubernetes.io/projected/5a78c858-5228-4b42-8991-7d3016e92e69-kube-api-access-97mth\") pod \"authorino-f99f4b5cd-44z8j\" (UID: \"5a78c858-5228-4b42-8991-7d3016e92e69\") " pod="kuadrant-system/authorino-f99f4b5cd-44z8j" Apr 16 21:07:27.723264 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:27.723228 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97mth\" (UniqueName: \"kubernetes.io/projected/5a78c858-5228-4b42-8991-7d3016e92e69-kube-api-access-97mth\") pod \"authorino-f99f4b5cd-44z8j\" (UID: \"5a78c858-5228-4b42-8991-7d3016e92e69\") " pod="kuadrant-system/authorino-f99f4b5cd-44z8j" Apr 16 21:07:27.725959 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:27.725936 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-ppczz"] Apr 16 21:07:27.729306 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:27.729291 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ppczz" Apr 16 21:07:27.736633 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:27.736499 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-ppczz"] Apr 16 21:07:27.740720 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:27.740695 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97mth\" (UniqueName: \"kubernetes.io/projected/5a78c858-5228-4b42-8991-7d3016e92e69-kube-api-access-97mth\") pod \"authorino-f99f4b5cd-44z8j\" (UID: \"5a78c858-5228-4b42-8991-7d3016e92e69\") " pod="kuadrant-system/authorino-f99f4b5cd-44z8j" Apr 16 21:07:27.824070 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:27.823941 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjgbb\" (UniqueName: \"kubernetes.io/projected/0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a-kube-api-access-zjgbb\") pod \"authorino-7498df8756-ppczz\" (UID: \"0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a\") " pod="kuadrant-system/authorino-7498df8756-ppczz" Apr 16 21:07:27.825799 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:27.825780 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-44z8j" Apr 16 21:07:27.924957 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:27.924917 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjgbb\" (UniqueName: \"kubernetes.io/projected/0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a-kube-api-access-zjgbb\") pod \"authorino-7498df8756-ppczz\" (UID: \"0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a\") " pod="kuadrant-system/authorino-7498df8756-ppczz" Apr 16 21:07:27.934418 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:27.934391 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjgbb\" (UniqueName: \"kubernetes.io/projected/0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a-kube-api-access-zjgbb\") pod \"authorino-7498df8756-ppczz\" (UID: \"0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a\") " pod="kuadrant-system/authorino-7498df8756-ppczz" Apr 16 21:07:27.958056 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:27.958033 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-44z8j"] Apr 16 21:07:27.959979 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:07:27.959947 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a78c858_5228_4b42_8991_7d3016e92e69.slice/crio-31fbd6510d11ebf0b488998915b331d65a36b4eb9f21e704b6715116fb2e5d2c WatchSource:0}: Error finding container 31fbd6510d11ebf0b488998915b331d65a36b4eb9f21e704b6715116fb2e5d2c: Status 404 returned error can't find the container with id 31fbd6510d11ebf0b488998915b331d65a36b4eb9f21e704b6715116fb2e5d2c Apr 16 21:07:28.040019 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:28.039976 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ppczz" Apr 16 21:07:28.161665 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:28.161643 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-ppczz"] Apr 16 21:07:28.163390 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:07:28.163363 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cfd0c5a_3939_4eb3_aca0_6cc4fec0f55a.slice/crio-05b12b860e5558fc113e89decb9e1eb1607c3266dc2fa3df5688bea72997e468 WatchSource:0}: Error finding container 05b12b860e5558fc113e89decb9e1eb1607c3266dc2fa3df5688bea72997e468: Status 404 returned error can't find the container with id 05b12b860e5558fc113e89decb9e1eb1607c3266dc2fa3df5688bea72997e468 Apr 16 21:07:28.581923 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:28.581877 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-44z8j" event={"ID":"5a78c858-5228-4b42-8991-7d3016e92e69","Type":"ContainerStarted","Data":"31fbd6510d11ebf0b488998915b331d65a36b4eb9f21e704b6715116fb2e5d2c"} Apr 16 21:07:28.583806 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:28.583777 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ppczz" event={"ID":"0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a","Type":"ContainerStarted","Data":"05b12b860e5558fc113e89decb9e1eb1607c3266dc2fa3df5688bea72997e468"} Apr 16 21:07:30.597484 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:30.596688 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-44z8j" event={"ID":"5a78c858-5228-4b42-8991-7d3016e92e69","Type":"ContainerStarted","Data":"21d0511105181220ff0cca7c2f5c3049b0a93b67a9454ace044b5ac03fe3a880"} Apr 16 21:07:30.599460 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:30.599431 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ppczz" event={"ID":"0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a","Type":"ContainerStarted","Data":"e2a19811f1a07fa5916fd408ff07cf5d41a045d87071e179858789af2d17c0c4"} Apr 16 21:07:30.619101 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:30.619035 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-44z8j" podStartSLOduration=1.106260039 podStartE2EDuration="3.6190174s" podCreationTimestamp="2026-04-16 21:07:27 +0000 UTC" firstStartedPulling="2026-04-16 21:07:27.961179643 +0000 UTC m=+550.979490469" lastFinishedPulling="2026-04-16 21:07:30.473937002 +0000 UTC m=+553.492247830" observedRunningTime="2026-04-16 21:07:30.617340371 +0000 UTC m=+553.635651219" watchObservedRunningTime="2026-04-16 21:07:30.6190174 +0000 UTC m=+553.637328248" Apr 16 21:07:30.636072 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:30.635603 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-ppczz" podStartSLOduration=1.338300778 podStartE2EDuration="3.635582117s" podCreationTimestamp="2026-04-16 21:07:27 +0000 UTC" firstStartedPulling="2026-04-16 21:07:28.16469289 +0000 UTC m=+551.183003714" lastFinishedPulling="2026-04-16 21:07:30.461974226 +0000 UTC m=+553.480285053" observedRunningTime="2026-04-16 21:07:30.634474064 +0000 UTC m=+553.652784911" watchObservedRunningTime="2026-04-16 21:07:30.635582117 +0000 UTC m=+553.653892965" Apr 16 21:07:30.668608 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:30.668573 2579 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-44z8j"] Apr 16 21:07:32.607889 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:32.607849 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-44z8j" podUID="5a78c858-5228-4b42-8991-7d3016e92e69" containerName="authorino" containerID="cri-o://21d0511105181220ff0cca7c2f5c3049b0a93b67a9454ace044b5ac03fe3a880" gracePeriod=30 Apr 16 21:07:32.848051 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:32.848027 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-44z8j" Apr 16 21:07:32.969815 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:32.969784 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97mth\" (UniqueName: \"kubernetes.io/projected/5a78c858-5228-4b42-8991-7d3016e92e69-kube-api-access-97mth\") pod \"5a78c858-5228-4b42-8991-7d3016e92e69\" (UID: \"5a78c858-5228-4b42-8991-7d3016e92e69\") " Apr 16 21:07:32.972204 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:32.972170 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a78c858-5228-4b42-8991-7d3016e92e69-kube-api-access-97mth" (OuterVolumeSpecName: "kube-api-access-97mth") pod "5a78c858-5228-4b42-8991-7d3016e92e69" (UID: "5a78c858-5228-4b42-8991-7d3016e92e69"). InnerVolumeSpecName "kube-api-access-97mth". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:07:33.070514 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:33.070476 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-97mth\" (UniqueName: \"kubernetes.io/projected/5a78c858-5228-4b42-8991-7d3016e92e69-kube-api-access-97mth\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:07:33.612485 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:33.612448 2579 generic.go:358] "Generic (PLEG): container finished" podID="5a78c858-5228-4b42-8991-7d3016e92e69" containerID="21d0511105181220ff0cca7c2f5c3049b0a93b67a9454ace044b5ac03fe3a880" exitCode=0 Apr 16 21:07:33.612897 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:33.612505 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-44z8j" Apr 16 21:07:33.612897 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:33.612533 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-44z8j" event={"ID":"5a78c858-5228-4b42-8991-7d3016e92e69","Type":"ContainerDied","Data":"21d0511105181220ff0cca7c2f5c3049b0a93b67a9454ace044b5ac03fe3a880"} Apr 16 21:07:33.612897 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:33.612575 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-44z8j" event={"ID":"5a78c858-5228-4b42-8991-7d3016e92e69","Type":"ContainerDied","Data":"31fbd6510d11ebf0b488998915b331d65a36b4eb9f21e704b6715116fb2e5d2c"} Apr 16 21:07:33.612897 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:33.612590 2579 scope.go:117] "RemoveContainer" containerID="21d0511105181220ff0cca7c2f5c3049b0a93b67a9454ace044b5ac03fe3a880" Apr 16 21:07:33.621318 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:33.621295 2579 scope.go:117] "RemoveContainer" containerID="21d0511105181220ff0cca7c2f5c3049b0a93b67a9454ace044b5ac03fe3a880" Apr 16 21:07:33.621619 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:07:33.621601 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21d0511105181220ff0cca7c2f5c3049b0a93b67a9454ace044b5ac03fe3a880\": container with ID starting with 21d0511105181220ff0cca7c2f5c3049b0a93b67a9454ace044b5ac03fe3a880 not found: ID does not exist" containerID="21d0511105181220ff0cca7c2f5c3049b0a93b67a9454ace044b5ac03fe3a880" Apr 16 21:07:33.621687 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:33.621626 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21d0511105181220ff0cca7c2f5c3049b0a93b67a9454ace044b5ac03fe3a880"} err="failed to get container status \"21d0511105181220ff0cca7c2f5c3049b0a93b67a9454ace044b5ac03fe3a880\": rpc error: code = NotFound desc = could not find container \"21d0511105181220ff0cca7c2f5c3049b0a93b67a9454ace044b5ac03fe3a880\": container with ID starting with 21d0511105181220ff0cca7c2f5c3049b0a93b67a9454ace044b5ac03fe3a880 not found: ID does not exist" Apr 16 21:07:33.631004 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:33.630969 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-44z8j"] Apr 16 21:07:33.634639 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:33.634615 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-44z8j"] Apr 16 21:07:35.567884 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:07:35.567850 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a78c858-5228-4b42-8991-7d3016e92e69" path="/var/lib/kubelet/pods/5a78c858-5228-4b42-8991-7d3016e92e69/volumes" Apr 16 21:08:00.192637 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.192596 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-qt5tx"] Apr 16 21:08:00.193054 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.192956 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a78c858-5228-4b42-8991-7d3016e92e69" containerName="authorino" Apr 16 21:08:00.193054 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.192967 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a78c858-5228-4b42-8991-7d3016e92e69" containerName="authorino" Apr 16 21:08:00.193130 ip-10-0-139-17 kubenswrapper[2579]: I0416 
21:08:00.193062 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a78c858-5228-4b42-8991-7d3016e92e69" containerName="authorino" Apr 16 21:08:00.197464 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.197445 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-qt5tx" Apr 16 21:08:00.207037 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.207009 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-qt5tx"] Apr 16 21:08:00.216579 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.216552 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv767\" (UniqueName: \"kubernetes.io/projected/c75a165c-defc-466d-b323-1f93e3d0d9a8-kube-api-access-vv767\") pod \"authorino-8b475cf9f-qt5tx\" (UID: \"c75a165c-defc-466d-b323-1f93e3d0d9a8\") " pod="kuadrant-system/authorino-8b475cf9f-qt5tx" Apr 16 21:08:00.317426 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.317387 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vv767\" (UniqueName: \"kubernetes.io/projected/c75a165c-defc-466d-b323-1f93e3d0d9a8-kube-api-access-vv767\") pod \"authorino-8b475cf9f-qt5tx\" (UID: \"c75a165c-defc-466d-b323-1f93e3d0d9a8\") " pod="kuadrant-system/authorino-8b475cf9f-qt5tx" Apr 16 21:08:00.327346 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.327312 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv767\" (UniqueName: \"kubernetes.io/projected/c75a165c-defc-466d-b323-1f93e3d0d9a8-kube-api-access-vv767\") pod \"authorino-8b475cf9f-qt5tx\" (UID: \"c75a165c-defc-466d-b323-1f93e3d0d9a8\") " pod="kuadrant-system/authorino-8b475cf9f-qt5tx" Apr 16 21:08:00.421371 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.421338 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-qt5tx"] Apr 16 21:08:00.421587 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.421571 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-qt5tx" Apr 16 21:08:00.447358 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.447293 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-6dcdbc9cd4-94tz6"] Apr 16 21:08:00.453018 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.452897 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6dcdbc9cd4-94tz6" Apr 16 21:08:00.459142 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.459116 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6dcdbc9cd4-94tz6"] Apr 16 21:08:00.520376 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.520338 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5wck\" (UniqueName: \"kubernetes.io/projected/61ce2ed5-1933-4d5a-b0d8-7b9cec743214-kube-api-access-k5wck\") pod \"authorino-6dcdbc9cd4-94tz6\" (UID: \"61ce2ed5-1933-4d5a-b0d8-7b9cec743214\") " pod="kuadrant-system/authorino-6dcdbc9cd4-94tz6" Apr 16 21:08:00.560254 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.560211 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-qt5tx"] Apr 16 21:08:00.561261 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:08:00.561227 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc75a165c_defc_466d_b323_1f93e3d0d9a8.slice/crio-5d205814eeb2932f6c35350bb703f8b859fbeab934dd9b7066d3810c8cfdba07 WatchSource:0}: Error finding container 5d205814eeb2932f6c35350bb703f8b859fbeab934dd9b7066d3810c8cfdba07: Status 404 returned error can't find the container with id 5d205814eeb2932f6c35350bb703f8b859fbeab934dd9b7066d3810c8cfdba07 Apr 16 21:08:00.624754 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.624355 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5wck\" (UniqueName: \"kubernetes.io/projected/61ce2ed5-1933-4d5a-b0d8-7b9cec743214-kube-api-access-k5wck\") pod \"authorino-6dcdbc9cd4-94tz6\" (UID: \"61ce2ed5-1933-4d5a-b0d8-7b9cec743214\") " pod="kuadrant-system/authorino-6dcdbc9cd4-94tz6" Apr 16 21:08:00.635551 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.635514 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6dcdbc9cd4-94tz6"] Apr 16 21:08:00.635846 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:08:00.635819 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-k5wck], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-6dcdbc9cd4-94tz6" podUID="61ce2ed5-1933-4d5a-b0d8-7b9cec743214" Apr 16 21:08:00.644388 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.644361 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5wck\" (UniqueName: \"kubernetes.io/projected/61ce2ed5-1933-4d5a-b0d8-7b9cec743214-kube-api-access-k5wck\") pod \"authorino-6dcdbc9cd4-94tz6\" (UID: \"61ce2ed5-1933-4d5a-b0d8-7b9cec743214\") " pod="kuadrant-system/authorino-6dcdbc9cd4-94tz6" Apr 16 21:08:00.671014 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.670619 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5966cb86f6-n7t9z"] Apr 16 21:08:00.678576 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.675062 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5966cb86f6-n7t9z" Apr 16 21:08:00.678576 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.678354 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 21:08:00.680204 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.680177 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5966cb86f6-n7t9z"] Apr 16 21:08:00.721814 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.721785 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-qt5tx" event={"ID":"c75a165c-defc-466d-b323-1f93e3d0d9a8","Type":"ContainerStarted","Data":"5d205814eeb2932f6c35350bb703f8b859fbeab934dd9b7066d3810c8cfdba07"} Apr 16 21:08:00.721932 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.721796 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6dcdbc9cd4-94tz6" Apr 16 21:08:00.725385 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.725365 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nt6d\" (UniqueName: \"kubernetes.io/projected/f9c5c023-961a-4b4d-9ab8-e573f961293e-kube-api-access-5nt6d\") pod \"authorino-5966cb86f6-n7t9z\" (UID: \"f9c5c023-961a-4b4d-9ab8-e573f961293e\") " pod="kuadrant-system/authorino-5966cb86f6-n7t9z" Apr 16 21:08:00.725497 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.725449 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f9c5c023-961a-4b4d-9ab8-e573f961293e-tls-cert\") pod \"authorino-5966cb86f6-n7t9z\" (UID: \"f9c5c023-961a-4b4d-9ab8-e573f961293e\") " pod="kuadrant-system/authorino-5966cb86f6-n7t9z" Apr 16 21:08:00.725839 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.725817 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6dcdbc9cd4-94tz6" Apr 16 21:08:00.826416 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.826382 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5wck\" (UniqueName: \"kubernetes.io/projected/61ce2ed5-1933-4d5a-b0d8-7b9cec743214-kube-api-access-k5wck\") pod \"61ce2ed5-1933-4d5a-b0d8-7b9cec743214\" (UID: \"61ce2ed5-1933-4d5a-b0d8-7b9cec743214\") " Apr 16 21:08:00.826563 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.826477 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nt6d\" (UniqueName: \"kubernetes.io/projected/f9c5c023-961a-4b4d-9ab8-e573f961293e-kube-api-access-5nt6d\") pod \"authorino-5966cb86f6-n7t9z\" (UID: \"f9c5c023-961a-4b4d-9ab8-e573f961293e\") " pod="kuadrant-system/authorino-5966cb86f6-n7t9z" Apr 16 21:08:00.826563 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.826529 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f9c5c023-961a-4b4d-9ab8-e573f961293e-tls-cert\") pod \"authorino-5966cb86f6-n7t9z\" (UID: \"f9c5c023-961a-4b4d-9ab8-e573f961293e\") " pod="kuadrant-system/authorino-5966cb86f6-n7t9z" Apr 16 21:08:00.828683 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.828652 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61ce2ed5-1933-4d5a-b0d8-7b9cec743214-kube-api-access-k5wck" (OuterVolumeSpecName: "kube-api-access-k5wck") pod "61ce2ed5-1933-4d5a-b0d8-7b9cec743214" (UID: "61ce2ed5-1933-4d5a-b0d8-7b9cec743214"). InnerVolumeSpecName "kube-api-access-k5wck". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:08:00.829012 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.828977 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f9c5c023-961a-4b4d-9ab8-e573f961293e-tls-cert\") pod \"authorino-5966cb86f6-n7t9z\" (UID: \"f9c5c023-961a-4b4d-9ab8-e573f961293e\") " pod="kuadrant-system/authorino-5966cb86f6-n7t9z" Apr 16 21:08:00.835717 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.835698 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nt6d\" (UniqueName: \"kubernetes.io/projected/f9c5c023-961a-4b4d-9ab8-e573f961293e-kube-api-access-5nt6d\") pod \"authorino-5966cb86f6-n7t9z\" (UID: \"f9c5c023-961a-4b4d-9ab8-e573f961293e\") " pod="kuadrant-system/authorino-5966cb86f6-n7t9z" Apr 16 21:08:00.927070 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.927047 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k5wck\" (UniqueName: \"kubernetes.io/projected/61ce2ed5-1933-4d5a-b0d8-7b9cec743214-kube-api-access-k5wck\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:08:00.987725 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:00.987700 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5966cb86f6-n7t9z" Apr 16 21:08:01.141311 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:01.141282 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5966cb86f6-n7t9z"] Apr 16 21:08:01.141917 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:08:01.141892 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9c5c023_961a_4b4d_9ab8_e573f961293e.slice/crio-06e2f1af79624ccd086867d915fb06aa45f8f5468c39078a65de0910b6cb6cf5 WatchSource:0}: Error finding container 06e2f1af79624ccd086867d915fb06aa45f8f5468c39078a65de0910b6cb6cf5: Status 404 returned error can't find the container with id 06e2f1af79624ccd086867d915fb06aa45f8f5468c39078a65de0910b6cb6cf5 Apr 16 21:08:01.726796 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:01.726757 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-qt5tx" event={"ID":"c75a165c-defc-466d-b323-1f93e3d0d9a8","Type":"ContainerStarted","Data":"f7d92fe57c9f48e71173b193e4ca7ccdfbc373ea539649d59f9be7a559f10cd5"} Apr 16 21:08:01.727260 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:01.726815 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-qt5tx" podUID="c75a165c-defc-466d-b323-1f93e3d0d9a8" containerName="authorino" containerID="cri-o://f7d92fe57c9f48e71173b193e4ca7ccdfbc373ea539649d59f9be7a559f10cd5" gracePeriod=30 Apr 16 21:08:01.728384 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:01.728367 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6dcdbc9cd4-94tz6" Apr 16 21:08:01.728502 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:01.728392 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5966cb86f6-n7t9z" event={"ID":"f9c5c023-961a-4b4d-9ab8-e573f961293e","Type":"ContainerStarted","Data":"f2769b7da8c0cf9a2c41d134aae45e7621619aedec06be1fed50fc49f193dc12"} Apr 16 21:08:01.728502 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:01.728429 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5966cb86f6-n7t9z" event={"ID":"f9c5c023-961a-4b4d-9ab8-e573f961293e","Type":"ContainerStarted","Data":"06e2f1af79624ccd086867d915fb06aa45f8f5468c39078a65de0910b6cb6cf5"} Apr 16 21:08:01.742788 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:01.742742 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-qt5tx" podStartSLOduration=1.379144962 podStartE2EDuration="1.742725607s" podCreationTimestamp="2026-04-16 21:08:00 +0000 UTC" firstStartedPulling="2026-04-16 21:08:00.5625954 +0000 UTC m=+583.580906225" lastFinishedPulling="2026-04-16 21:08:00.926176044 +0000 UTC m=+583.944486870" observedRunningTime="2026-04-16 21:08:01.741957714 +0000 UTC m=+584.760268563" watchObservedRunningTime="2026-04-16 21:08:01.742725607 +0000 UTC m=+584.761036455" Apr 16 21:08:01.769602 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:01.769557 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6dcdbc9cd4-94tz6"] Apr 16 21:08:01.773900 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:01.773867 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-6dcdbc9cd4-94tz6"] Apr 16 21:08:01.788254 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:01.787961 2579 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kuadrant-system/authorino-5966cb86f6-n7t9z" podStartSLOduration=1.396794358 podStartE2EDuration="1.787945641s" podCreationTimestamp="2026-04-16 21:08:00 +0000 UTC" firstStartedPulling="2026-04-16 21:08:01.143283767 +0000 UTC m=+584.161594591" lastFinishedPulling="2026-04-16 21:08:01.534435048 +0000 UTC m=+584.552745874" observedRunningTime="2026-04-16 21:08:01.786886155 +0000 UTC m=+584.805197000" watchObservedRunningTime="2026-04-16 21:08:01.787945641 +0000 UTC m=+584.806256489" Apr 16 21:08:01.818653 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:01.818615 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-ppczz"] Apr 16 21:08:01.818839 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:01.818819 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-ppczz" podUID="0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a" containerName="authorino" containerID="cri-o://e2a19811f1a07fa5916fd408ff07cf5d41a045d87071e179858789af2d17c0c4" gracePeriod=30 Apr 16 21:08:02.105746 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.105721 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-qt5tx" Apr 16 21:08:02.114680 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.114660 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ppczz" Apr 16 21:08:02.140820 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.140796 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv767\" (UniqueName: \"kubernetes.io/projected/c75a165c-defc-466d-b323-1f93e3d0d9a8-kube-api-access-vv767\") pod \"c75a165c-defc-466d-b323-1f93e3d0d9a8\" (UID: \"c75a165c-defc-466d-b323-1f93e3d0d9a8\") " Apr 16 21:08:02.140969 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.140936 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjgbb\" (UniqueName: \"kubernetes.io/projected/0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a-kube-api-access-zjgbb\") pod \"0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a\" (UID: \"0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a\") " Apr 16 21:08:02.143088 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.143064 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c75a165c-defc-466d-b323-1f93e3d0d9a8-kube-api-access-vv767" (OuterVolumeSpecName: "kube-api-access-vv767") pod "c75a165c-defc-466d-b323-1f93e3d0d9a8" (UID: "c75a165c-defc-466d-b323-1f93e3d0d9a8"). InnerVolumeSpecName "kube-api-access-vv767". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:08:02.143190 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.143119 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a-kube-api-access-zjgbb" (OuterVolumeSpecName: "kube-api-access-zjgbb") pod "0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a" (UID: "0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a"). InnerVolumeSpecName "kube-api-access-zjgbb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:08:02.242365 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.242334 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vv767\" (UniqueName: \"kubernetes.io/projected/c75a165c-defc-466d-b323-1f93e3d0d9a8-kube-api-access-vv767\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:08:02.242365 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.242361 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zjgbb\" (UniqueName: \"kubernetes.io/projected/0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a-kube-api-access-zjgbb\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:08:02.732944 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.732912 2579 generic.go:358] "Generic (PLEG): container finished" podID="0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a" containerID="e2a19811f1a07fa5916fd408ff07cf5d41a045d87071e179858789af2d17c0c4" exitCode=0 Apr 16 21:08:02.733418 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.732957 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ppczz" Apr 16 21:08:02.733418 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.733008 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ppczz" event={"ID":"0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a","Type":"ContainerDied","Data":"e2a19811f1a07fa5916fd408ff07cf5d41a045d87071e179858789af2d17c0c4"} Apr 16 21:08:02.733418 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.733039 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ppczz" event={"ID":"0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a","Type":"ContainerDied","Data":"05b12b860e5558fc113e89decb9e1eb1607c3266dc2fa3df5688bea72997e468"} Apr 16 21:08:02.733418 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.733057 2579 scope.go:117] "RemoveContainer" containerID="e2a19811f1a07fa5916fd408ff07cf5d41a045d87071e179858789af2d17c0c4" Apr 16 21:08:02.734877 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.734372 2579 generic.go:358] "Generic (PLEG): container finished" podID="c75a165c-defc-466d-b323-1f93e3d0d9a8" containerID="f7d92fe57c9f48e71173b193e4ca7ccdfbc373ea539649d59f9be7a559f10cd5" exitCode=0 Apr 16 21:08:02.734877 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.734415 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-qt5tx" Apr 16 21:08:02.734877 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.734447 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-qt5tx" event={"ID":"c75a165c-defc-466d-b323-1f93e3d0d9a8","Type":"ContainerDied","Data":"f7d92fe57c9f48e71173b193e4ca7ccdfbc373ea539649d59f9be7a559f10cd5"} Apr 16 21:08:02.734877 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.734601 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-qt5tx" event={"ID":"c75a165c-defc-466d-b323-1f93e3d0d9a8","Type":"ContainerDied","Data":"5d205814eeb2932f6c35350bb703f8b859fbeab934dd9b7066d3810c8cfdba07"} Apr 16 21:08:02.742874 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.742853 2579 scope.go:117] "RemoveContainer" containerID="e2a19811f1a07fa5916fd408ff07cf5d41a045d87071e179858789af2d17c0c4" Apr 16 21:08:02.743200 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:08:02.743183 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a19811f1a07fa5916fd408ff07cf5d41a045d87071e179858789af2d17c0c4\": container with ID starting with e2a19811f1a07fa5916fd408ff07cf5d41a045d87071e179858789af2d17c0c4 not found: ID does not exist" containerID="e2a19811f1a07fa5916fd408ff07cf5d41a045d87071e179858789af2d17c0c4" Apr 16 21:08:02.743269 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.743207 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a19811f1a07fa5916fd408ff07cf5d41a045d87071e179858789af2d17c0c4"} err="failed to get container status \"e2a19811f1a07fa5916fd408ff07cf5d41a045d87071e179858789af2d17c0c4\": rpc error: code = NotFound desc = could not find container \"e2a19811f1a07fa5916fd408ff07cf5d41a045d87071e179858789af2d17c0c4\": container with ID starting with e2a19811f1a07fa5916fd408ff07cf5d41a045d87071e179858789af2d17c0c4 not found: ID does not exist" Apr 16 21:08:02.743269 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.743223 2579 scope.go:117] "RemoveContainer" containerID="f7d92fe57c9f48e71173b193e4ca7ccdfbc373ea539649d59f9be7a559f10cd5" Apr 16 21:08:02.763179 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.763142 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-ppczz"] Apr 16 21:08:02.765123 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.765096 2579 scope.go:117] "RemoveContainer" containerID="f7d92fe57c9f48e71173b193e4ca7ccdfbc373ea539649d59f9be7a559f10cd5" Apr 16 21:08:02.765450 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:08:02.765426 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7d92fe57c9f48e71173b193e4ca7ccdfbc373ea539649d59f9be7a559f10cd5\": container with ID starting with f7d92fe57c9f48e71173b193e4ca7ccdfbc373ea539649d59f9be7a559f10cd5 not found: ID does not exist" containerID="f7d92fe57c9f48e71173b193e4ca7ccdfbc373ea539649d59f9be7a559f10cd5" Apr 16 21:08:02.765520 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.765460 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7d92fe57c9f48e71173b193e4ca7ccdfbc373ea539649d59f9be7a559f10cd5"} err="failed to get container status \"f7d92fe57c9f48e71173b193e4ca7ccdfbc373ea539649d59f9be7a559f10cd5\": rpc error: code = NotFound desc = could not find container 
\"f7d92fe57c9f48e71173b193e4ca7ccdfbc373ea539649d59f9be7a559f10cd5\": container with ID starting with f7d92fe57c9f48e71173b193e4ca7ccdfbc373ea539649d59f9be7a559f10cd5 not found: ID does not exist" Apr 16 21:08:02.766420 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.766398 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-ppczz"] Apr 16 21:08:02.780264 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.780241 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-qt5tx"] Apr 16 21:08:02.783523 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:02.783500 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-qt5tx"] Apr 16 21:08:03.567056 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:03.567026 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a" path="/var/lib/kubelet/pods/0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a/volumes" Apr 16 21:08:03.567354 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:03.567342 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61ce2ed5-1933-4d5a-b0d8-7b9cec743214" path="/var/lib/kubelet/pods/61ce2ed5-1933-4d5a-b0d8-7b9cec743214/volumes" Apr 16 21:08:03.567537 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:03.567527 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c75a165c-defc-466d-b323-1f93e3d0d9a8" path="/var/lib/kubelet/pods/c75a165c-defc-466d-b323-1f93e3d0d9a8/volumes" Apr 16 21:08:17.494565 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:17.494529 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 21:08:17.496142 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:08:17.496123 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 21:10:08.177353 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.177275 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-6978ccf9df-fc6hf"] Apr 16 21:10:08.177741 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.177634 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c75a165c-defc-466d-b323-1f93e3d0d9a8" containerName="authorino" Apr 16 21:10:08.177741 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.177646 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75a165c-defc-466d-b323-1f93e3d0d9a8" containerName="authorino" Apr 16 21:10:08.177741 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.177657 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a" containerName="authorino" Apr 16 21:10:08.177741 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.177662 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a" containerName="authorino" Apr 16 21:10:08.177741 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.177719 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c75a165c-defc-466d-b323-1f93e3d0d9a8" containerName="authorino" Apr 16 21:10:08.177741 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.177728 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0cfd0c5a-3939-4eb3-aca0-6cc4fec0f55a" containerName="authorino" Apr 16 
21:10:08.179658 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.179640 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6978ccf9df-fc6hf" Apr 16 21:10:08.188292 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.188266 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6978ccf9df-fc6hf"] Apr 16 21:10:08.281772 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.281740 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcxpw\" (UniqueName: \"kubernetes.io/projected/d996976f-7c88-4d73-9c33-8c044a9463e3-kube-api-access-kcxpw\") pod \"authorino-6978ccf9df-fc6hf\" (UID: \"d996976f-7c88-4d73-9c33-8c044a9463e3\") " pod="kuadrant-system/authorino-6978ccf9df-fc6hf" Apr 16 21:10:08.281948 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.281811 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d996976f-7c88-4d73-9c33-8c044a9463e3-tls-cert\") pod \"authorino-6978ccf9df-fc6hf\" (UID: \"d996976f-7c88-4d73-9c33-8c044a9463e3\") " pod="kuadrant-system/authorino-6978ccf9df-fc6hf" Apr 16 21:10:08.383226 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.383191 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d996976f-7c88-4d73-9c33-8c044a9463e3-tls-cert\") pod \"authorino-6978ccf9df-fc6hf\" (UID: \"d996976f-7c88-4d73-9c33-8c044a9463e3\") " pod="kuadrant-system/authorino-6978ccf9df-fc6hf" Apr 16 21:10:08.383417 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.383258 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcxpw\" (UniqueName: \"kubernetes.io/projected/d996976f-7c88-4d73-9c33-8c044a9463e3-kube-api-access-kcxpw\") pod \"authorino-6978ccf9df-fc6hf\" (UID: \"d996976f-7c88-4d73-9c33-8c044a9463e3\") " pod="kuadrant-system/authorino-6978ccf9df-fc6hf" Apr 16 21:10:08.385824 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.385797 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d996976f-7c88-4d73-9c33-8c044a9463e3-tls-cert\") pod \"authorino-6978ccf9df-fc6hf\" (UID: \"d996976f-7c88-4d73-9c33-8c044a9463e3\") " pod="kuadrant-system/authorino-6978ccf9df-fc6hf" Apr 16 21:10:08.391627 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.391599 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcxpw\" (UniqueName: \"kubernetes.io/projected/d996976f-7c88-4d73-9c33-8c044a9463e3-kube-api-access-kcxpw\") pod \"authorino-6978ccf9df-fc6hf\" (UID: \"d996976f-7c88-4d73-9c33-8c044a9463e3\") " pod="kuadrant-system/authorino-6978ccf9df-fc6hf" Apr 16 21:10:08.489024 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.488974 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6978ccf9df-fc6hf" Apr 16 21:10:08.622814 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.622787 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6978ccf9df-fc6hf"] Apr 16 21:10:08.624407 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:10:08.624377 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd996976f_7c88_4d73_9c33_8c044a9463e3.slice/crio-6f1686f45c96b71e90a2b51c4c34d7cae638dbee511a955484ed6eee7fbaeff5 WatchSource:0}: Error finding container 6f1686f45c96b71e90a2b51c4c34d7cae638dbee511a955484ed6eee7fbaeff5: Status 404 returned error can't find the container with id 6f1686f45c96b71e90a2b51c4c34d7cae638dbee511a955484ed6eee7fbaeff5 Apr 16 21:10:08.626039 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:08.626022 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 21:10:09.226125 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:09.226072 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6978ccf9df-fc6hf" event={"ID":"d996976f-7c88-4d73-9c33-8c044a9463e3","Type":"ContainerStarted","Data":"afac8cf9f777bb27e160c20b34287c680b24b3796944f8e0ce3b68bd343e01c8"} Apr 16 21:10:09.226524 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:09.226137 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6978ccf9df-fc6hf" event={"ID":"d996976f-7c88-4d73-9c33-8c044a9463e3","Type":"ContainerStarted","Data":"6f1686f45c96b71e90a2b51c4c34d7cae638dbee511a955484ed6eee7fbaeff5"} Apr 16 21:10:09.272858 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:09.272752 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-6978ccf9df-fc6hf" podStartSLOduration=0.908948853 podStartE2EDuration="1.272735469s" podCreationTimestamp="2026-04-16 21:10:08 +0000 UTC" firstStartedPulling="2026-04-16 21:10:08.626138642 +0000 UTC m=+711.644449468" lastFinishedPulling="2026-04-16 21:10:08.989925248 +0000 UTC m=+712.008236084" observedRunningTime="2026-04-16 21:10:09.271749616 +0000 UTC m=+712.290060463" watchObservedRunningTime="2026-04-16 21:10:09.272735469 +0000 UTC m=+712.291046315" Apr 16 21:10:09.318258 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:09.318225 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5966cb86f6-n7t9z"] Apr 16 21:10:09.318488 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:09.318465 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-5966cb86f6-n7t9z" podUID="f9c5c023-961a-4b4d-9ab8-e573f961293e" containerName="authorino" containerID="cri-o://f2769b7da8c0cf9a2c41d134aae45e7621619aedec06be1fed50fc49f193dc12" gracePeriod=30 Apr 16 21:10:09.567133 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:09.567107 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5966cb86f6-n7t9z" Apr 16 21:10:09.595389 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:09.595350 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f9c5c023-961a-4b4d-9ab8-e573f961293e-tls-cert\") pod \"f9c5c023-961a-4b4d-9ab8-e573f961293e\" (UID: \"f9c5c023-961a-4b4d-9ab8-e573f961293e\") " Apr 16 21:10:09.595559 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:09.595438 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nt6d\" (UniqueName: \"kubernetes.io/projected/f9c5c023-961a-4b4d-9ab8-e573f961293e-kube-api-access-5nt6d\") pod \"f9c5c023-961a-4b4d-9ab8-e573f961293e\" (UID: \"f9c5c023-961a-4b4d-9ab8-e573f961293e\") " Apr 16 21:10:09.598133 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:09.598099 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c5c023-961a-4b4d-9ab8-e573f961293e-kube-api-access-5nt6d" (OuterVolumeSpecName: "kube-api-access-5nt6d") pod "f9c5c023-961a-4b4d-9ab8-e573f961293e" (UID: "f9c5c023-961a-4b4d-9ab8-e573f961293e"). InnerVolumeSpecName "kube-api-access-5nt6d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:10:09.609742 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:09.609709 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c5c023-961a-4b4d-9ab8-e573f961293e-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "f9c5c023-961a-4b4d-9ab8-e573f961293e" (UID: "f9c5c023-961a-4b4d-9ab8-e573f961293e"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 21:10:09.696815 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:09.696779 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5nt6d\" (UniqueName: \"kubernetes.io/projected/f9c5c023-961a-4b4d-9ab8-e573f961293e-kube-api-access-5nt6d\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:10:09.696815 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:09.696807 2579 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f9c5c023-961a-4b4d-9ab8-e573f961293e-tls-cert\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:10:10.231141 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:10.231107 2579 generic.go:358] "Generic (PLEG): container finished" podID="f9c5c023-961a-4b4d-9ab8-e573f961293e" containerID="f2769b7da8c0cf9a2c41d134aae45e7621619aedec06be1fed50fc49f193dc12" exitCode=0 Apr 16 21:10:10.231541 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:10.231174 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5966cb86f6-n7t9z" Apr 16 21:10:10.231541 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:10.231182 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5966cb86f6-n7t9z" event={"ID":"f9c5c023-961a-4b4d-9ab8-e573f961293e","Type":"ContainerDied","Data":"f2769b7da8c0cf9a2c41d134aae45e7621619aedec06be1fed50fc49f193dc12"} Apr 16 21:10:10.231541 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:10.231218 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5966cb86f6-n7t9z" event={"ID":"f9c5c023-961a-4b4d-9ab8-e573f961293e","Type":"ContainerDied","Data":"06e2f1af79624ccd086867d915fb06aa45f8f5468c39078a65de0910b6cb6cf5"} Apr 16 21:10:10.231541 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:10.231233 2579 scope.go:117] "RemoveContainer" containerID="f2769b7da8c0cf9a2c41d134aae45e7621619aedec06be1fed50fc49f193dc12" Apr 16 21:10:10.240595 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:10.240573 2579 scope.go:117] "RemoveContainer" containerID="f2769b7da8c0cf9a2c41d134aae45e7621619aedec06be1fed50fc49f193dc12" Apr 16 21:10:10.240834 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:10:10.240815 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2769b7da8c0cf9a2c41d134aae45e7621619aedec06be1fed50fc49f193dc12\": container with ID starting with f2769b7da8c0cf9a2c41d134aae45e7621619aedec06be1fed50fc49f193dc12 not found: ID does not exist" containerID="f2769b7da8c0cf9a2c41d134aae45e7621619aedec06be1fed50fc49f193dc12" Apr 16 21:10:10.240877 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:10.240842 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2769b7da8c0cf9a2c41d134aae45e7621619aedec06be1fed50fc49f193dc12"} err="failed to get container status \"f2769b7da8c0cf9a2c41d134aae45e7621619aedec06be1fed50fc49f193dc12\": rpc error: code = NotFound desc = could not find container \"f2769b7da8c0cf9a2c41d134aae45e7621619aedec06be1fed50fc49f193dc12\": container with ID starting with f2769b7da8c0cf9a2c41d134aae45e7621619aedec06be1fed50fc49f193dc12 not found: ID does not exist" Apr 16 21:10:10.254268 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:10.254244 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5966cb86f6-n7t9z"] Apr 16 21:10:10.262557 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:10.262531 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5966cb86f6-n7t9z"] Apr 16 21:10:11.567765 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:10:11.567727 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c5c023-961a-4b4d-9ab8-e573f961293e" path="/var/lib/kubelet/pods/f9c5c023-961a-4b4d-9ab8-e573f961293e/volumes" Apr 16 21:13:17.526653 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:13:17.526576 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 21:13:17.528339 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:13:17.528317 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 21:15:00.199352 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:00.199276 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["opendatahub/maas-api-key-cleanup-29606235-hjll4"] Apr 16 21:15:00.200819 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:00.199684 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9c5c023-961a-4b4d-9ab8-e573f961293e" containerName="authorino" Apr 16 21:15:00.200819 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:00.199697 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c5c023-961a-4b4d-9ab8-e573f961293e" containerName="authorino" Apr 16 21:15:00.200819 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:00.199751 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9c5c023-961a-4b4d-9ab8-e573f961293e" containerName="authorino" Apr 16 21:15:00.201526 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:00.201510 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" Apr 16 21:15:00.205691 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:00.205670 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-nxn52\"" Apr 16 21:15:00.257277 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:00.257254 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4bxr\" (UniqueName: \"kubernetes.io/projected/c82e769c-2460-4a76-b26a-70c341e78037-kube-api-access-c4bxr\") pod \"maas-api-key-cleanup-29606235-hjll4\" (UID: \"c82e769c-2460-4a76-b26a-70c341e78037\") " pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" Apr 16 21:15:00.284692 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:00.284669 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606235-hjll4"] Apr 16 21:15:00.358128 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:00.358106 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4bxr\" (UniqueName: \"kubernetes.io/projected/c82e769c-2460-4a76-b26a-70c341e78037-kube-api-access-c4bxr\") pod \"maas-api-key-cleanup-29606235-hjll4\" (UID: \"c82e769c-2460-4a76-b26a-70c341e78037\") " pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" Apr 16 21:15:00.369356 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:00.369331 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4bxr\" (UniqueName: \"kubernetes.io/projected/c82e769c-2460-4a76-b26a-70c341e78037-kube-api-access-c4bxr\") pod \"maas-api-key-cleanup-29606235-hjll4\" (UID: \"c82e769c-2460-4a76-b26a-70c341e78037\") " pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" Apr 16 21:15:00.511525 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:00.511491 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" Apr 16 21:15:00.854440 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:00.854409 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606235-hjll4"] Apr 16 21:15:00.855685 ip-10-0-139-17 kubenswrapper[2579]: W0416 21:15:00.855657 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc82e769c_2460_4a76_b26a_70c341e78037.slice/crio-c9dffb807df2603b2cece7ddac4d11249ae728ca3af400dbc73b01b5ec4a7944 WatchSource:0}: Error finding container c9dffb807df2603b2cece7ddac4d11249ae728ca3af400dbc73b01b5ec4a7944: Status 404 returned error can't find the container with id c9dffb807df2603b2cece7ddac4d11249ae728ca3af400dbc73b01b5ec4a7944 Apr 16 21:15:01.398478 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:01.398442 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" event={"ID":"c82e769c-2460-4a76-b26a-70c341e78037","Type":"ContainerStarted","Data":"c9dffb807df2603b2cece7ddac4d11249ae728ca3af400dbc73b01b5ec4a7944"} Apr 16 21:15:03.407455 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:03.407421 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" event={"ID":"c82e769c-2460-4a76-b26a-70c341e78037","Type":"ContainerStarted","Data":"35189a0d8f233062915c2ccd45bd74328f4fbeb6a11a5477e53119ae5e20741a"} Apr 16 21:15:03.433280 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:03.433230 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" podStartSLOduration=1.60045373 podStartE2EDuration="3.433217033s" podCreationTimestamp="2026-04-16 21:15:00 +0000 UTC" firstStartedPulling="2026-04-16 21:15:00.857485888 +0000 UTC m=+1003.875796713" lastFinishedPulling="2026-04-16 21:15:02.690249187 +0000 UTC m=+1005.708560016" observedRunningTime="2026-04-16 21:15:03.430398307 +0000 UTC m=+1006.448709154" watchObservedRunningTime="2026-04-16 21:15:03.433217033 +0000 UTC m=+1006.451527880" Apr 16 21:15:23.486643 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:23.486600 2579 generic.go:358] "Generic (PLEG): container finished" podID="c82e769c-2460-4a76-b26a-70c341e78037" containerID="35189a0d8f233062915c2ccd45bd74328f4fbeb6a11a5477e53119ae5e20741a" exitCode=6 Apr 16 21:15:23.487066 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:23.486678 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" event={"ID":"c82e769c-2460-4a76-b26a-70c341e78037","Type":"ContainerDied","Data":"35189a0d8f233062915c2ccd45bd74328f4fbeb6a11a5477e53119ae5e20741a"} Apr 16 21:15:23.487128 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:23.487073 2579 scope.go:117] "RemoveContainer" containerID="35189a0d8f233062915c2ccd45bd74328f4fbeb6a11a5477e53119ae5e20741a" Apr 16 21:15:23.487867 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:23.487853 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 21:15:24.492834 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:24.492799 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" event={"ID":"c82e769c-2460-4a76-b26a-70c341e78037","Type":"ContainerStarted","Data":"b365e675d7fbd2fcdbece98c63fcd694e9cdacc6b4dbd55e0f47cc3f56002966"} Apr 16 21:15:44.581101 ip-10-0-139-17 kubenswrapper[2579]: 
I0416 21:15:44.581065 2579 generic.go:358] "Generic (PLEG): container finished" podID="c82e769c-2460-4a76-b26a-70c341e78037" containerID="b365e675d7fbd2fcdbece98c63fcd694e9cdacc6b4dbd55e0f47cc3f56002966" exitCode=6 Apr 16 21:15:44.581559 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:44.581140 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" event={"ID":"c82e769c-2460-4a76-b26a-70c341e78037","Type":"ContainerDied","Data":"b365e675d7fbd2fcdbece98c63fcd694e9cdacc6b4dbd55e0f47cc3f56002966"} Apr 16 21:15:44.581559 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:44.581192 2579 scope.go:117] "RemoveContainer" containerID="35189a0d8f233062915c2ccd45bd74328f4fbeb6a11a5477e53119ae5e20741a" Apr 16 21:15:44.581559 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:44.581467 2579 scope.go:117] "RemoveContainer" containerID="b365e675d7fbd2fcdbece98c63fcd694e9cdacc6b4dbd55e0f47cc3f56002966" Apr 16 21:15:44.581694 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:15:44.581664 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29606235-hjll4_opendatahub(c82e769c-2460-4a76-b26a-70c341e78037)\"" pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" podUID="c82e769c-2460-4a76-b26a-70c341e78037" Apr 16 21:15:57.566578 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:57.566550 2579 scope.go:117] "RemoveContainer" containerID="b365e675d7fbd2fcdbece98c63fcd694e9cdacc6b4dbd55e0f47cc3f56002966" Apr 16 21:15:58.635851 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:58.635813 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" event={"ID":"c82e769c-2460-4a76-b26a-70c341e78037","Type":"ContainerStarted","Data":"5464093976a32e33e9d729ed0e31928a833887382f3ae1bffd5adf5f721c71c3"} Apr 16 21:15:59.706065 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:59.706032 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606235-hjll4"] Apr 16 21:15:59.706520 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:15:59.706357 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" podUID="c82e769c-2460-4a76-b26a-70c341e78037" containerName="cleanup" containerID="cri-o://5464093976a32e33e9d729ed0e31928a833887382f3ae1bffd5adf5f721c71c3" gracePeriod=30 Apr 16 21:16:18.452922 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.452899 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" Apr 16 21:16:18.504631 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.504601 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4bxr\" (UniqueName: \"kubernetes.io/projected/c82e769c-2460-4a76-b26a-70c341e78037-kube-api-access-c4bxr\") pod \"c82e769c-2460-4a76-b26a-70c341e78037\" (UID: \"c82e769c-2460-4a76-b26a-70c341e78037\") " Apr 16 21:16:18.507106 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.507033 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c82e769c-2460-4a76-b26a-70c341e78037-kube-api-access-c4bxr" (OuterVolumeSpecName: "kube-api-access-c4bxr") pod "c82e769c-2460-4a76-b26a-70c341e78037" (UID: "c82e769c-2460-4a76-b26a-70c341e78037"). InnerVolumeSpecName "kube-api-access-c4bxr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:16:18.605869 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.605836 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c4bxr\" (UniqueName: \"kubernetes.io/projected/c82e769c-2460-4a76-b26a-70c341e78037-kube-api-access-c4bxr\") on node \"ip-10-0-139-17.ec2.internal\" DevicePath \"\"" Apr 16 21:16:18.715883 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.715854 2579 generic.go:358] "Generic (PLEG): container finished" podID="c82e769c-2460-4a76-b26a-70c341e78037" containerID="5464093976a32e33e9d729ed0e31928a833887382f3ae1bffd5adf5f721c71c3" exitCode=6 Apr 16 21:16:18.716067 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.715914 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" Apr 16 21:16:18.716067 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.715923 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" event={"ID":"c82e769c-2460-4a76-b26a-70c341e78037","Type":"ContainerDied","Data":"5464093976a32e33e9d729ed0e31928a833887382f3ae1bffd5adf5f721c71c3"} Apr 16 21:16:18.716067 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.715950 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606235-hjll4" event={"ID":"c82e769c-2460-4a76-b26a-70c341e78037","Type":"ContainerDied","Data":"c9dffb807df2603b2cece7ddac4d11249ae728ca3af400dbc73b01b5ec4a7944"} Apr 16 21:16:18.716067 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.715965 2579 scope.go:117] "RemoveContainer" containerID="5464093976a32e33e9d729ed0e31928a833887382f3ae1bffd5adf5f721c71c3" Apr 16 21:16:18.725921 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.725902 2579 scope.go:117] "RemoveContainer" containerID="b365e675d7fbd2fcdbece98c63fcd694e9cdacc6b4dbd55e0f47cc3f56002966" Apr 16 21:16:18.733967 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.733950 2579 scope.go:117] "RemoveContainer" containerID="5464093976a32e33e9d729ed0e31928a833887382f3ae1bffd5adf5f721c71c3" Apr 16 21:16:18.734291 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:16:18.734268 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5464093976a32e33e9d729ed0e31928a833887382f3ae1bffd5adf5f721c71c3\": container with ID starting with 5464093976a32e33e9d729ed0e31928a833887382f3ae1bffd5adf5f721c71c3 not found: ID does not exist" containerID="5464093976a32e33e9d729ed0e31928a833887382f3ae1bffd5adf5f721c71c3" Apr 16 21:16:18.734363 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.734297 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5464093976a32e33e9d729ed0e31928a833887382f3ae1bffd5adf5f721c71c3"} err="failed to get container status \"5464093976a32e33e9d729ed0e31928a833887382f3ae1bffd5adf5f721c71c3\": rpc error: code = NotFound desc = could not find container \"5464093976a32e33e9d729ed0e31928a833887382f3ae1bffd5adf5f721c71c3\": container with ID starting with 5464093976a32e33e9d729ed0e31928a833887382f3ae1bffd5adf5f721c71c3 not found: ID does not exist" Apr 16 21:16:18.734363 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.734315 2579 scope.go:117] "RemoveContainer" containerID="b365e675d7fbd2fcdbece98c63fcd694e9cdacc6b4dbd55e0f47cc3f56002966" Apr 16 21:16:18.734547 ip-10-0-139-17 kubenswrapper[2579]: E0416 21:16:18.734531 2579 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b365e675d7fbd2fcdbece98c63fcd694e9cdacc6b4dbd55e0f47cc3f56002966\": container with ID starting with b365e675d7fbd2fcdbece98c63fcd694e9cdacc6b4dbd55e0f47cc3f56002966 not found: ID does not exist" containerID="b365e675d7fbd2fcdbece98c63fcd694e9cdacc6b4dbd55e0f47cc3f56002966" Apr 16 21:16:18.734585 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.734551 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b365e675d7fbd2fcdbece98c63fcd694e9cdacc6b4dbd55e0f47cc3f56002966"} err="failed to get container status \"b365e675d7fbd2fcdbece98c63fcd694e9cdacc6b4dbd55e0f47cc3f56002966\": rpc error: code = NotFound desc = could not find container \"b365e675d7fbd2fcdbece98c63fcd694e9cdacc6b4dbd55e0f47cc3f56002966\": container with ID starting with b365e675d7fbd2fcdbece98c63fcd694e9cdacc6b4dbd55e0f47cc3f56002966 not found: ID does not exist" Apr 16 21:16:18.745036 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.745015 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606235-hjll4"] Apr 16 21:16:18.749399 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:18.749380 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606235-hjll4"] Apr 16 21:16:19.567369 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:16:19.567336 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c82e769c-2460-4a76-b26a-70c341e78037" path="/var/lib/kubelet/pods/c82e769c-2460-4a76-b26a-70c341e78037/volumes" Apr 16 21:18:17.559672 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:18:17.559645 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 21:18:17.562394 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:18:17.562371 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 21:23:17.609640 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:23:17.609612 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 21:23:17.610226 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:23:17.609823 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 21:28:06.609289 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:06.609195 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-6978ccf9df-fc6hf_d996976f-7c88-4d73-9c33-8c044a9463e3/authorino/0.log" Apr 16 21:28:11.075103 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:11.075071 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f94c666bb-bt9s5_9d088b51-5307-4c1c-9ca5-b8e3a0e90470/manager/0.log" Apr 16 21:28:12.084854 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:12.084822 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2_fff5c780-0c3c-4f84-ab04-1411873d94a8/util/0.log" Apr 16 21:28:12.092398 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:12.092378 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2_fff5c780-0c3c-4f84-ab04-1411873d94a8/pull/0.log" Apr 16 21:28:12.099342 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:12.099325 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2_fff5c780-0c3c-4f84-ab04-1411873d94a8/extract/0.log" Apr 16 21:28:12.214470 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:12.214444 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt_928ad48f-db3c-450a-952c-5ab71bd07fd5/util/0.log" Apr 16 21:28:12.221383 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:12.221362 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt_928ad48f-db3c-450a-952c-5ab71bd07fd5/pull/0.log" Apr 16 21:28:12.228346 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:12.228327 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt_928ad48f-db3c-450a-952c-5ab71bd07fd5/extract/0.log" Apr 16 21:28:12.342469 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:12.342390 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp_a9997b85-3b6f-4d24-a986-35661e550af2/util/0.log" Apr 16 21:28:12.349077 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:12.349053 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp_a9997b85-3b6f-4d24-a986-35661e550af2/pull/0.log" Apr 16 21:28:12.356442 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:12.356427 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp_a9997b85-3b6f-4d24-a986-35661e550af2/extract/0.log" Apr 16 21:28:12.476073 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:12.476051 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n_cfd6eb27-4070-4d8b-a5f5-2785164f315d/util/0.log" Apr 16 21:28:12.484573 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:12.484546 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n_cfd6eb27-4070-4d8b-a5f5-2785164f315d/pull/0.log" Apr 16 21:28:12.494563 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:12.494545 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n_cfd6eb27-4070-4d8b-a5f5-2785164f315d/extract/0.log" Apr 16 21:28:12.611328 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:12.611234 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-6978ccf9df-fc6hf_d996976f-7c88-4d73-9c33-8c044a9463e3/authorino/0.log" Apr 16 21:28:12.740430 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:12.740402 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-z8xrw_2657dcf5-7dc2-458d-aa1b-b2cbc0fb310e/manager/0.log" Apr 16 21:28:12.860919 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:12.860894 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-9stmf_3cccc6dd-4f27-4402-9422-626b6c4e0443/manager/0.log" Apr 16 21:28:13.090006 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:13.089967 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-67xvg_26c1940d-abea-402d-87f0-74283dd29012/registry-server/0.log" Apr 16 21:28:13.447927 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:13.447852 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-sv756_c62edaed-5da9-40d8-b721-37d1746992e5/manager/0.log" Apr 16 21:28:13.789284 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:13.789256 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n_63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b/istio-proxy/0.log" Apr 16 21:28:14.259332 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:14.259304 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-dcmj9_53e43b49-7372-442d-99ef-4171fe78701b/istio-proxy/0.log" Apr 16 21:28:14.377679 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:14.377656 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7b575bfc5d-gbhzj_9af0bfa4-fd46-49df-973a-66814186f9c6/router/0.log" Apr 16 21:28:17.645389 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:17.645297 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 21:28:17.645389 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:17.645373 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s6cxs_6729dda1-3d66-4a8a-a99c-69840130dbf7/ovn-acl-logging/0.log" Apr 16 21:28:22.070181 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:22.070152 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-ww9g9_75834fc1-5f34-4f79-be84-f3349cdd5efd/global-pull-secret-syncer/0.log" Apr 16 21:28:22.161894 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:22.161866 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ktmcs_c26f028d-d200-4cbf-b0a5-93e063163a20/konnectivity-agent/0.log" Apr 16 21:28:22.256650 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:22.256624 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-17.ec2.internal_a5202bb430809412b670f0c968baf10f/haproxy/0.log" Apr 16 21:28:26.252629 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.252598 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2_fff5c780-0c3c-4f84-ab04-1411873d94a8/extract/0.log" Apr 16 21:28:26.280268 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.280240 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2_fff5c780-0c3c-4f84-ab04-1411873d94a8/util/0.log" Apr 16 21:28:26.305243 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.305221 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7594rvf2_fff5c780-0c3c-4f84-ab04-1411873d94a8/pull/0.log" Apr 16 
21:28:26.340400 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.340372 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt_928ad48f-db3c-450a-952c-5ab71bd07fd5/extract/0.log" Apr 16 21:28:26.367323 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.367303 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt_928ad48f-db3c-450a-952c-5ab71bd07fd5/util/0.log" Apr 16 21:28:26.396208 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.396188 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07zlqt_928ad48f-db3c-450a-952c-5ab71bd07fd5/pull/0.log" Apr 16 21:28:26.440164 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.440143 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp_a9997b85-3b6f-4d24-a986-35661e550af2/extract/0.log" Apr 16 21:28:26.470348 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.470327 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp_a9997b85-3b6f-4d24-a986-35661e550af2/util/0.log" Apr 16 21:28:26.494673 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.494650 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed732cthp_a9997b85-3b6f-4d24-a986-35661e550af2/pull/0.log" Apr 16 21:28:26.523711 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.523655 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n_cfd6eb27-4070-4d8b-a5f5-2785164f315d/extract/0.log" Apr 16 21:28:26.549881 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.549853 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n_cfd6eb27-4070-4d8b-a5f5-2785164f315d/util/0.log" Apr 16 21:28:26.575929 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.575907 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14ss2n_cfd6eb27-4070-4d8b-a5f5-2785164f315d/pull/0.log" Apr 16 21:28:26.736182 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.736151 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-6978ccf9df-fc6hf_d996976f-7c88-4d73-9c33-8c044a9463e3/authorino/0.log" Apr 16 21:28:26.774159 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.774077 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-z8xrw_2657dcf5-7dc2-458d-aa1b-b2cbc0fb310e/manager/0.log" Apr 16 21:28:26.804956 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.804930 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-9stmf_3cccc6dd-4f27-4402-9422-626b6c4e0443/manager/0.log" Apr 16 21:28:26.874596 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:26.874565 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-67xvg_26c1940d-abea-402d-87f0-74283dd29012/registry-server/0.log" Apr 16 21:28:27.182306 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:27.182220 
2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-sv756_c62edaed-5da9-40d8-b721-37d1746992e5/manager/0.log" Apr 16 21:28:29.059384 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:29.059354 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-j85jc_b29e3035-4704-443b-ba1e-f485ad77b3c5/cluster-monitoring-operator/0.log" Apr 16 21:28:29.190999 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:29.190959 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-665555d85-gk67l_1bcca9b9-1d19-45cf-9384-55ca4eb5e043/metrics-server/0.log" Apr 16 21:28:29.220304 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:29.220272 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-8xsp9_cac20c17-60af-477e-8df8-bce3e9e85927/monitoring-plugin/0.log" Apr 16 21:28:29.432893 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:29.432818 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tz7dk_87590fab-d517-4092-9065-f48998609b50/node-exporter/0.log" Apr 16 21:28:29.466810 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:29.466782 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tz7dk_87590fab-d517-4092-9065-f48998609b50/kube-rbac-proxy/0.log" Apr 16 21:28:29.492576 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:29.492553 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tz7dk_87590fab-d517-4092-9065-f48998609b50/init-textfile/0.log" Apr 16 21:28:29.846046 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:29.846013 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-p9sbl_7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694/prometheus-operator/0.log" Apr 16 21:28:29.866875 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:29.866852 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-p9sbl_7a41b6ff-fd13-4b8b-9c2c-2bdf9179f694/kube-rbac-proxy/0.log" Apr 16 21:28:30.029352 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.029321 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f9ff59d98-rfmpv_39136697-7696-4144-afb2-1cd8a2e847c7/thanos-query/0.log" Apr 16 21:28:30.055939 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.055915 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f9ff59d98-rfmpv_39136697-7696-4144-afb2-1cd8a2e847c7/kube-rbac-proxy-web/0.log" Apr 16 21:28:30.085687 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.085664 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f9ff59d98-rfmpv_39136697-7696-4144-afb2-1cd8a2e847c7/kube-rbac-proxy/0.log" Apr 16 21:28:30.112622 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.112569 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f9ff59d98-rfmpv_39136697-7696-4144-afb2-1cd8a2e847c7/prom-label-proxy/0.log" Apr 16 21:28:30.137446 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.137427 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f9ff59d98-rfmpv_39136697-7696-4144-afb2-1cd8a2e847c7/kube-rbac-proxy-rules/0.log" Apr 
16 21:28:30.176235 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.176215 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f9ff59d98-rfmpv_39136697-7696-4144-afb2-1cd8a2e847c7/kube-rbac-proxy-metrics/0.log" Apr 16 21:28:30.696889 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.696855 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs"] Apr 16 21:28:30.697250 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.697237 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c82e769c-2460-4a76-b26a-70c341e78037" containerName="cleanup" Apr 16 21:28:30.697298 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.697252 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82e769c-2460-4a76-b26a-70c341e78037" containerName="cleanup" Apr 16 21:28:30.697298 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.697270 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c82e769c-2460-4a76-b26a-70c341e78037" containerName="cleanup" Apr 16 21:28:30.697298 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.697276 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82e769c-2460-4a76-b26a-70c341e78037" containerName="cleanup" Apr 16 21:28:30.697298 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.697284 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c82e769c-2460-4a76-b26a-70c341e78037" containerName="cleanup" Apr 16 21:28:30.697298 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.697290 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82e769c-2460-4a76-b26a-70c341e78037" containerName="cleanup" Apr 16 21:28:30.697450 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.697349 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c82e769c-2460-4a76-b26a-70c341e78037" containerName="cleanup" Apr 16 21:28:30.697450 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.697357 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c82e769c-2460-4a76-b26a-70c341e78037" containerName="cleanup" Apr 16 21:28:30.697450 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.697364 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c82e769c-2460-4a76-b26a-70c341e78037" containerName="cleanup" Apr 16 21:28:30.700566 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.700545 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:30.703729 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.703705 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hmfkt\"/\"openshift-service-ca.crt\"" Apr 16 21:28:30.703848 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.703729 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-hmfkt\"/\"default-dockercfg-7w55v\"" Apr 16 21:28:30.706822 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.706802 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hmfkt\"/\"kube-root-ca.crt\"" Apr 16 21:28:30.715875 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.715853 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs"] Apr 16 21:28:30.773338 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.773311 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2d027f92-b6e2-4395-84b7-b06aaf49090a-sys\") pod \"perf-node-gather-daemonset-q6mxs\" (UID: \"2d027f92-b6e2-4395-84b7-b06aaf49090a\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:30.773480 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.773357 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x6zh\" (UniqueName: \"kubernetes.io/projected/2d027f92-b6e2-4395-84b7-b06aaf49090a-kube-api-access-7x6zh\") pod \"perf-node-gather-daemonset-q6mxs\" (UID: \"2d027f92-b6e2-4395-84b7-b06aaf49090a\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:30.773480 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.773391 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2d027f92-b6e2-4395-84b7-b06aaf49090a-proc\") pod \"perf-node-gather-daemonset-q6mxs\" (UID: \"2d027f92-b6e2-4395-84b7-b06aaf49090a\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:30.773480 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.773462 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2d027f92-b6e2-4395-84b7-b06aaf49090a-lib-modules\") pod \"perf-node-gather-daemonset-q6mxs\" (UID: \"2d027f92-b6e2-4395-84b7-b06aaf49090a\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:30.773480 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.773477 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2d027f92-b6e2-4395-84b7-b06aaf49090a-podres\") pod \"perf-node-gather-daemonset-q6mxs\" (UID: \"2d027f92-b6e2-4395-84b7-b06aaf49090a\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:30.874306 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.874274 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2d027f92-b6e2-4395-84b7-b06aaf49090a-lib-modules\") pod \"perf-node-gather-daemonset-q6mxs\" (UID: \"2d027f92-b6e2-4395-84b7-b06aaf49090a\") " 
pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:30.874459 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.874309 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2d027f92-b6e2-4395-84b7-b06aaf49090a-podres\") pod \"perf-node-gather-daemonset-q6mxs\" (UID: \"2d027f92-b6e2-4395-84b7-b06aaf49090a\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:30.874459 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.874386 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2d027f92-b6e2-4395-84b7-b06aaf49090a-sys\") pod \"perf-node-gather-daemonset-q6mxs\" (UID: \"2d027f92-b6e2-4395-84b7-b06aaf49090a\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:30.874459 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.874425 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7x6zh\" (UniqueName: \"kubernetes.io/projected/2d027f92-b6e2-4395-84b7-b06aaf49090a-kube-api-access-7x6zh\") pod \"perf-node-gather-daemonset-q6mxs\" (UID: \"2d027f92-b6e2-4395-84b7-b06aaf49090a\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:30.874459 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.874435 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2d027f92-b6e2-4395-84b7-b06aaf49090a-lib-modules\") pod \"perf-node-gather-daemonset-q6mxs\" (UID: \"2d027f92-b6e2-4395-84b7-b06aaf49090a\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:30.874641 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.874460 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2d027f92-b6e2-4395-84b7-b06aaf49090a-proc\") pod \"perf-node-gather-daemonset-q6mxs\" (UID: \"2d027f92-b6e2-4395-84b7-b06aaf49090a\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:30.874641 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.874481 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2d027f92-b6e2-4395-84b7-b06aaf49090a-sys\") pod \"perf-node-gather-daemonset-q6mxs\" (UID: \"2d027f92-b6e2-4395-84b7-b06aaf49090a\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:30.874641 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.874490 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2d027f92-b6e2-4395-84b7-b06aaf49090a-podres\") pod \"perf-node-gather-daemonset-q6mxs\" (UID: \"2d027f92-b6e2-4395-84b7-b06aaf49090a\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:30.874641 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.874540 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2d027f92-b6e2-4395-84b7-b06aaf49090a-proc\") pod \"perf-node-gather-daemonset-q6mxs\" (UID: \"2d027f92-b6e2-4395-84b7-b06aaf49090a\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:30.890310 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:30.890279 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7x6zh\" (UniqueName: \"kubernetes.io/projected/2d027f92-b6e2-4395-84b7-b06aaf49090a-kube-api-access-7x6zh\") pod \"perf-node-gather-daemonset-q6mxs\" (UID: \"2d027f92-b6e2-4395-84b7-b06aaf49090a\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:31.010111 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:31.010082 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:31.144665 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:31.144635 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs"] Apr 16 21:28:31.147366 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:31.145678 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 21:28:31.606924 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:31.606887 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" event={"ID":"2d027f92-b6e2-4395-84b7-b06aaf49090a","Type":"ContainerStarted","Data":"7b91ee032b9063adc370d54478535f83c506d757efe53dbae0c4707bef304560"} Apr 16 21:28:31.606924 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:31.606926 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" event={"ID":"2d027f92-b6e2-4395-84b7-b06aaf49090a","Type":"ContainerStarted","Data":"9805a1f0a7b821436a69510eb0b016f633ed5e199927dcf0e8b3d2ba13c6bd33"} Apr 16 21:28:31.607153 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:31.606962 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:31.629652 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:31.629611 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" podStartSLOduration=1.629595165 podStartE2EDuration="1.629595165s" podCreationTimestamp="2026-04-16 21:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:28:31.627454048 +0000 UTC m=+1814.645764895" watchObservedRunningTime="2026-04-16 21:28:31.629595165 +0000 UTC m=+1814.647906012" Apr 16 21:28:32.283006 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:32.282953 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b444578d9-bfcww_609a24b6-a299-4512-85fa-7e821fa695db/console/0.log" Apr 16 21:28:32.312724 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:32.312701 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-md9h2_d4fdbd49-f42a-49e0-befb-6a581b41d609/download-server/0.log" Apr 16 21:28:33.541116 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:33.541090 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lvrfd_9275ffeb-7ec4-4699-976c-7ef980230018/dns/0.log" Apr 16 21:28:33.565817 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:33.565793 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lvrfd_9275ffeb-7ec4-4699-976c-7ef980230018/kube-rbac-proxy/0.log" Apr 16 21:28:33.683493 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:33.683466 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-qdgnp_86523bd2-ac21-4e5d-8cfd-81eb7aa5f405/dns-node-resolver/0.log" Apr 16 21:28:34.232386 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:34.232349 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2g5dj_eb89d7f0-f4fb-459a-8f19-5b03adcf660a/node-ca/0.log" Apr 16 21:28:35.134263 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:35.134224 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfgjp8n_63910eb9-4466-41f0-b9ee-6fe0d7ebfb0b/istio-proxy/0.log" Apr 16 21:28:35.396686 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:35.396589 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-dcmj9_53e43b49-7372-442d-99ef-4171fe78701b/istio-proxy/0.log" Apr 16 21:28:35.418727 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:35.418701 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7b575bfc5d-gbhzj_9af0bfa4-fd46-49df-973a-66814186f9c6/router/0.log" Apr 16 21:28:35.973622 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:35.973594 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-hv5xj_ebb8fc4a-f559-45f4-be07-93cd44e25e3a/serve-healthcheck-canary/0.log" Apr 16 21:28:36.409694 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:36.409606 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-ntt68_b425b7f9-0015-4de7-81d2-02cd12eb338a/insights-operator/1.log" Apr 16 21:28:36.410075 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:36.409693 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-ntt68_b425b7f9-0015-4de7-81d2-02cd12eb338a/insights-operator/0.log" Apr 16 21:28:36.612273 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:36.612247 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v4wf2_0f4f5f3f-e462-4f22-8504-2743600ff619/kube-rbac-proxy/0.log" Apr 16 21:28:36.639447 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:36.639418 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v4wf2_0f4f5f3f-e462-4f22-8504-2743600ff619/exporter/0.log" Apr 16 21:28:36.663800 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:36.663738 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v4wf2_0f4f5f3f-e462-4f22-8504-2743600ff619/extractor/0.log" Apr 16 21:28:37.621973 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:37.621944 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-q6mxs" Apr 16 21:28:38.806415 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:38.806383 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f94c666bb-bt9s5_9d088b51-5307-4c1c-9ca5-b8e3a0e90470/manager/0.log" Apr 16 21:28:40.180142 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:40.180065 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-c5b769f8c-fm6ps_73699555-7225-43bd-bf69-321bd6ebb6fd/manager/0.log" Apr 16 21:28:44.863313 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:44.863287 2579 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-p9hsx_acccf14f-1908-45d6-a352-e0a6c7fc6a05/kube-storage-version-migrator-operator/1.log" Apr 16 21:28:44.864149 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:44.864130 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-p9hsx_acccf14f-1908-45d6-a352-e0a6c7fc6a05/kube-storage-version-migrator-operator/0.log" Apr 16 21:28:45.873153 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:45.873128 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9zljx_13be4f93-01b1-4633-a3f1-b9d89ab4fed8/kube-multus/0.log" Apr 16 21:28:46.241416 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:46.241384 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wrfqb_157d2848-acb1-4db0-bb7b-a50ad66888da/kube-multus-additional-cni-plugins/0.log" Apr 16 21:28:46.261558 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:46.261495 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wrfqb_157d2848-acb1-4db0-bb7b-a50ad66888da/egress-router-binary-copy/0.log" Apr 16 21:28:46.281383 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:46.281362 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wrfqb_157d2848-acb1-4db0-bb7b-a50ad66888da/cni-plugins/0.log" Apr 16 21:28:46.303438 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:46.303420 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wrfqb_157d2848-acb1-4db0-bb7b-a50ad66888da/bond-cni-plugin/0.log" Apr 16 21:28:46.324546 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:46.324526 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wrfqb_157d2848-acb1-4db0-bb7b-a50ad66888da/routeoverride-cni/0.log" Apr 16 21:28:46.345426 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:46.345401 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wrfqb_157d2848-acb1-4db0-bb7b-a50ad66888da/whereabouts-cni-bincopy/0.log" Apr 16 21:28:46.365733 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:46.365711 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wrfqb_157d2848-acb1-4db0-bb7b-a50ad66888da/whereabouts-cni/0.log" Apr 16 21:28:46.512529 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:46.512501 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rgzx9_24ed89ef-c93c-40fd-a75f-2f3fd7582359/network-metrics-daemon/0.log" Apr 16 21:28:46.530942 ip-10-0-139-17 kubenswrapper[2579]: I0416 21:28:46.530913 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rgzx9_24ed89ef-c93c-40fd-a75f-2f3fd7582359/kube-rbac-proxy/0.log"