Apr 21 04:21:31.396601 ip-10-0-134-45 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 04:21:31.396618 ip-10-0-134-45 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 04:21:31.396628 ip-10-0-134-45 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 04:21:31.396965 ip-10-0-134-45 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 04:21:41.516783 ip-10-0-134-45 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 04:21:41.516804 ip-10-0-134-45 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 2c08374722334ce585dad15e93a864ca --
Apr 21 04:24:07.902321 ip-10-0-134-45 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 04:24:08.279661 ip-10-0-134-45 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:24:08.279661 ip-10-0-134-45 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 04:24:08.279661 ip-10-0-134-45 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:24:08.279661 ip-10-0-134-45 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 04:24:08.279661 ip-10-0-134-45 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:24:08.281085 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.280991 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 04:24:08.286092 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286064 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:24:08.286092 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286087 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:24:08.286092 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286093 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:24:08.286092 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286097 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286101 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286106 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286113 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286117 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286121 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286125 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286129 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286132 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286136 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286140 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286144 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286147 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286151 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286154 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286158 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286161 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286165 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286168 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286178 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:24:08.286352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286182 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286186 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286193 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286199 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286203 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286208 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286213 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286218 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286224 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286229 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286233 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286237 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286243 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286247 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286253 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286257 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286262 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286266 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286270 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:24:08.287053 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286274 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286279 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286283 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286287 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286291 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286296 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286300 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286305 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286309 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286313 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286319 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286323 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286327 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286331 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286335 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286339 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286343 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286348 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286352 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286356 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:24:08.287879 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286359 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286363 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286367 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286371 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286375 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286379 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286383 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286388 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286393 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286397 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286401 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286405 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286409 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286413 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286418 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286422 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286428 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286433 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286437 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286441 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:24:08.288550 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286445 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286449 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286454 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.286458 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287708 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287722 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287727 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287731 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287736 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287740 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287745 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287750 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287755 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287759 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287764 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287769 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287774 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287779 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287784 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287789 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:24:08.289388 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287794 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287799 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287806 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287811 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287816 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287820 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287824 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287829 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287834 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287839 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287843 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287848 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287852 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287856 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287860 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287864 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287869 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287873 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287876 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287880 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:24:08.290240 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287884 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287888 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287894 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287899 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287903 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287909 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287913 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287918 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287923 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287928 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287933 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287937 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287941 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287946 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287950 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287954 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287958 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287963 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287967 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287971 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:24:08.290842 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.287975 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288000 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288005 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288009 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288013 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288018 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288023 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288027 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288030 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288035 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288039 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288043 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288047 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288053 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288057 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288061 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288065 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288071 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288075 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:24:08.291515 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288080 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288090 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288095 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288101 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288106 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288110 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288114 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288118 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288122 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288127 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.288131 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288685 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288700 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288712 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288720 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288728 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288734 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288740 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288746 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288752 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288757 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 04:24:08.292014 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288762 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288768 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288773 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288777 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288782 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288787 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288792 2579 flags.go:64] FLAG: --cloud-config=""
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288796 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288801 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288808 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288813 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288818 2579 flags.go:64] FLAG: --config-dir=""
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288823 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288828 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288836 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288841 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288846 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288852 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288857 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288862 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288867 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288872 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288877 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288884 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288889 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 04:24:08.292637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288894 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288899 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288905 2579 flags.go:64] FLAG: --enable-server="true"
Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288909 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288917 2579 flags.go:64] FLAG: --event-burst="100"
Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288922 2579 flags.go:64] FLAG: --event-qps="50"
Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288927 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288932 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288937 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288943 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288947 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288953 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 04:24:08.293298 ip-10-0-134-45
kubenswrapper[2579]: I0421 04:24:08.288958 2579 flags.go:64] FLAG: --eviction-soft="" Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288963 2579 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288968 2579 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288973 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.288996 2579 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289002 2579 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289007 2579 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289012 2579 flags.go:64] FLAG: --feature-gates="" Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289018 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289023 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289028 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289033 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289038 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 21 04:24:08.293298 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289044 2579 flags.go:64] FLAG: --help="false" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289048 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-134-45.ec2.internal" Apr 
21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289053 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289058 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289063 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289069 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289074 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289079 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289084 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289088 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289094 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289099 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289104 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289109 2579 flags.go:64] FLAG: --kube-reserved="" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289114 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289119 2579 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289125 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289130 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289134 2579 flags.go:64] FLAG: --lock-file="" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289139 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289144 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289150 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289159 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 04:24:08.293912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289165 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289169 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289174 2579 flags.go:64] FLAG: --logging-format="text" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289179 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289185 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289189 2579 flags.go:64] FLAG: --manifest-url="" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289194 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: 
I0421 04:24:08.289201 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289206 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289213 2579 flags.go:64] FLAG: --max-pods="110" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289217 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289222 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289227 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289231 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289235 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289240 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289244 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289255 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289259 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289264 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289269 2579 flags.go:64] FLAG: --pod-cidr="" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289274 2579 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289282 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289286 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 04:24:08.294482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289291 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289297 2579 flags.go:64] FLAG: --port="10250" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289303 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289308 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-041b24238f487cbed" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289313 2579 flags.go:64] FLAG: --qos-reserved="" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289317 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289323 2579 flags.go:64] FLAG: --register-node="true" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289328 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289333 2579 flags.go:64] FLAG: --register-with-taints="" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289339 2579 flags.go:64] FLAG: --registry-burst="10" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289344 2579 flags.go:64] FLAG: --registry-qps="5" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289349 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 21 04:24:08.295136 
ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289353 2579 flags.go:64] FLAG: --reserved-memory="" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289360 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289365 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289370 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289375 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289380 2579 flags.go:64] FLAG: --runonce="false" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289385 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289390 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289395 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289399 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289404 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289408 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289413 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289419 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 04:24:08.295136 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289424 2579 flags.go:64] 
FLAG: --storage-driver-secure="false" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289429 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289435 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289440 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289445 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289457 2579 flags.go:64] FLAG: --system-cgroups="" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289462 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289471 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289476 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289481 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289488 2579 flags.go:64] FLAG: --tls-min-version="" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289493 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289498 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289502 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289508 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 04:24:08.295808 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:24:08.289512 2579 flags.go:64] FLAG: --v="2" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289520 2579 flags.go:64] FLAG: --version="false" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289526 2579 flags.go:64] FLAG: --vmodule="" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289533 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.289538 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289692 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289699 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289704 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289710 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 04:24:08.295808 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289714 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289719 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289723 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289727 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289732 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting 
Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289736 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289740 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289745 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289749 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289754 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289758 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289762 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289767 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289774 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289778 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289782 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289786 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289791 2579 feature_gate.go:328] unrecognized feature gate: 
HighlyAvailableArbiter Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289795 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289800 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 04:24:08.296449 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289804 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289808 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289812 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289816 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289821 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289825 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289829 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289833 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289837 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289844 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289849 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289857 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289861 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289865 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289870 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289877 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289883 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289888 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289893 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 04:24:08.296959 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289897 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289902 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289906 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289910 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289914 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289918 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289925 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289930 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289934 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289938 2579 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289942 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289946 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289950 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289955 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289958 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289962 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289967 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289971 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289975 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.289994 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 04:24:08.297463 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290000 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290004 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 04:24:08.297956 ip-10-0-134-45 
kubenswrapper[2579]: W0421 04:24:08.290008 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290012 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290018 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290022 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290025 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290030 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290034 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290038 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290042 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290046 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290051 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290055 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290058 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290062 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290067 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290071 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290076 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290082 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:24:08.297956 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290087 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290091 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.290095 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.290104 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.296562 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.296580 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296630 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296635 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296638 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296640 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296643 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296647 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296650 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296653 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296656 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296659 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:24:08.298481 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296662 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296665 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296668 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296670 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296675 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296679 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296681 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296684 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296687 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296690 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296693 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296696 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296698 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296701 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296704 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296706 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296709 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296712 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296715 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:24:08.298886 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296717 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296720 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296724 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296727 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296730 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296733 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296735 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296738 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296740 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296743 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296746 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296748 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296751 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296754 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296756 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296759 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296762 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296764 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296767 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296769 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:24:08.299369 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296772 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296775 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296777 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296780 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296782 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296785 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296788 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296790 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296793 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296795 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296797 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296800 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296803 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296805 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296808 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296812 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296816 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296820 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296822 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296825 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:24:08.299889 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296827 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296830 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296833 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296835 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296838 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296840 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296843 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296846 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296848 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296851 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296853 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296856 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296858 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296861 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296863 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296866 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:24:08.300460 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296869 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.296874 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296971 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296976 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296994 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.296999 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297003 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297008 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297011 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297014 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297017 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297019 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297026 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297029 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297032 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297035 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:24:08.300851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297037 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297040 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297043 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297045 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297048 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297050 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297053 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297056 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297058 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297061 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297063 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297066 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297068 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297071 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297073 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297076 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297079 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297081 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297084 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297086 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:24:08.301271 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297089 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297091 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297094 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297097 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297099 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297102 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297105 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297107 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297110 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297113 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297116 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297119 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297121 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297124 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297127 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297129 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297132 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297135 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297138 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297141 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:24:08.301759 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297143 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297146 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297148 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297151 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297154 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297156 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297159 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297161 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297165 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297168 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297171 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297174 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297176 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297179 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297182 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297184 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297187 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297190 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297193 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:24:08.302262 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297195 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:24:08.302717 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297198 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:24:08.302717 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297200 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:24:08.302717 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297203 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:24:08.302717 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297206 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:24:08.302717 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297209 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:24:08.302717 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297211 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:24:08.302717 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297214 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:24:08.302717 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297216 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:24:08.302717 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297218 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:24:08.302717 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297221 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:24:08.302717 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297223 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:24:08.302717 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:08.297226 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:24:08.302717 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.297231 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 04:24:08.302717 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.298084 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 04:24:08.302717 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.299939 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 04:24:08.303187 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.300802 2579 server.go:1019] "Starting client certificate rotation"
Apr 21 04:24:08.303187 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.300900 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 04:24:08.303187 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.300938 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 04:24:08.322659 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.322639 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 04:24:08.327725 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.327705 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 04:24:08.340961 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.340941 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 21 04:24:08.347737 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.347718 2579 log.go:25] "Validated CRI v1 image API"
Apr 21 04:24:08.348950 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.348925 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 04:24:08.349496 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.349479 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 04:24:08.352071 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.352047 2579 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 97015330-e8ba-4b22-bcda-d4ca97c76668:/dev/nvme0n1p3 c899189e-69f0-4028-bf19-68e3bc8bf6fc:/dev/nvme0n1p4]
Apr 21 04:24:08.352169 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.352069 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 04:24:08.357652 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.357535 2579 manager.go:217] Machine: {Timestamp:2026-04-21 04:24:08.356352029 +0000 UTC m=+0.351749645 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101825 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23f9deaf3a62ed6708f7e05bc7db50 SystemUUID:ec23f9de-af3a-62ed-6708-f7e05bc7db50 BootID:2c083747-2233-4ce5-85da-d15e93a864ca Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9e:54:52:32:21 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9e:54:52:32:21 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:32:64:f3:f2:c7:20 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 04:24:08.357733 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.357654 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 04:24:08.357766 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.357744 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 04:24:08.359620 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.359591 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 04:24:08.359811 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.359621 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-45.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 04:24:08.359911 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.359826 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 04:24:08.359911 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.359839 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 04:24:08.359911 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.359858 2579 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 04:24:08.360630 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.360616 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 04:24:08.362130 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.362118 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 21 04:24:08.362441 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.362429 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 04:24:08.364385 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.364373 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 21 04:24:08.364450 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.364392 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 04:24:08.364450 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.364409 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 04:24:08.364450 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.364423 2579 kubelet.go:397] "Adding apiserver pod source" Apr 21 04:24:08.364450 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.364436 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 04:24:08.365537 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.365524 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 04:24:08.365599 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.365547 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 04:24:08.368170 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.368155 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 04:24:08.369304 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:24:08.369291 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 04:24:08.369510 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.369493 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w8jbd" Apr 21 04:24:08.371144 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.371099 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 04:24:08.371144 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.371116 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 04:24:08.371144 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.371123 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 04:24:08.371144 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.371129 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 04:24:08.371144 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.371134 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 04:24:08.371144 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.371140 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 04:24:08.371144 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.371146 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 04:24:08.371463 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.371151 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 04:24:08.371463 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.371159 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 04:24:08.371463 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.371165 2579 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/configmap" Apr 21 04:24:08.371463 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.371174 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 04:24:08.371463 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.371185 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 04:24:08.371463 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.371220 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 04:24:08.371463 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.371227 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 04:24:08.374538 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.374522 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w8jbd" Apr 21 04:24:08.375386 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.375373 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 04:24:08.375458 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.375413 2579 server.go:1295] "Started kubelet" Apr 21 04:24:08.375602 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.375573 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 04:24:08.375855 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.375819 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 04:24:08.375905 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.375871 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 04:24:08.376102 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.376067 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-45.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 04:24:08.376160 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.376139 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-45.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 04:24:08.376252 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.376234 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 04:24:08.376298 ip-10-0-134-45 systemd[1]: Started Kubernetes Kubelet. Apr 21 04:24:08.378754 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.378622 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 04:24:08.380072 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.380052 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 21 04:24:08.385443 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.385425 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 04:24:08.385915 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.385881 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 04:24:08.386425 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.386404 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 04:24:08.386568 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.386555 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 04:24:08.386757 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.386720 2579 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"ip-10-0-134-45.ec2.internal\" not found" Apr 21 04:24:08.386757 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.386556 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 04:24:08.386900 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.386781 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 21 04:24:08.386900 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.386790 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 21 04:24:08.388382 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.388358 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:24:08.388460 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.388450 2579 factory.go:55] Registering systemd factory Apr 21 04:24:08.388523 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.388467 2579 factory.go:223] Registration of the systemd container factory successfully Apr 21 04:24:08.388748 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.388721 2579 factory.go:153] Registering CRI-O factory Apr 21 04:24:08.388748 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.388743 2579 factory.go:223] Registration of the crio container factory successfully Apr 21 04:24:08.388854 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.388794 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 04:24:08.388854 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.388821 2579 factory.go:103] Registering Raw factory Apr 21 04:24:08.388854 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.388838 2579 manager.go:1196] Started watching for new ooms in manager Apr 21 04:24:08.389019 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.388867 2579 kubelet.go:1618] "Image garbage 
collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 04:24:08.389327 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.389311 2579 manager.go:319] Starting recovery of all containers Apr 21 04:24:08.390370 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.390347 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-45.ec2.internal\" not found" node="ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.399418 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.399309 2579 manager.go:324] Recovery completed Apr 21 04:24:08.403408 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.403397 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:24:08.405712 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.405697 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:24:08.405796 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.405728 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:24:08.405796 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.405743 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:24:08.406232 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.406219 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 04:24:08.406277 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.406233 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 04:24:08.406277 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.406252 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 21 04:24:08.409582 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.409571 
2579 policy_none.go:49] "None policy: Start" Apr 21 04:24:08.409628 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.409588 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 04:24:08.409628 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.409602 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 21 04:24:08.447578 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.447549 2579 manager.go:341] "Starting Device Plugin manager" Apr 21 04:24:08.462329 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.447609 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 04:24:08.462329 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.447624 2579 server.go:85] "Starting device plugin registration server" Apr 21 04:24:08.462329 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.447896 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 04:24:08.462329 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.447912 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 04:24:08.462329 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.448016 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 04:24:08.462329 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.448108 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 04:24:08.462329 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.448116 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 04:24:08.462329 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.448592 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 21 04:24:08.462329 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.448627 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-45.ec2.internal\" not found" Apr 21 04:24:08.506106 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.506064 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 04:24:08.507478 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.507455 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 04:24:08.507478 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.507482 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 04:24:08.507644 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.507501 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 04:24:08.507644 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.507509 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 04:24:08.507644 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.507550 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 04:24:08.509512 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.509488 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:24:08.548414 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.548310 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:24:08.549448 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.549432 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:24:08.549533 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.549463 2579 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:24:08.549533 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.549474 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:24:08.549533 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.549504 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.556227 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.556213 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.556284 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.556237 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-45.ec2.internal\": node \"ip-10-0-134-45.ec2.internal\" not found" Apr 21 04:24:08.572026 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.572008 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-45.ec2.internal\" not found" Apr 21 04:24:08.608212 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.608189 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-45.ec2.internal"] Apr 21 04:24:08.608285 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.608265 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:24:08.609345 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.609331 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:24:08.609417 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.609360 2579 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-134-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:24:08.609417 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.609374 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:24:08.611550 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.611537 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:24:08.611695 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.611681 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.611752 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.611708 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:24:08.612316 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.612298 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:24:08.612391 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.612331 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:24:08.612391 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.612342 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:24:08.612391 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.612298 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:24:08.612485 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.612405 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:24:08.612485 
ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.612419 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:24:08.614381 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.614367 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.614432 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.614391 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:24:08.615022 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.615005 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:24:08.615106 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.615035 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:24:08.615106 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.615046 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:24:08.641696 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.641677 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-45.ec2.internal\" not found" node="ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.646133 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.646118 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-45.ec2.internal\" not found" node="ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.673013 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.672967 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-45.ec2.internal\" not found" Apr 
21 04:24:08.688725 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.688694 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d50a317536c7fa826446cd19a9f98c46-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal\" (UID: \"d50a317536c7fa826446cd19a9f98c46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.688725 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.688724 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d50a317536c7fa826446cd19a9f98c46-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal\" (UID: \"d50a317536c7fa826446cd19a9f98c46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.688925 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.688742 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec711311d3275e119b3dff245c5b47c4-config\") pod \"kube-apiserver-proxy-ip-10-0-134-45.ec2.internal\" (UID: \"ec711311d3275e119b3dff245c5b47c4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.773486 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.773444 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-45.ec2.internal\" not found" Apr 21 04:24:08.789905 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.789877 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d50a317536c7fa826446cd19a9f98c46-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal\" (UID: \"d50a317536c7fa826446cd19a9f98c46\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.790034 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.789911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d50a317536c7fa826446cd19a9f98c46-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal\" (UID: \"d50a317536c7fa826446cd19a9f98c46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.790034 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.789928 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec711311d3275e119b3dff245c5b47c4-config\") pod \"kube-apiserver-proxy-ip-10-0-134-45.ec2.internal\" (UID: \"ec711311d3275e119b3dff245c5b47c4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.790034 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.789970 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec711311d3275e119b3dff245c5b47c4-config\") pod \"kube-apiserver-proxy-ip-10-0-134-45.ec2.internal\" (UID: \"ec711311d3275e119b3dff245c5b47c4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.790034 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.790021 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d50a317536c7fa826446cd19a9f98c46-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal\" (UID: \"d50a317536c7fa826446cd19a9f98c46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.790162 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.790021 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d50a317536c7fa826446cd19a9f98c46-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal\" (UID: \"d50a317536c7fa826446cd19a9f98c46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.874396 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.874319 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-45.ec2.internal\" not found" Apr 21 04:24:08.943866 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.943845 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.948239 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:08.948215 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-45.ec2.internal" Apr 21 04:24:08.974814 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:08.974780 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-45.ec2.internal\" not found" Apr 21 04:24:09.075339 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:09.075284 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-45.ec2.internal\" not found" Apr 21 04:24:09.175919 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:09.175821 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-45.ec2.internal\" not found" Apr 21 04:24:09.186776 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.186756 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:24:09.240476 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.240448 2579 reflector.go:430] "Caches populated" type="*v1.Service" 
reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 04:24:09.286323 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.286297 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-45.ec2.internal"
Apr 21 04:24:09.293749 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.293731 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 04:24:09.294791 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.294776 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal"
Apr 21 04:24:09.300408 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.300387 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 04:24:09.300531 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.300516 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 04:24:09.300585 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.300549 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 04:24:09.300585 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.300565 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 04:24:09.300585 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.300549 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 04:24:09.300689 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:09.300585 2579 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://ad6492a8bbe2b4f8abb4fa1aa11deb95-4ab89290c172be9e.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.134.45:42922->54.83.173.85:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal"
Apr 21 04:24:09.364777 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.364743 2579 apiserver.go:52] "Watching apiserver"
Apr 21 04:24:09.376752 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.376702 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 04:19:08 +0000 UTC" deadline="2027-12-26 20:56:47.911509381 +0000 UTC"
Apr 21 04:24:09.376752 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.376747 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14752h32m38.534765522s"
Apr 21 04:24:09.382081 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.382058 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 04:24:09.383582 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.383561 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-26djf","openshift-multus/multus-additional-cni-plugins-4dbmj","openshift-multus/network-metrics-daemon-lkbgz","kube-system/konnectivity-agent-xktdv","openshift-image-registry/node-ca-cp8gk","openshift-network-diagnostics/network-check-target-5zmm9","openshift-network-operator/iptables-alerter-sl7gg","openshift-ovn-kubernetes/ovnkube-node-bqlz8","kube-system/kube-apiserver-proxy-ip-10-0-134-45.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs","openshift-cluster-node-tuning-operator/tuned-44k95","openshift-dns/node-resolver-h5xvg","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal"]
Apr 21 04:24:09.386095 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.386078 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 04:24:09.386278 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.386258 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sl7gg"
Apr 21 04:24:09.389081 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.389053 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 04:24:09.389217 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.389081 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 04:24:09.389217 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.389060 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 04:24:09.389296 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.389245 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jl4d4\""
Apr 21 04:24:09.389565 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.389549 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.391703 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.391684 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 04:24:09.391703 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.391702 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 04:24:09.391880 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.391739 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 04:24:09.391880 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.391691 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 04:24:09.392001 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.391880 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-27vpl\""
Apr 21 04:24:09.392001 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.391927 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 04:24:09.393569 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.393551 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/861cd341-7a89-4a04-83c3-db9adea07535-os-release\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.393641 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.393578 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/861cd341-7a89-4a04-83c3-db9adea07535-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.393641 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.393597 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/861cd341-7a89-4a04-83c3-db9adea07535-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.393641 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.393615 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/861cd341-7a89-4a04-83c3-db9adea07535-system-cni-dir\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.393762 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.393658 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/861cd341-7a89-4a04-83c3-db9adea07535-cni-binary-copy\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.393762 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.393705 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/861cd341-7a89-4a04-83c3-db9adea07535-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.393762 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.393724 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnlc4\" (UniqueName: \"kubernetes.io/projected/861cd341-7a89-4a04-83c3-db9adea07535-kube-api-access-dnlc4\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.393762 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.393745 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1bc20e1a-89b1-47ff-80e6-ae26cc68513c-iptables-alerter-script\") pod \"iptables-alerter-sl7gg\" (UID: \"1bc20e1a-89b1-47ff-80e6-ae26cc68513c\") " pod="openshift-network-operator/iptables-alerter-sl7gg"
Apr 21 04:24:09.393946 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.393781 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1bc20e1a-89b1-47ff-80e6-ae26cc68513c-host-slash\") pod \"iptables-alerter-sl7gg\" (UID: \"1bc20e1a-89b1-47ff-80e6-ae26cc68513c\") " pod="openshift-network-operator/iptables-alerter-sl7gg"
Apr 21 04:24:09.393946 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.393828 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdl28\" (UniqueName: \"kubernetes.io/projected/1bc20e1a-89b1-47ff-80e6-ae26cc68513c-kube-api-access-hdl28\") pod \"iptables-alerter-sl7gg\" (UID: \"1bc20e1a-89b1-47ff-80e6-ae26cc68513c\") " pod="openshift-network-operator/iptables-alerter-sl7gg"
Apr 21 04:24:09.393946 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.393854 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/861cd341-7a89-4a04-83c3-db9adea07535-cnibin\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.394069 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.394034 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:09.394069 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.394049 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xktdv"
Apr 21 04:24:09.394126 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:09.394094 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c"
Apr 21 04:24:09.395914 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.395894 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dqs72\""
Apr 21 04:24:09.396085 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.395963 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 04:24:09.396085 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.396005 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 04:24:09.396374 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.396360 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cp8gk"
Apr 21 04:24:09.397699 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.397683 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 04:24:09.398188 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.398172 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 04:24:09.398374 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.398361 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 04:24:09.398514 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.398500 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-x5h7q\""
Apr 21 04:24:09.398625 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.398608 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 04:24:09.398757 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.398743 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:09.398832 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:09.398811 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d"
Apr 21 04:24:09.401649 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.401630 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-26djf"
Apr 21 04:24:09.404097 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.403484 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 04:24:09.404097 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.403700 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vfg2r\""
Apr 21 04:24:09.404446 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.404431 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8"
Apr 21 04:24:09.406386 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.406370 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 04:24:09.406601 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.406589 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 04:24:09.406675 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.406590 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-h5tb7\""
Apr 21 04:24:09.406675 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.406664 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 04:24:09.406825 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.406677 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 04:24:09.406825 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.406711 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 04:24:09.406825 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.406762 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 04:24:09.407311 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.407296 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs"
Apr 21 04:24:09.409401 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.409380 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 04:24:09.409541 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.409526 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 04:24:09.409615 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.409557 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 04:24:09.409615 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.409609 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8jrn8\""
Apr 21 04:24:09.409730 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.409621 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-44k95"
Apr 21 04:24:09.411591 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.411578 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 04:24:09.412331 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.412102 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-nl5wz\""
Apr 21 04:24:09.412331 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.412220 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 04:24:09.412613 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.412591 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h5xvg"
Apr 21 04:24:09.415348 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.415329 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 04:24:09.415436 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.415418 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-7mcsw\""
Apr 21 04:24:09.415497 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.415471 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 04:24:09.419637 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.419619 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dq7qb"
Apr 21 04:24:09.429155 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.429137 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dq7qb"
Apr 21 04:24:09.488205 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.488183 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 04:24:09.494174 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494153 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-run-ovn\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8"
Apr 21 04:24:09.494283 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494184 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-var-lib-cni-bin\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf"
Apr 21 04:24:09.494283 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494211 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-multus-conf-dir\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf"
Apr 21 04:24:09.494283 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494241 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/861cd341-7a89-4a04-83c3-db9adea07535-cni-binary-copy\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.494410 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494293 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-run-ovn-kubernetes\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8"
Apr 21 04:24:09.494410 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494327 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxx8w\" (UniqueName: \"kubernetes.io/projected/b6a42934-9530-4770-ba35-f71911b5b3b3-kube-api-access-nxx8w\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8"
Apr 21 04:24:09.494410 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494356 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9kb7\" (UniqueName: \"kubernetes.io/projected/e59f55fa-7367-4248-84e9-9a065140d0bc-kube-api-access-b9kb7\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95"
Apr 21 04:24:09.494410 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494385 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/861cd341-7a89-4a04-83c3-db9adea07535-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.494589 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494485 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gx7z\" (UniqueName: \"kubernetes.io/projected/ebfdc6e1-781b-42a1-b442-ba40dfec626c-kube-api-access-6gx7z\") pod \"network-metrics-daemon-lkbgz\" (UID: \"ebfdc6e1-781b-42a1-b442-ba40dfec626c\") " pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:09.494589 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494522 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b6a42934-9530-4770-ba35-f71911b5b3b3-ovnkube-script-lib\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8"
Apr 21 04:24:09.494589 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494547 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-run-multus-certs\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf"
Apr 21 04:24:09.494589 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494575 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-etc-kubernetes\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf"
Apr 21 04:24:09.494741 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494616 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8641cd54-f81a-4c30-ba62-309b434777a4-agent-certs\") pod \"konnectivity-agent-xktdv\" (UID: \"8641cd54-f81a-4c30-ba62-309b434777a4\") " pod="kube-system/konnectivity-agent-xktdv"
Apr 21 04:24:09.494741 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494648 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8txf\" (UniqueName: \"kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf\") pod \"network-check-target-5zmm9\" (UID: \"82c4d960-14d9-4cfe-80d2-1d0751997a7d\") " pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:09.494741 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494677 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-run-netns\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf"
Apr 21 04:24:09.494741 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494705 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-sysctl-d\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95"
Apr 21 04:24:09.494741 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494729 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-sys\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95"
Apr 21 04:24:09.494966 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494753 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-var-lib-kubelet\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95"
Apr 21 04:24:09.494966 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494778 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-cnibin\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf"
Apr 21 04:24:09.494966 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494801 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-systemd-units\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8"
Apr 21 04:24:09.494966 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494821 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/861cd341-7a89-4a04-83c3-db9adea07535-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.494966 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494841 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-run-openvswitch\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8"
Apr 21 04:24:09.494966 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494843 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/861cd341-7a89-4a04-83c3-db9adea07535-cni-binary-copy\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.494966 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494882 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8"
Apr 21 04:24:09.494966 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494945 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-multus-socket-dir-parent\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf"
Apr 21 04:24:09.495300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.494971 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-var-lib-kubelet\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf"
Apr 21 04:24:09.495300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495011 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-hostroot\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf"
Apr 21 04:24:09.495300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495030 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-host\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95"
Apr 21 04:24:09.495300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495044 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e59f55fa-7367-4248-84e9-9a065140d0bc-tmp\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95"
Apr 21 04:24:09.495300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495063 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/861cd341-7a89-4a04-83c3-db9adea07535-cnibin\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.495300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495092 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b6a42934-9530-4770-ba35-f71911b5b3b3-ovn-node-metrics-cert\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8"
Apr 21 04:24:09.495300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495130 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-etc-selinux\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs"
Apr 21 04:24:09.495300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495105 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/861cd341-7a89-4a04-83c3-db9adea07535-cnibin\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.495300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495153 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-run\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95"
Apr 21 04:24:09.495300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495189 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/861cd341-7a89-4a04-83c3-db9adea07535-system-cni-dir\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.495300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495234 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/861cd341-7a89-4a04-83c3-db9adea07535-system-cni-dir\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj"
Apr 21 04:24:09.495300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495245 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1bc20e1a-89b1-47ff-80e6-ae26cc68513c-iptables-alerter-script\") pod \"iptables-alerter-sl7gg\" (UID: \"1bc20e1a-89b1-47ff-80e6-ae26cc68513c\") " pod="openshift-network-operator/iptables-alerter-sl7gg"
Apr 21 04:24:09.495300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495268 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-node-log\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8"
Apr 21 04:24:09.495300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495284 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-cni-netd\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8"
Apr 21 04:24:09.495300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495306 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b6a42934-9530-4770-ba35-f71911b5b3b3-env-overrides\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8"
Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495331 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-device-dir\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs"
Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495354 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-sysctl-conf\") pod \"tuned-44k95\" (UID:
\"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495378 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-multus-cni-dir\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495400 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495437 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee93154d-8192-4132-b610-52bffab2fc10-hosts-file\") pod \"node-resolver-h5xvg\" (UID: \"ee93154d-8192-4132-b610-52bffab2fc10\") " pod="openshift-dns/node-resolver-h5xvg" Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495465 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8641cd54-f81a-4c30-ba62-309b434777a4-konnectivity-ca\") pod \"konnectivity-agent-xktdv\" (UID: \"8641cd54-f81a-4c30-ba62-309b434777a4\") " pod="kube-system/konnectivity-agent-xktdv" Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495489 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-os-release\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495513 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-run-systemd\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495539 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-system-cni-dir\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495568 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-cni-binary-copy\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495588 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-var-lib-cni-multus\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495607 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-qqtpb\" (UniqueName: \"kubernetes.io/projected/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-kube-api-access-qqtpb\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495621 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fkwt\" (UniqueName: \"kubernetes.io/projected/ee93154d-8192-4132-b610-52bffab2fc10-kube-api-access-8fkwt\") pod \"node-resolver-h5xvg\" (UID: \"ee93154d-8192-4132-b610-52bffab2fc10\") " pod="openshift-dns/node-resolver-h5xvg" Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495639 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnlc4\" (UniqueName: \"kubernetes.io/projected/861cd341-7a89-4a04-83c3-db9adea07535-kube-api-access-dnlc4\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj" Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495664 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5fd5747-67eb-4782-b7d2-6b81f0d51528-serviceca\") pod \"node-ca-cp8gk\" (UID: \"f5fd5747-67eb-4782-b7d2-6b81f0d51528\") " pod="openshift-image-registry/node-ca-cp8gk" Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495680 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-log-socket\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.495826 ip-10-0-134-45 kubenswrapper[2579]: I0421 
04:24:09.495705 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1bc20e1a-89b1-47ff-80e6-ae26cc68513c-iptables-alerter-script\") pod \"iptables-alerter-sl7gg\" (UID: \"1bc20e1a-89b1-47ff-80e6-ae26cc68513c\") " pod="openshift-network-operator/iptables-alerter-sl7gg" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495769 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b6a42934-9530-4770-ba35-f71911b5b3b3-ovnkube-config\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495788 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-multus-daemon-config\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495813 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-sys-fs\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495832 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-tuned\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " 
pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495849 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee93154d-8192-4132-b610-52bffab2fc10-tmp-dir\") pod \"node-resolver-h5xvg\" (UID: \"ee93154d-8192-4132-b610-52bffab2fc10\") " pod="openshift-dns/node-resolver-h5xvg" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495882 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/861cd341-7a89-4a04-83c3-db9adea07535-os-release\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495907 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5fd5747-67eb-4782-b7d2-6b81f0d51528-host\") pod \"node-ca-cp8gk\" (UID: \"f5fd5747-67eb-4782-b7d2-6b81f0d51528\") " pod="openshift-image-registry/node-ca-cp8gk" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495946 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-run-netns\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495958 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/861cd341-7a89-4a04-83c3-db9adea07535-os-release\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: 
\"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.495968 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-cni-bin\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496008 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-run-k8s-cni-cncf-io\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496034 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-kubernetes\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496074 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1bc20e1a-89b1-47ff-80e6-ae26cc68513c-host-slash\") pod \"iptables-alerter-sl7gg\" (UID: \"1bc20e1a-89b1-47ff-80e6-ae26cc68513c\") " pod="openshift-network-operator/iptables-alerter-sl7gg" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496094 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-kubelet\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496097 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1bc20e1a-89b1-47ff-80e6-ae26cc68513c-host-slash\") pod \"iptables-alerter-sl7gg\" (UID: \"1bc20e1a-89b1-47ff-80e6-ae26cc68513c\") " pod="openshift-network-operator/iptables-alerter-sl7gg" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496113 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-var-lib-openvswitch\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.496452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496130 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks98r\" (UniqueName: \"kubernetes.io/projected/ab782842-c406-437d-a62e-012b36593d16-kube-api-access-ks98r\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.496959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496146 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/861cd341-7a89-4a04-83c3-db9adea07535-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj" Apr 21 04:24:09.496959 ip-10-0-134-45 kubenswrapper[2579]: I0421 
04:24:09.496177 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-registration-dir\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.496959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496194 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-modprobe-d\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.496959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496210 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-systemd\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.496959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496252 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-lib-modules\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.496959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496285 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/861cd341-7a89-4a04-83c3-db9adea07535-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: 
\"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj" Apr 21 04:24:09.496959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496312 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdl28\" (UniqueName: \"kubernetes.io/projected/1bc20e1a-89b1-47ff-80e6-ae26cc68513c-kube-api-access-hdl28\") pod \"iptables-alerter-sl7gg\" (UID: \"1bc20e1a-89b1-47ff-80e6-ae26cc68513c\") " pod="openshift-network-operator/iptables-alerter-sl7gg" Apr 21 04:24:09.496959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496338 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs\") pod \"network-metrics-daemon-lkbgz\" (UID: \"ebfdc6e1-781b-42a1-b442-ba40dfec626c\") " pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:24:09.496959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496365 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccxfx\" (UniqueName: \"kubernetes.io/projected/f5fd5747-67eb-4782-b7d2-6b81f0d51528-kube-api-access-ccxfx\") pod \"node-ca-cp8gk\" (UID: \"f5fd5747-67eb-4782-b7d2-6b81f0d51528\") " pod="openshift-image-registry/node-ca-cp8gk" Apr 21 04:24:09.496959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496391 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-slash\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.496959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496418 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/861cd341-7a89-4a04-83c3-db9adea07535-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj" Apr 21 04:24:09.496959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496483 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/861cd341-7a89-4a04-83c3-db9adea07535-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj" Apr 21 04:24:09.496959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496493 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-etc-openvswitch\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.496959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496526 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-socket-dir\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.496959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.496546 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-sysconfig\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.501188 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:24:09.501167 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 04:24:09.504089 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.504069 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdl28\" (UniqueName: \"kubernetes.io/projected/1bc20e1a-89b1-47ff-80e6-ae26cc68513c-kube-api-access-hdl28\") pod \"iptables-alerter-sl7gg\" (UID: \"1bc20e1a-89b1-47ff-80e6-ae26cc68513c\") " pod="openshift-network-operator/iptables-alerter-sl7gg" Apr 21 04:24:09.504190 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.504172 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnlc4\" (UniqueName: \"kubernetes.io/projected/861cd341-7a89-4a04-83c3-db9adea07535-kube-api-access-dnlc4\") pod \"multus-additional-cni-plugins-4dbmj\" (UID: \"861cd341-7a89-4a04-83c3-db9adea07535\") " pod="openshift-multus/multus-additional-cni-plugins-4dbmj" Apr 21 04:24:09.541220 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:09.541196 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd50a317536c7fa826446cd19a9f98c46.slice/crio-bd1f2177cc7683a57dcf8f031e0e0ed7b27e3fbfdeda12da203cdf7b5e0c9db6 WatchSource:0}: Error finding container bd1f2177cc7683a57dcf8f031e0e0ed7b27e3fbfdeda12da203cdf7b5e0c9db6: Status 404 returned error can't find the container with id bd1f2177cc7683a57dcf8f031e0e0ed7b27e3fbfdeda12da203cdf7b5e0c9db6 Apr 21 04:24:09.541638 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:09.541625 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec711311d3275e119b3dff245c5b47c4.slice/crio-b41b04639c791e47d31177b95abc06c7cceeff8a4e1edca847beaa697851034d WatchSource:0}: Error 
finding container b41b04639c791e47d31177b95abc06c7cceeff8a4e1edca847beaa697851034d: Status 404 returned error can't find the container with id b41b04639c791e47d31177b95abc06c7cceeff8a4e1edca847beaa697851034d Apr 21 04:24:09.547896 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.546267 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:24:09.596926 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.596892 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-systemd-units\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.596926 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.596926 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-run-openvswitch\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.597143 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.596943 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.597143 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.596966 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-multus-socket-dir-parent\") pod \"multus-26djf\" (UID: 
\"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.597143 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597003 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-systemd-units\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.597143 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597023 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-run-openvswitch\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.597143 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597029 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-var-lib-kubelet\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.597143 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597036 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-multus-socket-dir-parent\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.597143 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597066 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-hostroot\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " 
pod="openshift-multus/multus-26djf" Apr 21 04:24:09.597143 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597073 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-var-lib-kubelet\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.597143 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597066 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.597143 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597090 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-host\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.597143 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597098 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-hostroot\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.597143 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597113 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e59f55fa-7367-4248-84e9-9a065140d0bc-tmp\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 
04:24:09.597143 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597131 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-host\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.597143 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597140 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b6a42934-9530-4770-ba35-f71911b5b3b3-ovn-node-metrics-cert\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597164 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-etc-selinux\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597188 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-run\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597214 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-node-log\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.597782 
ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597238 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-cni-netd\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597260 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-run\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597262 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b6a42934-9530-4770-ba35-f71911b5b3b3-env-overrides\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597296 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-node-log\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597315 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-cni-netd\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 
04:24:09.597300 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-device-dir\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597339 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-device-dir\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597301 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-etc-selinux\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597368 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-sysctl-conf\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597391 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-multus-cni-dir\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.597782 
ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597416 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597437 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee93154d-8192-4132-b610-52bffab2fc10-hosts-file\") pod \"node-resolver-h5xvg\" (UID: \"ee93154d-8192-4132-b610-52bffab2fc10\") " pod="openshift-dns/node-resolver-h5xvg" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597458 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8641cd54-f81a-4c30-ba62-309b434777a4-konnectivity-ca\") pod \"konnectivity-agent-xktdv\" (UID: \"8641cd54-f81a-4c30-ba62-309b434777a4\") " pod="kube-system/konnectivity-agent-xktdv" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597477 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-os-release\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.597782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597484 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-multus-cni-dir\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: 
I0421 04:24:09.597501 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-run-systemd\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597504 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597525 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-system-cni-dir\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597551 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-cni-binary-copy\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597574 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-var-lib-cni-multus\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597580 
2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee93154d-8192-4132-b610-52bffab2fc10-hosts-file\") pod \"node-resolver-h5xvg\" (UID: \"ee93154d-8192-4132-b610-52bffab2fc10\") " pod="openshift-dns/node-resolver-h5xvg" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597575 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-os-release\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597523 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-sysctl-conf\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597609 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqtpb\" (UniqueName: \"kubernetes.io/projected/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-kube-api-access-qqtpb\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597638 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fkwt\" (UniqueName: \"kubernetes.io/projected/ee93154d-8192-4132-b610-52bffab2fc10-kube-api-access-8fkwt\") pod \"node-resolver-h5xvg\" (UID: \"ee93154d-8192-4132-b610-52bffab2fc10\") " pod="openshift-dns/node-resolver-h5xvg" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597642 2579 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-system-cni-dir\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597675 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5fd5747-67eb-4782-b7d2-6b81f0d51528-serviceca\") pod \"node-ca-cp8gk\" (UID: \"f5fd5747-67eb-4782-b7d2-6b81f0d51528\") " pod="openshift-image-registry/node-ca-cp8gk" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-run-systemd\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597693 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b6a42934-9530-4770-ba35-f71911b5b3b3-env-overrides\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597720 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-log-socket\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597744 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/b6a42934-9530-4770-ba35-f71911b5b3b3-ovnkube-config\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597768 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-multus-daemon-config\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.598598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597806 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-sys-fs\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597829 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-tuned\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597874 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee93154d-8192-4132-b610-52bffab2fc10-tmp-dir\") pod \"node-resolver-h5xvg\" (UID: \"ee93154d-8192-4132-b610-52bffab2fc10\") " pod="openshift-dns/node-resolver-h5xvg" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597910 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/f5fd5747-67eb-4782-b7d2-6b81f0d51528-host\") pod \"node-ca-cp8gk\" (UID: \"f5fd5747-67eb-4782-b7d2-6b81f0d51528\") " pod="openshift-image-registry/node-ca-cp8gk" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597921 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-var-lib-cni-multus\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597932 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-log-socket\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.597934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-run-netns\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598006 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-cni-bin\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598033 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-run-k8s-cni-cncf-io\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598064 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8641cd54-f81a-4c30-ba62-309b434777a4-konnectivity-ca\") pod \"konnectivity-agent-xktdv\" (UID: \"8641cd54-f81a-4c30-ba62-309b434777a4\") " pod="kube-system/konnectivity-agent-xktdv" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598059 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-kubernetes\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598111 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-kubelet\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-run-netns\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598139 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-var-lib-openvswitch\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598162 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5fd5747-67eb-4782-b7d2-6b81f0d51528-host\") pod \"node-ca-cp8gk\" (UID: \"f5fd5747-67eb-4782-b7d2-6b81f0d51528\") " pod="openshift-image-registry/node-ca-cp8gk" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5fd5747-67eb-4782-b7d2-6b81f0d51528-serviceca\") pod \"node-ca-cp8gk\" (UID: \"f5fd5747-67eb-4782-b7d2-6b81f0d51528\") " pod="openshift-image-registry/node-ca-cp8gk" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598186 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ks98r\" (UniqueName: \"kubernetes.io/projected/ab782842-c406-437d-a62e-012b36593d16-kube-api-access-ks98r\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598213 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-registration-dir\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.599469 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598234 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" 
(UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-sys-fs\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598278 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-kubernetes\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598285 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-modprobe-d\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598314 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-cni-bin\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598329 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-systemd\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598353 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-lib-modules\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598373 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee93154d-8192-4132-b610-52bffab2fc10-tmp-dir\") pod \"node-resolver-h5xvg\" (UID: \"ee93154d-8192-4132-b610-52bffab2fc10\") " pod="openshift-dns/node-resolver-h5xvg" Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598392 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-registration-dir\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598393 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs\") pod \"network-metrics-daemon-lkbgz\" (UID: \"ebfdc6e1-781b-42a1-b442-ba40dfec626c\") " pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598439 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccxfx\" (UniqueName: \"kubernetes.io/projected/f5fd5747-67eb-4782-b7d2-6b81f0d51528-kube-api-access-ccxfx\") pod \"node-ca-cp8gk\" (UID: \"f5fd5747-67eb-4782-b7d2-6b81f0d51528\") " pod="openshift-image-registry/node-ca-cp8gk" Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598466 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-slash\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598490 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-etc-openvswitch\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:09.598495 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598516 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-socket-dir\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598542 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-sysconfig\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598560 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-cni-binary-copy\") pod \"multus-26djf\" (UID: 
\"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:09.598582 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs podName:ebfdc6e1-781b-42a1-b442-ba40dfec626c nodeName:}" failed. No retries permitted until 2026-04-21 04:24:10.098551352 +0000 UTC m=+2.093948974 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs") pod "network-metrics-daemon-lkbgz" (UID: "ebfdc6e1-781b-42a1-b442-ba40dfec626c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:24:09.600394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598583 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-sysconfig\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.601220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598352 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-run-k8s-cni-cncf-io\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.601220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598617 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-etc-openvswitch\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.601220 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:24:09.598624 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-var-lib-openvswitch\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.601220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-run-ovn\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.601220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598680 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-var-lib-cni-bin\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.601220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598708 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-multus-conf-dir\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.601220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598735 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-run-ovn-kubernetes\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.601220 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:24:09.598741 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-lib-modules\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.601220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598761 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxx8w\" (UniqueName: \"kubernetes.io/projected/b6a42934-9530-4770-ba35-f71911b5b3b3-kube-api-access-nxx8w\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.601220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598787 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9kb7\" (UniqueName: \"kubernetes.io/projected/e59f55fa-7367-4248-84e9-9a065140d0bc-kube-api-access-b9kb7\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.601220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598813 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-modprobe-d\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.601220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598826 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ab782842-c406-437d-a62e-012b36593d16-socket-dir\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 
04:24:09.601220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598815 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gx7z\" (UniqueName: \"kubernetes.io/projected/ebfdc6e1-781b-42a1-b442-ba40dfec626c-kube-api-access-6gx7z\") pod \"network-metrics-daemon-lkbgz\" (UID: \"ebfdc6e1-781b-42a1-b442-ba40dfec626c\") " pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:24:09.601220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598862 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-slash\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.601220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598895 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-run-ovn\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.601220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598900 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b6a42934-9530-4770-ba35-f71911b5b3b3-ovnkube-config\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.601220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b6a42934-9530-4770-ba35-f71911b5b3b3-ovnkube-script-lib\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 
04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598942 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-var-lib-cni-bin\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.598970 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-systemd\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599037 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-multus-conf-dir\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599086 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-run-ovn-kubernetes\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599086 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b6a42934-9530-4770-ba35-f71911b5b3b3-host-kubelet\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.601716 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:24:09.599115 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-run-multus-certs\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599146 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-etc-kubernetes\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599173 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8641cd54-f81a-4c30-ba62-309b434777a4-agent-certs\") pod \"konnectivity-agent-xktdv\" (UID: \"8641cd54-f81a-4c30-ba62-309b434777a4\") " pod="kube-system/konnectivity-agent-xktdv" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599196 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-run-multus-certs\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599212 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-etc-kubernetes\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599243 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8txf\" (UniqueName: \"kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf\") pod \"network-check-target-5zmm9\" (UID: \"82c4d960-14d9-4cfe-80d2-1d0751997a7d\") " pod="openshift-network-diagnostics/network-check-target-5zmm9" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599271 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-run-netns\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599296 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-sysctl-d\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599320 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-sys\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599345 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-var-lib-kubelet\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599370 
2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-cnibin\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599436 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b6a42934-9530-4770-ba35-f71911b5b3b3-ovnkube-script-lib\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.601716 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599440 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-cnibin\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.602241 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599466 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-host-run-netns\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.602241 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599555 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-sysctl-d\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.602241 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599573 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-sys\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.602241 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599597 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e59f55fa-7367-4248-84e9-9a065140d0bc-var-lib-kubelet\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.602241 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-multus-daemon-config\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.602241 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.599838 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e59f55fa-7367-4248-84e9-9a065140d0bc-tmp\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.602241 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.600035 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b6a42934-9530-4770-ba35-f71911b5b3b3-ovn-node-metrics-cert\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.602241 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.600703 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/e59f55fa-7367-4248-84e9-9a065140d0bc-etc-tuned\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.602241 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.601574 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8641cd54-f81a-4c30-ba62-309b434777a4-agent-certs\") pod \"konnectivity-agent-xktdv\" (UID: \"8641cd54-f81a-4c30-ba62-309b434777a4\") " pod="kube-system/konnectivity-agent-xktdv" Apr 21 04:24:09.605012 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.604975 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fkwt\" (UniqueName: \"kubernetes.io/projected/ee93154d-8192-4132-b610-52bffab2fc10-kube-api-access-8fkwt\") pod \"node-resolver-h5xvg\" (UID: \"ee93154d-8192-4132-b610-52bffab2fc10\") " pod="openshift-dns/node-resolver-h5xvg" Apr 21 04:24:09.605677 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:09.605656 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:24:09.605772 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:09.605683 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:24:09.605772 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:09.605698 2579 projected.go:194] Error preparing data for projected volume kube-api-access-z8txf for pod openshift-network-diagnostics/network-check-target-5zmm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:24:09.605772 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:09.605763 2579 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf podName:82c4d960-14d9-4cfe-80d2-1d0751997a7d nodeName:}" failed. No retries permitted until 2026-04-21 04:24:10.105743259 +0000 UTC m=+2.101140865 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-z8txf" (UniqueName: "kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf") pod "network-check-target-5zmm9" (UID: "82c4d960-14d9-4cfe-80d2-1d0751997a7d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:24:09.607279 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.607253 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9kb7\" (UniqueName: \"kubernetes.io/projected/e59f55fa-7367-4248-84e9-9a065140d0bc-kube-api-access-b9kb7\") pod \"tuned-44k95\" (UID: \"e59f55fa-7367-4248-84e9-9a065140d0bc\") " pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.607692 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.607674 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqtpb\" (UniqueName: \"kubernetes.io/projected/9f8dd91d-0597-4a80-87f6-46b919d9a0ef-kube-api-access-qqtpb\") pod \"multus-26djf\" (UID: \"9f8dd91d-0597-4a80-87f6-46b919d9a0ef\") " pod="openshift-multus/multus-26djf" Apr 21 04:24:09.607848 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.607833 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks98r\" (UniqueName: \"kubernetes.io/projected/ab782842-c406-437d-a62e-012b36593d16-kube-api-access-ks98r\") pod \"aws-ebs-csi-driver-node-ltpgs\" (UID: \"ab782842-c406-437d-a62e-012b36593d16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.608281 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:24:09.608255 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gx7z\" (UniqueName: \"kubernetes.io/projected/ebfdc6e1-781b-42a1-b442-ba40dfec626c-kube-api-access-6gx7z\") pod \"network-metrics-daemon-lkbgz\" (UID: \"ebfdc6e1-781b-42a1-b442-ba40dfec626c\") " pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:24:09.608281 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.608274 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccxfx\" (UniqueName: \"kubernetes.io/projected/f5fd5747-67eb-4782-b7d2-6b81f0d51528-kube-api-access-ccxfx\") pod \"node-ca-cp8gk\" (UID: \"f5fd5747-67eb-4782-b7d2-6b81f0d51528\") " pod="openshift-image-registry/node-ca-cp8gk" Apr 21 04:24:09.608394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.608330 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxx8w\" (UniqueName: \"kubernetes.io/projected/b6a42934-9530-4770-ba35-f71911b5b3b3-kube-api-access-nxx8w\") pod \"ovnkube-node-bqlz8\" (UID: \"b6a42934-9530-4770-ba35-f71911b5b3b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.719418 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.719332 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-sl7gg" Apr 21 04:24:09.725302 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:09.725276 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bc20e1a_89b1_47ff_80e6_ae26cc68513c.slice/crio-115fb6d6b62b071ee61f0d08163f9fbd714f5f8901f95fa485055fd7c463b695 WatchSource:0}: Error finding container 115fb6d6b62b071ee61f0d08163f9fbd714f5f8901f95fa485055fd7c463b695: Status 404 returned error can't find the container with id 115fb6d6b62b071ee61f0d08163f9fbd714f5f8901f95fa485055fd7c463b695 Apr 21 04:24:09.729232 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.729212 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4dbmj" Apr 21 04:24:09.735352 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:09.735326 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod861cd341_7a89_4a04_83c3_db9adea07535.slice/crio-085baed922df2fd1223cb10b5b9c6921538b5eff90ce9463aa89f8b35e02c1d8 WatchSource:0}: Error finding container 085baed922df2fd1223cb10b5b9c6921538b5eff90ce9463aa89f8b35e02c1d8: Status 404 returned error can't find the container with id 085baed922df2fd1223cb10b5b9c6921538b5eff90ce9463aa89f8b35e02c1d8 Apr 21 04:24:09.741107 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.741089 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-xktdv" Apr 21 04:24:09.746871 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:09.746838 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8641cd54_f81a_4c30_ba62_309b434777a4.slice/crio-3e3228e1cb46edc033c995de3ec1e4ab922ea7ddb15c1daa64356e87564b7bab WatchSource:0}: Error finding container 3e3228e1cb46edc033c995de3ec1e4ab922ea7ddb15c1daa64356e87564b7bab: Status 404 returned error can't find the container with id 3e3228e1cb46edc033c995de3ec1e4ab922ea7ddb15c1daa64356e87564b7bab Apr 21 04:24:09.752682 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.752665 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cp8gk" Apr 21 04:24:09.758371 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:09.758348 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5fd5747_67eb_4782_b7d2_6b81f0d51528.slice/crio-dafe9bfe7880222d591add45b6f9a868af1be8db599398506094e3267d23bf7d WatchSource:0}: Error finding container dafe9bfe7880222d591add45b6f9a868af1be8db599398506094e3267d23bf7d: Status 404 returned error can't find the container with id dafe9bfe7880222d591add45b6f9a868af1be8db599398506094e3267d23bf7d Apr 21 04:24:09.771282 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.771259 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-26djf" Apr 21 04:24:09.777033 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.777010 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:09.777654 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:09.777541 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f8dd91d_0597_4a80_87f6_46b919d9a0ef.slice/crio-c5f7ae0e8c43d100b85bf95681827f5213377d705c38221c66ca4873d346d2f5 WatchSource:0}: Error finding container c5f7ae0e8c43d100b85bf95681827f5213377d705c38221c66ca4873d346d2f5: Status 404 returned error can't find the container with id c5f7ae0e8c43d100b85bf95681827f5213377d705c38221c66ca4873d346d2f5 Apr 21 04:24:09.784107 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:09.784084 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6a42934_9530_4770_ba35_f71911b5b3b3.slice/crio-a04de400e196481a692e08796bba1430e598e2748960d555fdabeb6cbf5c47a3 WatchSource:0}: Error finding container a04de400e196481a692e08796bba1430e598e2748960d555fdabeb6cbf5c47a3: Status 404 returned error can't find the container with id a04de400e196481a692e08796bba1430e598e2748960d555fdabeb6cbf5c47a3 Apr 21 04:24:09.811856 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.811827 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" Apr 21 04:24:09.817712 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:09.817684 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab782842_c406_437d_a62e_012b36593d16.slice/crio-de63e41c26d0b5e4f93e38221bf02aeed2cc95331b2be6814be35346c59cad69 WatchSource:0}: Error finding container de63e41c26d0b5e4f93e38221bf02aeed2cc95331b2be6814be35346c59cad69: Status 404 returned error can't find the container with id de63e41c26d0b5e4f93e38221bf02aeed2cc95331b2be6814be35346c59cad69 Apr 21 04:24:09.823655 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.823636 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-44k95" Apr 21 04:24:09.827636 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:09.827614 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h5xvg" Apr 21 04:24:09.829479 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:09.829452 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode59f55fa_7367_4248_84e9_9a065140d0bc.slice/crio-aeff69e5d4d73dc06f0d0afc266c82bc9f33e4366315da4a6827a2afaebe73ac WatchSource:0}: Error finding container aeff69e5d4d73dc06f0d0afc266c82bc9f33e4366315da4a6827a2afaebe73ac: Status 404 returned error can't find the container with id aeff69e5d4d73dc06f0d0afc266c82bc9f33e4366315da4a6827a2afaebe73ac Apr 21 04:24:09.833868 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:24:09.833846 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee93154d_8192_4132_b610_52bffab2fc10.slice/crio-14f9cc731a816bd4d40d42dbc2fdf2c147b90b3cd2492528bfd5ea976642281f WatchSource:0}: Error finding container 
14f9cc731a816bd4d40d42dbc2fdf2c147b90b3cd2492528bfd5ea976642281f: Status 404 returned error can't find the container with id 14f9cc731a816bd4d40d42dbc2fdf2c147b90b3cd2492528bfd5ea976642281f Apr 21 04:24:10.104589 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.103872 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs\") pod \"network-metrics-daemon-lkbgz\" (UID: \"ebfdc6e1-781b-42a1-b442-ba40dfec626c\") " pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:24:10.104589 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:10.104112 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:24:10.104589 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:10.104180 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs podName:ebfdc6e1-781b-42a1-b442-ba40dfec626c nodeName:}" failed. No retries permitted until 2026-04-21 04:24:11.104160391 +0000 UTC m=+3.099557995 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs") pod "network-metrics-daemon-lkbgz" (UID: "ebfdc6e1-781b-42a1-b442-ba40dfec626c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:24:10.204879 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.204842 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8txf\" (UniqueName: \"kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf\") pod \"network-check-target-5zmm9\" (UID: \"82c4d960-14d9-4cfe-80d2-1d0751997a7d\") " pod="openshift-network-diagnostics/network-check-target-5zmm9" Apr 21 04:24:10.205073 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:10.205017 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:24:10.205073 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:10.205046 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:24:10.205073 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:10.205075 2579 projected.go:194] Error preparing data for projected volume kube-api-access-z8txf for pod openshift-network-diagnostics/network-check-target-5zmm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:24:10.205255 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:10.205131 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf podName:82c4d960-14d9-4cfe-80d2-1d0751997a7d nodeName:}" failed. 
No retries permitted until 2026-04-21 04:24:11.205112179 +0000 UTC m=+3.200509789 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-z8txf" (UniqueName: "kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf") pod "network-check-target-5zmm9" (UID: "82c4d960-14d9-4cfe-80d2-1d0751997a7d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:24:10.429940 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.429799 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 04:19:09 +0000 UTC" deadline="2027-10-17 15:52:43.177134186 +0000 UTC" Apr 21 04:24:10.429940 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.429843 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13067h28m32.747295282s" Apr 21 04:24:10.545383 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.545270 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" event={"ID":"b6a42934-9530-4770-ba35-f71911b5b3b3","Type":"ContainerStarted","Data":"a04de400e196481a692e08796bba1430e598e2748960d555fdabeb6cbf5c47a3"} Apr 21 04:24:10.562283 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.562210 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-26djf" event={"ID":"9f8dd91d-0597-4a80-87f6-46b919d9a0ef","Type":"ContainerStarted","Data":"c5f7ae0e8c43d100b85bf95681827f5213377d705c38221c66ca4873d346d2f5"} Apr 21 04:24:10.568953 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.568736 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:24:10.570921 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.570887 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-image-registry/node-ca-cp8gk" event={"ID":"f5fd5747-67eb-4782-b7d2-6b81f0d51528","Type":"ContainerStarted","Data":"dafe9bfe7880222d591add45b6f9a868af1be8db599398506094e3267d23bf7d"} Apr 21 04:24:10.589036 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.588995 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sl7gg" event={"ID":"1bc20e1a-89b1-47ff-80e6-ae26cc68513c","Type":"ContainerStarted","Data":"115fb6d6b62b071ee61f0d08163f9fbd714f5f8901f95fa485055fd7c463b695"} Apr 21 04:24:10.616585 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.616255 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h5xvg" event={"ID":"ee93154d-8192-4132-b610-52bffab2fc10","Type":"ContainerStarted","Data":"14f9cc731a816bd4d40d42dbc2fdf2c147b90b3cd2492528bfd5ea976642281f"} Apr 21 04:24:10.643936 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.643898 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xktdv" event={"ID":"8641cd54-f81a-4c30-ba62-309b434777a4","Type":"ContainerStarted","Data":"3e3228e1cb46edc033c995de3ec1e4ab922ea7ddb15c1daa64356e87564b7bab"} Apr 21 04:24:10.652850 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.652793 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dbmj" event={"ID":"861cd341-7a89-4a04-83c3-db9adea07535","Type":"ContainerStarted","Data":"085baed922df2fd1223cb10b5b9c6921538b5eff90ce9463aa89f8b35e02c1d8"} Apr 21 04:24:10.655898 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.655822 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal" event={"ID":"d50a317536c7fa826446cd19a9f98c46","Type":"ContainerStarted","Data":"bd1f2177cc7683a57dcf8f031e0e0ed7b27e3fbfdeda12da203cdf7b5e0c9db6"} Apr 21 04:24:10.680194 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:24:10.680104 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-45.ec2.internal" event={"ID":"ec711311d3275e119b3dff245c5b47c4","Type":"ContainerStarted","Data":"b41b04639c791e47d31177b95abc06c7cceeff8a4e1edca847beaa697851034d"}
Apr 21 04:24:10.689448 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.689403 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-44k95" event={"ID":"e59f55fa-7367-4248-84e9-9a065140d0bc","Type":"ContainerStarted","Data":"aeff69e5d4d73dc06f0d0afc266c82bc9f33e4366315da4a6827a2afaebe73ac"}
Apr 21 04:24:10.699751 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.699713 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" event={"ID":"ab782842-c406-437d-a62e-012b36593d16","Type":"ContainerStarted","Data":"de63e41c26d0b5e4f93e38221bf02aeed2cc95331b2be6814be35346c59cad69"}
Apr 21 04:24:10.873735 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:10.873701 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 04:24:11.112595 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:11.112274 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs\") pod \"network-metrics-daemon-lkbgz\" (UID: \"ebfdc6e1-781b-42a1-b442-ba40dfec626c\") " pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:11.112595 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:11.112448 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:24:11.112595 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:11.112512 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs podName:ebfdc6e1-781b-42a1-b442-ba40dfec626c nodeName:}" failed. No retries permitted until 2026-04-21 04:24:13.112492833 +0000 UTC m=+5.107890440 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs") pod "network-metrics-daemon-lkbgz" (UID: "ebfdc6e1-781b-42a1-b442-ba40dfec626c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:24:11.213610 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:11.213575 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8txf\" (UniqueName: \"kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf\") pod \"network-check-target-5zmm9\" (UID: \"82c4d960-14d9-4cfe-80d2-1d0751997a7d\") " pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:11.213795 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:11.213755 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:24:11.213795 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:11.213786 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:24:11.213898 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:11.213800 2579 projected.go:194] Error preparing data for projected volume kube-api-access-z8txf for pod openshift-network-diagnostics/network-check-target-5zmm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:24:11.213898 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:11.213861 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf podName:82c4d960-14d9-4cfe-80d2-1d0751997a7d nodeName:}" failed. No retries permitted until 2026-04-21 04:24:13.213840834 +0000 UTC m=+5.209238439 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-z8txf" (UniqueName: "kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf") pod "network-check-target-5zmm9" (UID: "82c4d960-14d9-4cfe-80d2-1d0751997a7d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:24:11.430582 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:11.430486 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 04:19:09 +0000 UTC" deadline="2027-10-17 01:48:42.536953701 +0000 UTC"
Apr 21 04:24:11.430582 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:11.430528 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13053h24m31.106429501s"
Apr 21 04:24:11.507865 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:11.507830 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:11.508113 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:11.508094 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:11.508199 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:11.508179 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d"
Apr 21 04:24:11.508276 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:11.507969 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c"
Apr 21 04:24:11.604463 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:11.603787 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 04:24:11.604463 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:11.604279 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 04:24:13.130025 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:13.129968 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs\") pod \"network-metrics-daemon-lkbgz\" (UID: \"ebfdc6e1-781b-42a1-b442-ba40dfec626c\") " pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:13.130497 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:13.130140 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:24:13.130497 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:13.130226 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs podName:ebfdc6e1-781b-42a1-b442-ba40dfec626c nodeName:}" failed. No retries permitted until 2026-04-21 04:24:17.130203266 +0000 UTC m=+9.125600871 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs") pod "network-metrics-daemon-lkbgz" (UID: "ebfdc6e1-781b-42a1-b442-ba40dfec626c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:24:13.230950 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:13.230911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8txf\" (UniqueName: \"kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf\") pod \"network-check-target-5zmm9\" (UID: \"82c4d960-14d9-4cfe-80d2-1d0751997a7d\") " pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:13.231160 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:13.231137 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:24:13.231238 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:13.231166 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:24:13.231238 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:13.231179 2579 projected.go:194] Error preparing data for projected volume kube-api-access-z8txf for pod openshift-network-diagnostics/network-check-target-5zmm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:24:13.231340 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:13.231245 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf podName:82c4d960-14d9-4cfe-80d2-1d0751997a7d nodeName:}" failed. No retries permitted until 2026-04-21 04:24:17.231223843 +0000 UTC m=+9.226621459 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-z8txf" (UniqueName: "kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf") pod "network-check-target-5zmm9" (UID: "82c4d960-14d9-4cfe-80d2-1d0751997a7d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:24:13.508202 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:13.508114 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:13.508377 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:13.508114 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:13.508377 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:13.508264 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d"
Apr 21 04:24:13.508377 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:13.508355 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c"
Apr 21 04:24:15.509021 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:15.508601 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:15.509021 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:15.508722 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d"
Apr 21 04:24:15.509021 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:15.508845 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:15.509021 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:15.508936 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c"
Apr 21 04:24:17.165758 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:17.165649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs\") pod \"network-metrics-daemon-lkbgz\" (UID: \"ebfdc6e1-781b-42a1-b442-ba40dfec626c\") " pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:17.166243 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:17.165866 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:24:17.166243 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:17.165949 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs podName:ebfdc6e1-781b-42a1-b442-ba40dfec626c nodeName:}" failed. No retries permitted until 2026-04-21 04:24:25.165927288 +0000 UTC m=+17.161324957 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs") pod "network-metrics-daemon-lkbgz" (UID: "ebfdc6e1-781b-42a1-b442-ba40dfec626c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:24:17.266472 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:17.266433 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8txf\" (UniqueName: \"kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf\") pod \"network-check-target-5zmm9\" (UID: \"82c4d960-14d9-4cfe-80d2-1d0751997a7d\") " pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:17.266642 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:17.266591 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:24:17.266642 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:17.266610 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:24:17.266642 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:17.266623 2579 projected.go:194] Error preparing data for projected volume kube-api-access-z8txf for pod openshift-network-diagnostics/network-check-target-5zmm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:24:17.266821 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:17.266689 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf podName:82c4d960-14d9-4cfe-80d2-1d0751997a7d nodeName:}" failed. No retries permitted until 2026-04-21 04:24:25.266661339 +0000 UTC m=+17.262058949 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-z8txf" (UniqueName: "kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf") pod "network-check-target-5zmm9" (UID: "82c4d960-14d9-4cfe-80d2-1d0751997a7d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:24:17.508404 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:17.508314 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:17.508569 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:17.508452 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d"
Apr 21 04:24:17.508862 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:17.508841 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:17.508963 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:17.508944 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c"
Apr 21 04:24:19.508200 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:19.508150 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:19.508636 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:19.508269 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d"
Apr 21 04:24:19.508636 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:19.508625 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:19.508788 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:19.508763 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c"
Apr 21 04:24:21.508079 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:21.508037 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:21.508540 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:21.508045 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:21.508540 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:21.508162 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d"
Apr 21 04:24:21.508540 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:21.508255 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c"
Apr 21 04:24:23.508503 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:23.508296 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:23.508900 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:23.508298 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:23.508900 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:23.508603 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d"
Apr 21 04:24:23.508900 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:23.508695 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c"
Apr 21 04:24:25.224570 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:25.224520 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs\") pod \"network-metrics-daemon-lkbgz\" (UID: \"ebfdc6e1-781b-42a1-b442-ba40dfec626c\") " pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:25.225081 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:25.224688 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:24:25.225081 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:25.224776 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs podName:ebfdc6e1-781b-42a1-b442-ba40dfec626c nodeName:}" failed. No retries permitted until 2026-04-21 04:24:41.224754632 +0000 UTC m=+33.220152238 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs") pod "network-metrics-daemon-lkbgz" (UID: "ebfdc6e1-781b-42a1-b442-ba40dfec626c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:24:25.325990 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:25.325934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8txf\" (UniqueName: \"kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf\") pod \"network-check-target-5zmm9\" (UID: \"82c4d960-14d9-4cfe-80d2-1d0751997a7d\") " pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:25.326277 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:25.326123 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:24:25.326277 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:25.326145 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:24:25.326277 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:25.326158 2579 projected.go:194] Error preparing data for projected volume kube-api-access-z8txf for pod openshift-network-diagnostics/network-check-target-5zmm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:24:25.326277 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:25.326228 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf podName:82c4d960-14d9-4cfe-80d2-1d0751997a7d nodeName:}" failed. No retries permitted until 2026-04-21 04:24:41.326206794 +0000 UTC m=+33.321604406 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-z8txf" (UniqueName: "kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf") pod "network-check-target-5zmm9" (UID: "82c4d960-14d9-4cfe-80d2-1d0751997a7d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:24:25.508611 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:25.508518 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:25.508611 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:25.508600 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:25.508830 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:25.508719 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c"
Apr 21 04:24:25.508916 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:25.508876 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d"
Apr 21 04:24:27.508112 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:27.508085 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:27.508466 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:27.508193 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c"
Apr 21 04:24:27.508466 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:27.508243 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:27.508466 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:27.508322 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d"
Apr 21 04:24:28.731087 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:28.731058 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/ovn-acl-logging/0.log"
Apr 21 04:24:28.731845 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:28.731312 2579 generic.go:358] "Generic (PLEG): container finished" podID="b6a42934-9530-4770-ba35-f71911b5b3b3" containerID="d4b67ac94665e3b6f9f52baa9e214d0f4f259ad5199858895be7ae1f7ee82829" exitCode=1
Apr 21 04:24:28.731845 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:28.731369 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" event={"ID":"b6a42934-9530-4770-ba35-f71911b5b3b3","Type":"ContainerStarted","Data":"cdd11c7604b0333eff2232ad815a9ae3920c27a2ace871b183bbad4d6348bf30"}
Apr 21 04:24:28.731845 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:28.731389 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" event={"ID":"b6a42934-9530-4770-ba35-f71911b5b3b3","Type":"ContainerStarted","Data":"21b7b67e0f7b539f5f600beaeb3182c20f31f0b9b4545887b418950ac8a18a64"}
Apr 21 04:24:28.731845 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:28.731398 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" event={"ID":"b6a42934-9530-4770-ba35-f71911b5b3b3","Type":"ContainerStarted","Data":"f72ad2ca37d914d5fc5e4226829bdaaf9b1365c114dd9ae96729b14658c195ef"}
Apr 21 04:24:28.731845 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:28.731408 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" event={"ID":"b6a42934-9530-4770-ba35-f71911b5b3b3","Type":"ContainerStarted","Data":"1cfddd024047313db672daa9985ad4bb02b84c3aa236a100af964b3372ede11c"}
Apr 21 04:24:28.731845 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:28.731421 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" event={"ID":"b6a42934-9530-4770-ba35-f71911b5b3b3","Type":"ContainerDied","Data":"d4b67ac94665e3b6f9f52baa9e214d0f4f259ad5199858895be7ae1f7ee82829"}
Apr 21 04:24:28.731845 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:28.731436 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" event={"ID":"b6a42934-9530-4770-ba35-f71911b5b3b3","Type":"ContainerStarted","Data":"549bd1b70898fe7da9d422ec0352464c36d2375eead56dd220cd1c327eb8f57b"}
Apr 21 04:24:28.732567 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:28.732539 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-26djf" event={"ID":"9f8dd91d-0597-4a80-87f6-46b919d9a0ef","Type":"ContainerStarted","Data":"8f9d2d9c8f562cec29c6563f98babce2f89afb93ae4c86c9d9f15f1c5a2f3861"}
Apr 21 04:24:28.733722 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:28.733700 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-45.ec2.internal" event={"ID":"ec711311d3275e119b3dff245c5b47c4","Type":"ContainerStarted","Data":"4d854a9d2819dc263410f7a3938c6fc2e577a8d07d30a0ad4ddda6100f0e6bac"}
Apr 21 04:24:28.734886 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:28.734855 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-44k95" event={"ID":"e59f55fa-7367-4248-84e9-9a065140d0bc","Type":"ContainerStarted","Data":"3289a5b43948b9c57073d8902e9d1ab02658444595f5ee00701ee7f4ec5df4b2"}
Apr 21 04:24:28.749698 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:28.747831 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-26djf" podStartSLOduration=2.490523178 podStartE2EDuration="20.747816802s" podCreationTimestamp="2026-04-21 04:24:08 +0000 UTC" firstStartedPulling="2026-04-21 04:24:09.780548507 +0000 UTC m=+1.775946110" lastFinishedPulling="2026-04-21 04:24:28.037842113 +0000 UTC m=+20.033239734" observedRunningTime="2026-04-21 04:24:28.747622345 +0000 UTC m=+20.743019971" watchObservedRunningTime="2026-04-21 04:24:28.747816802 +0000 UTC m=+20.743214428"
Apr 21 04:24:28.765557 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:28.765511 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-44k95" podStartSLOduration=2.706994309 podStartE2EDuration="20.765494903s" podCreationTimestamp="2026-04-21 04:24:08 +0000 UTC" firstStartedPulling="2026-04-21 04:24:09.830973001 +0000 UTC m=+1.826370604" lastFinishedPulling="2026-04-21 04:24:27.889473581 +0000 UTC m=+19.884871198" observedRunningTime="2026-04-21 04:24:28.765099976 +0000 UTC m=+20.760497600" watchObservedRunningTime="2026-04-21 04:24:28.765494903 +0000 UTC m=+20.760892528"
Apr 21 04:24:28.777676 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:28.777633 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-45.ec2.internal" podStartSLOduration=19.777615703 podStartE2EDuration="19.777615703s" podCreationTimestamp="2026-04-21 04:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:24:28.777499213 +0000 UTC m=+20.772896838" watchObservedRunningTime="2026-04-21 04:24:28.777615703 +0000 UTC m=+20.773013340"
Apr 21 04:24:29.508243 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:29.508201 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:24:29.508399 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:29.508343 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d"
Apr 21 04:24:29.508477 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:29.508402 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz"
Apr 21 04:24:29.508544 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:29.508522 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c"
Apr 21 04:24:29.737901 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:29.737872 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" event={"ID":"ab782842-c406-437d-a62e-012b36593d16","Type":"ContainerStarted","Data":"b3e823635ae8532bf2a400fc5bee1dfe86c1909f42d30e9f2baca0fd933942c3"}
Apr 21 04:24:29.739025 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:29.739001 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cp8gk" event={"ID":"f5fd5747-67eb-4782-b7d2-6b81f0d51528","Type":"ContainerStarted","Data":"5f805b61381f1933139b45c2f58fe415f2a67d09cfadff4a7d50cb1b7e2b835e"}
Apr 21 04:24:29.740265 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:29.740243 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sl7gg" event={"ID":"1bc20e1a-89b1-47ff-80e6-ae26cc68513c","Type":"ContainerStarted","Data":"6da1153d6be0083266db02625ecef4a370f3623f4840c24bd4673449b95cc7c6"}
Apr 21 04:24:29.741318 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:29.741297 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h5xvg" event={"ID":"ee93154d-8192-4132-b610-52bffab2fc10","Type":"ContainerStarted","Data":"9b26335c41ef6b17b5ba825edf94e7fcd22e8fc4cda4278bf76d13313a7b687a"}
Apr 21 04:24:29.742500 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:29.742480 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xktdv" event={"ID":"8641cd54-f81a-4c30-ba62-309b434777a4","Type":"ContainerStarted","Data":"8a558133be99ed7de497e4d1b93234e9e1c8e887ec645cf0d88a3aa3a90e6ca1"}
Apr 21 04:24:29.743825 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:29.743801 2579 generic.go:358] "Generic (PLEG): container finished" podID="861cd341-7a89-4a04-83c3-db9adea07535" containerID="19254ff71c73804edf0b04c5b9e79cb85f7e6b239e318d5faa7b2a45c3f7ea4a" exitCode=0
Apr 21 04:24:29.743903 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:29.743879 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dbmj" event={"ID":"861cd341-7a89-4a04-83c3-db9adea07535","Type":"ContainerDied","Data":"19254ff71c73804edf0b04c5b9e79cb85f7e6b239e318d5faa7b2a45c3f7ea4a"}
Apr 21 04:24:29.745309 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:29.745287 2579 generic.go:358] "Generic (PLEG): container finished" podID="d50a317536c7fa826446cd19a9f98c46" containerID="b50fc9eb82d4d102a5a1b2326229da086f0668181ad9ab5ad580ea2ecb9b981f" exitCode=0
Apr 21 04:24:29.745382 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:29.745371 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal" event={"ID":"d50a317536c7fa826446cd19a9f98c46","Type":"ContainerDied","Data":"b50fc9eb82d4d102a5a1b2326229da086f0668181ad9ab5ad580ea2ecb9b981f"}
Apr 21 04:24:29.751872 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:29.751822 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cp8gk" podStartSLOduration=3.543291213 podStartE2EDuration="21.75180636s" podCreationTimestamp="2026-04-21 04:24:08 +0000 UTC" firstStartedPulling="2026-04-21 04:24:09.759799758 +0000 UTC m=+1.755197361" lastFinishedPulling="2026-04-21 04:24:27.9683149 +0000 UTC m=+19.963712508" observedRunningTime="2026-04-21 04:24:29.751690753 +0000 UTC m=+21.747088377" watchObservedRunningTime="2026-04-21 04:24:29.75180636 +0000 UTC m=+21.747203986"
Apr 21 04:24:29.758568 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:29.758546 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 21 04:24:29.765566 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:29.765519 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-h5xvg" podStartSLOduration=3.6066991440000002 podStartE2EDuration="21.765504227s" podCreationTimestamp="2026-04-21 04:24:08 +0000 UTC" firstStartedPulling="2026-04-21 04:24:09.835316918 +0000 UTC m=+1.830714523" lastFinishedPulling="2026-04-21 04:24:27.994121986 +0000 UTC m=+19.989519606" observedRunningTime="2026-04-21 04:24:29.7650598 +0000 UTC m=+21.760457425" watchObservedRunningTime="2026-04-21 04:24:29.765504227 +0000 UTC m=+21.760901853"
Apr 21 04:24:29.806081 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:29.806018 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xktdv" podStartSLOduration=3.664881335 podStartE2EDuration="21.806001561s" podCreationTimestamp="2026-04-21 04:24:08 +0000 UTC" firstStartedPulling="2026-04-21 04:24:09.748355459 +0000 UTC m=+1.743753061" lastFinishedPulling="2026-04-21 04:24:27.889475676 +0000 UTC m=+19.884873287" observedRunningTime="2026-04-21 04:24:29.805954389 +0000 UTC m=+21.801352018" watchObservedRunningTime="2026-04-21 04:24:29.806001561 +0000 UTC m=+21.801399190"
Apr 21 04:24:30.461030 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:30.460821 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T04:24:29.758565989Z","UUID":"99b6aafc-08b3-48d5-bcbb-1accad9fdc5c","Handler":null,"Name":"","Endpoint":""}
Apr 21 04:24:30.463172 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:30.463144 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 21 04:24:30.463172 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:30.463179 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin
with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 04:24:30.753415 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:30.753332 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal" event={"ID":"d50a317536c7fa826446cd19a9f98c46","Type":"ContainerStarted","Data":"3429efe77ce6ef36e954aec08538306f5face552a58d9dce0564bd7092fcdfaa"} Apr 21 04:24:30.756862 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:30.756824 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" event={"ID":"ab782842-c406-437d-a62e-012b36593d16","Type":"ContainerStarted","Data":"7fd9f82209a3efa014324b28fe6cc9a75b49298a2c918563430c284d2ebe4a0f"} Apr 21 04:24:30.761315 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:30.761287 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/ovn-acl-logging/0.log" Apr 21 04:24:30.762135 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:30.762106 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" event={"ID":"b6a42934-9530-4770-ba35-f71911b5b3b3","Type":"ContainerStarted","Data":"30804bcb043aeee7711bf3a80b5e8e045c32ee9a32be555971006c83663a1ce4"} Apr 21 04:24:30.767934 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:30.767861 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-45.ec2.internal" podStartSLOduration=21.767844657 podStartE2EDuration="21.767844657s" podCreationTimestamp="2026-04-21 04:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:24:30.767747373 +0000 UTC m=+22.763144997" watchObservedRunningTime="2026-04-21 
04:24:30.767844657 +0000 UTC m=+22.763242282" Apr 21 04:24:30.768394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:30.768351 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-sl7gg" podStartSLOduration=4.496654623 podStartE2EDuration="22.768342939s" podCreationTimestamp="2026-04-21 04:24:08 +0000 UTC" firstStartedPulling="2026-04-21 04:24:09.726799862 +0000 UTC m=+1.722197468" lastFinishedPulling="2026-04-21 04:24:27.998488179 +0000 UTC m=+19.993885784" observedRunningTime="2026-04-21 04:24:29.832588178 +0000 UTC m=+21.827985797" watchObservedRunningTime="2026-04-21 04:24:30.768342939 +0000 UTC m=+22.763740565" Apr 21 04:24:31.508305 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:31.508262 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9" Apr 21 04:24:31.508505 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:31.508395 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d" Apr 21 04:24:31.508505 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:31.508413 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:24:31.508756 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:31.508544 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c" Apr 21 04:24:31.765930 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:31.765845 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" event={"ID":"ab782842-c406-437d-a62e-012b36593d16","Type":"ContainerStarted","Data":"b9ed97dc4f76784c22f0a14bb82dda1aa7d550f553eae1688c86debbc762c230"} Apr 21 04:24:31.787282 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:31.787225 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ltpgs" podStartSLOduration=2.9212502750000002 podStartE2EDuration="23.787203813s" podCreationTimestamp="2026-04-21 04:24:08 +0000 UTC" firstStartedPulling="2026-04-21 04:24:09.819202163 +0000 UTC m=+1.814599766" lastFinishedPulling="2026-04-21 04:24:30.685155689 +0000 UTC m=+22.680553304" observedRunningTime="2026-04-21 04:24:31.786544777 +0000 UTC m=+23.781942401" watchObservedRunningTime="2026-04-21 04:24:31.787203813 +0000 UTC m=+23.782601436" Apr 21 04:24:33.504031 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:33.503780 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xktdv" Apr 21 04:24:33.504495 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:33.504474 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xktdv" Apr 21 04:24:33.507719 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:33.507698 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:24:33.507828 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:33.507733 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9" Apr 21 04:24:33.507828 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:33.507816 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c" Apr 21 04:24:33.507959 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:33.507932 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d" Apr 21 04:24:33.770045 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:33.769958 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xktdv" Apr 21 04:24:33.770449 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:33.770434 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xktdv" Apr 21 04:24:34.773493 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:34.773248 2579 generic.go:358] "Generic (PLEG): container finished" podID="861cd341-7a89-4a04-83c3-db9adea07535" containerID="c2f34b20db9f5c6267dc83efd899fde6c32b881b7385190fddc82e28370e3857" exitCode=0 Apr 21 04:24:34.773493 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:34.773337 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dbmj" 
event={"ID":"861cd341-7a89-4a04-83c3-db9adea07535","Type":"ContainerDied","Data":"c2f34b20db9f5c6267dc83efd899fde6c32b881b7385190fddc82e28370e3857"} Apr 21 04:24:34.776574 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:34.776559 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/ovn-acl-logging/0.log" Apr 21 04:24:34.776897 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:34.776867 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" event={"ID":"b6a42934-9530-4770-ba35-f71911b5b3b3","Type":"ContainerStarted","Data":"9bc6068eb529259dc0a58806c08173a1491e9357b2f35bcd48b8fad4f7e749cc"} Apr 21 04:24:34.777197 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:34.777177 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:34.777275 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:34.777205 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:34.777341 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:34.777328 2579 scope.go:117] "RemoveContainer" containerID="d4b67ac94665e3b6f9f52baa9e214d0f4f259ad5199858895be7ae1f7ee82829" Apr 21 04:24:34.792020 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:34.791999 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:34.792140 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:34.792069 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:35.508529 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:35.508495 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:24:35.508784 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:35.508495 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9" Apr 21 04:24:35.508784 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:35.508628 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c" Apr 21 04:24:35.508784 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:35.508723 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d" Apr 21 04:24:35.781451 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:35.781430 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/ovn-acl-logging/0.log" Apr 21 04:24:35.781811 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:35.781785 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" event={"ID":"b6a42934-9530-4770-ba35-f71911b5b3b3","Type":"ContainerStarted","Data":"86a98ba6b1db23256a068a0492fbbc5d7a3531ecd977a58e5d9a7a1354cee8c8"} Apr 21 04:24:35.781875 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:35.781862 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 04:24:35.812304 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:35.812150 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" podStartSLOduration=9.602350678 podStartE2EDuration="27.812132638s" podCreationTimestamp="2026-04-21 04:24:08 +0000 UTC" firstStartedPulling="2026-04-21 04:24:09.785712134 +0000 UTC m=+1.781109738" lastFinishedPulling="2026-04-21 04:24:27.995494091 +0000 UTC m=+19.990891698" observedRunningTime="2026-04-21 04:24:35.810938871 +0000 UTC m=+27.806336497" watchObservedRunningTime="2026-04-21 04:24:35.812132638 +0000 UTC m=+27.807530264" Apr 21 04:24:35.823817 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:35.823792 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5zmm9"] Apr 21 04:24:35.823929 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:35.823883 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9" Apr 21 04:24:35.824003 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:35.823966 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d" Apr 21 04:24:35.826519 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:35.826496 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lkbgz"] Apr 21 04:24:35.826611 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:35.826571 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:24:35.826666 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:35.826648 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c" Apr 21 04:24:36.245934 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:36.245906 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:24:36.785814 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:36.785782 2579 generic.go:358] "Generic (PLEG): container finished" podID="861cd341-7a89-4a04-83c3-db9adea07535" containerID="cba2ef116851736b5e46c5d2d2e95fccbc0dd6c3714123b7fea3e8ca92aa8ca0" exitCode=0 Apr 21 04:24:36.786261 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:36.785874 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dbmj" event={"ID":"861cd341-7a89-4a04-83c3-db9adea07535","Type":"ContainerDied","Data":"cba2ef116851736b5e46c5d2d2e95fccbc0dd6c3714123b7fea3e8ca92aa8ca0"} Apr 21 04:24:37.508209 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:37.508176 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9" Apr 21 04:24:37.508362 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:37.508176 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:24:37.508362 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:37.508301 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d" Apr 21 04:24:37.508362 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:37.508340 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c" Apr 21 04:24:37.790051 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:37.789960 2579 generic.go:358] "Generic (PLEG): container finished" podID="861cd341-7a89-4a04-83c3-db9adea07535" containerID="04ecb7f5dccd60656e970687ff6dc3396a6aa3d2515e91bbe7aea360cf9eb566" exitCode=0 Apr 21 04:24:37.790387 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:37.790046 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dbmj" event={"ID":"861cd341-7a89-4a04-83c3-db9adea07535","Type":"ContainerDied","Data":"04ecb7f5dccd60656e970687ff6dc3396a6aa3d2515e91bbe7aea360cf9eb566"} Apr 21 04:24:39.507841 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:39.507661 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9" Apr 21 04:24:39.508319 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:39.507682 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:24:39.508319 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:39.507937 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zmm9" podUID="82c4d960-14d9-4cfe-80d2-1d0751997a7d" Apr 21 04:24:39.508319 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:39.508049 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkbgz" podUID="ebfdc6e1-781b-42a1-b442-ba40dfec626c" Apr 21 04:24:40.903300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:40.903269 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-45.ec2.internal" event="NodeReady" Apr 21 04:24:40.903799 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:40.903432 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 04:24:40.955333 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:40.955248 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rnhvz"] Apr 21 04:24:40.980857 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:40.980827 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p27tc"] Apr 21 04:24:40.981070 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:40.981040 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:40.983492 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:40.983438 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 04:24:40.983605 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:40.983541 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 04:24:40.983605 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:40.983542 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vk74d\"" Apr 21 04:24:40.996538 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:40.996514 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p27tc"] Apr 21 04:24:40.996640 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:40.996545 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rnhvz"] Apr 21 04:24:40.996682 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:40.996636 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p27tc" Apr 21 04:24:40.999226 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:40.999203 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 04:24:40.999226 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:40.999219 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 04:24:40.999423 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:40.999289 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l78gs\"" Apr 21 04:24:40.999527 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:40.999511 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 04:24:41.036043 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.036005 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk8ll\" (UniqueName: \"kubernetes.io/projected/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-kube-api-access-pk8ll\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:41.036229 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.036052 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-config-volume\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:41.036229 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.036103 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:41.036229 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.036197 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-tmp-dir\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:41.136920 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.136885 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9kdm\" (UniqueName: \"kubernetes.io/projected/83f15b2a-86b0-4400-a5f1-ff037093ddcc-kube-api-access-g9kdm\") pod \"ingress-canary-p27tc\" (UID: \"83f15b2a-86b0-4400-a5f1-ff037093ddcc\") " pod="openshift-ingress-canary/ingress-canary-p27tc" Apr 21 04:24:41.136920 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.136930 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-tmp-dir\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:41.137148 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.136954 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert\") pod \"ingress-canary-p27tc\" (UID: \"83f15b2a-86b0-4400-a5f1-ff037093ddcc\") " pod="openshift-ingress-canary/ingress-canary-p27tc" Apr 21 04:24:41.137148 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.137124 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk8ll\" (UniqueName: 
\"kubernetes.io/projected/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-kube-api-access-pk8ll\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:41.137224 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.137154 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-config-volume\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:41.137224 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.137186 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:41.137300 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:41.137284 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:41.137300 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.137285 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-tmp-dir\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:41.137376 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:41.137341 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls podName:12bdc8fa-32f4-4a0d-85b9-2e5c19724e76 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:41.63732443 +0000 UTC m=+33.632722033 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls") pod "dns-default-rnhvz" (UID: "12bdc8fa-32f4-4a0d-85b9-2e5c19724e76") : secret "dns-default-metrics-tls" not found Apr 21 04:24:41.137660 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.137643 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-config-volume\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:41.148845 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.148819 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk8ll\" (UniqueName: \"kubernetes.io/projected/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-kube-api-access-pk8ll\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:41.237590 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.237509 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9kdm\" (UniqueName: \"kubernetes.io/projected/83f15b2a-86b0-4400-a5f1-ff037093ddcc-kube-api-access-g9kdm\") pod \"ingress-canary-p27tc\" (UID: \"83f15b2a-86b0-4400-a5f1-ff037093ddcc\") " pod="openshift-ingress-canary/ingress-canary-p27tc" Apr 21 04:24:41.237590 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.237560 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert\") pod \"ingress-canary-p27tc\" (UID: \"83f15b2a-86b0-4400-a5f1-ff037093ddcc\") " pod="openshift-ingress-canary/ingress-canary-p27tc" Apr 21 04:24:41.237799 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.237594 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs\") pod \"network-metrics-daemon-lkbgz\" (UID: \"ebfdc6e1-781b-42a1-b442-ba40dfec626c\") " pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:24:41.237799 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:41.237729 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:24:41.237799 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:41.237772 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:24:41.237799 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:41.237801 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert podName:83f15b2a-86b0-4400-a5f1-ff037093ddcc nodeName:}" failed. No retries permitted until 2026-04-21 04:24:41.737781791 +0000 UTC m=+33.733179396 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert") pod "ingress-canary-p27tc" (UID: "83f15b2a-86b0-4400-a5f1-ff037093ddcc") : secret "canary-serving-cert" not found Apr 21 04:24:41.237950 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:41.237828 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs podName:ebfdc6e1-781b-42a1-b442-ba40dfec626c nodeName:}" failed. No retries permitted until 2026-04-21 04:25:13.237812571 +0000 UTC m=+65.233210179 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs") pod "network-metrics-daemon-lkbgz" (UID: "ebfdc6e1-781b-42a1-b442-ba40dfec626c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:24:41.248474 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.248449 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9kdm\" (UniqueName: \"kubernetes.io/projected/83f15b2a-86b0-4400-a5f1-ff037093ddcc-kube-api-access-g9kdm\") pod \"ingress-canary-p27tc\" (UID: \"83f15b2a-86b0-4400-a5f1-ff037093ddcc\") " pod="openshift-ingress-canary/ingress-canary-p27tc" Apr 21 04:24:41.339110 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.339067 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8txf\" (UniqueName: \"kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf\") pod \"network-check-target-5zmm9\" (UID: \"82c4d960-14d9-4cfe-80d2-1d0751997a7d\") " pod="openshift-network-diagnostics/network-check-target-5zmm9" Apr 21 04:24:41.339308 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:41.339257 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:24:41.339308 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:41.339282 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:24:41.339308 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:41.339292 2579 projected.go:194] Error preparing data for projected volume kube-api-access-z8txf for pod openshift-network-diagnostics/network-check-target-5zmm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:24:41.339447 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:41.339360 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf podName:82c4d960-14d9-4cfe-80d2-1d0751997a7d nodeName:}" failed. No retries permitted until 2026-04-21 04:25:13.33934249 +0000 UTC m=+65.334740117 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-z8txf" (UniqueName: "kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf") pod "network-check-target-5zmm9" (UID: "82c4d960-14d9-4cfe-80d2-1d0751997a7d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:24:41.508380 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.508295 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9" Apr 21 04:24:41.508549 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.508301 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:24:41.511431 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.511300 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 04:24:41.511431 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.511328 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 04:24:41.511431 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.511303 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 04:24:41.511431 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.511342 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tdp6m\"" Apr 21 04:24:41.511431 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.511370 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9c9js\"" Apr 21 04:24:41.641737 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.641699 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:41.641942 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:41.641845 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:41.641942 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:41.641927 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls 
podName:12bdc8fa-32f4-4a0d-85b9-2e5c19724e76 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:42.641904718 +0000 UTC m=+34.637302325 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls") pod "dns-default-rnhvz" (UID: "12bdc8fa-32f4-4a0d-85b9-2e5c19724e76") : secret "dns-default-metrics-tls" not found Apr 21 04:24:41.742790 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:41.742752 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert\") pod \"ingress-canary-p27tc\" (UID: \"83f15b2a-86b0-4400-a5f1-ff037093ddcc\") " pod="openshift-ingress-canary/ingress-canary-p27tc" Apr 21 04:24:41.742977 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:41.742917 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:24:41.743075 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:41.743018 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert podName:83f15b2a-86b0-4400-a5f1-ff037093ddcc nodeName:}" failed. No retries permitted until 2026-04-21 04:24:42.742995246 +0000 UTC m=+34.738392866 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert") pod "ingress-canary-p27tc" (UID: "83f15b2a-86b0-4400-a5f1-ff037093ddcc") : secret "canary-serving-cert" not found Apr 21 04:24:42.650931 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:42.650878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:42.651626 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:42.651055 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:42.651626 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:42.651130 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls podName:12bdc8fa-32f4-4a0d-85b9-2e5c19724e76 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:44.651113538 +0000 UTC m=+36.646511141 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls") pod "dns-default-rnhvz" (UID: "12bdc8fa-32f4-4a0d-85b9-2e5c19724e76") : secret "dns-default-metrics-tls" not found Apr 21 04:24:42.752061 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:42.752008 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert\") pod \"ingress-canary-p27tc\" (UID: \"83f15b2a-86b0-4400-a5f1-ff037093ddcc\") " pod="openshift-ingress-canary/ingress-canary-p27tc" Apr 21 04:24:42.752262 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:42.752172 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:24:42.752262 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:42.752250 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert podName:83f15b2a-86b0-4400-a5f1-ff037093ddcc nodeName:}" failed. No retries permitted until 2026-04-21 04:24:44.752229294 +0000 UTC m=+36.747626897 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert") pod "ingress-canary-p27tc" (UID: "83f15b2a-86b0-4400-a5f1-ff037093ddcc") : secret "canary-serving-cert" not found Apr 21 04:24:44.665842 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:44.665803 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:44.666316 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:44.665956 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:44.666316 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:44.666053 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls podName:12bdc8fa-32f4-4a0d-85b9-2e5c19724e76 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:48.666034918 +0000 UTC m=+40.661432521 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls") pod "dns-default-rnhvz" (UID: "12bdc8fa-32f4-4a0d-85b9-2e5c19724e76") : secret "dns-default-metrics-tls" not found Apr 21 04:24:44.767176 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:44.767131 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert\") pod \"ingress-canary-p27tc\" (UID: \"83f15b2a-86b0-4400-a5f1-ff037093ddcc\") " pod="openshift-ingress-canary/ingress-canary-p27tc" Apr 21 04:24:44.767348 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:44.767275 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:24:44.767348 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:44.767339 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert podName:83f15b2a-86b0-4400-a5f1-ff037093ddcc nodeName:}" failed. No retries permitted until 2026-04-21 04:24:48.767323552 +0000 UTC m=+40.762721154 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert") pod "ingress-canary-p27tc" (UID: "83f15b2a-86b0-4400-a5f1-ff037093ddcc") : secret "canary-serving-cert" not found Apr 21 04:24:44.807093 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:44.807058 2579 generic.go:358] "Generic (PLEG): container finished" podID="861cd341-7a89-4a04-83c3-db9adea07535" containerID="a691d9d62f7de097f863eba83024294d8c870102dc15dd3835640fde6cdcc5e2" exitCode=0 Apr 21 04:24:44.807093 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:44.807090 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dbmj" event={"ID":"861cd341-7a89-4a04-83c3-db9adea07535","Type":"ContainerDied","Data":"a691d9d62f7de097f863eba83024294d8c870102dc15dd3835640fde6cdcc5e2"} Apr 21 04:24:45.811856 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:45.811767 2579 generic.go:358] "Generic (PLEG): container finished" podID="861cd341-7a89-4a04-83c3-db9adea07535" containerID="1e95e08582f7a525153ee5822a3d7ac27cae93bbcced5a66527266b0fdaaf1b3" exitCode=0 Apr 21 04:24:45.811856 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:45.811840 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dbmj" event={"ID":"861cd341-7a89-4a04-83c3-db9adea07535","Type":"ContainerDied","Data":"1e95e08582f7a525153ee5822a3d7ac27cae93bbcced5a66527266b0fdaaf1b3"} Apr 21 04:24:46.817055 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:46.816862 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dbmj" event={"ID":"861cd341-7a89-4a04-83c3-db9adea07535","Type":"ContainerStarted","Data":"05185cf76b5b9ef1f2b455664de7d241fe897eb7bd9cc3169ba1f2e6d0d9222a"} Apr 21 04:24:46.839677 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:46.839618 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-4dbmj" podStartSLOduration=4.909472495 podStartE2EDuration="38.83960268s" podCreationTimestamp="2026-04-21 04:24:08 +0000 UTC" firstStartedPulling="2026-04-21 04:24:09.736904758 +0000 UTC m=+1.732302361" lastFinishedPulling="2026-04-21 04:24:43.667034941 +0000 UTC m=+35.662432546" observedRunningTime="2026-04-21 04:24:46.838112402 +0000 UTC m=+38.833510041" watchObservedRunningTime="2026-04-21 04:24:46.83960268 +0000 UTC m=+38.835000305" Apr 21 04:24:48.695645 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:48.695607 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:48.696060 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:48.695719 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:48.696060 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:48.695770 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls podName:12bdc8fa-32f4-4a0d-85b9-2e5c19724e76 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:56.695756841 +0000 UTC m=+48.691154444 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls") pod "dns-default-rnhvz" (UID: "12bdc8fa-32f4-4a0d-85b9-2e5c19724e76") : secret "dns-default-metrics-tls" not found Apr 21 04:24:48.796176 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:48.796139 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert\") pod \"ingress-canary-p27tc\" (UID: \"83f15b2a-86b0-4400-a5f1-ff037093ddcc\") " pod="openshift-ingress-canary/ingress-canary-p27tc" Apr 21 04:24:48.796330 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:48.796286 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:24:48.796369 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:48.796350 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert podName:83f15b2a-86b0-4400-a5f1-ff037093ddcc nodeName:}" failed. No retries permitted until 2026-04-21 04:24:56.796334206 +0000 UTC m=+48.791731809 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert") pod "ingress-canary-p27tc" (UID: "83f15b2a-86b0-4400-a5f1-ff037093ddcc") : secret "canary-serving-cert" not found Apr 21 04:24:56.749695 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:56.749650 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:24:56.750241 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:56.749810 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:56.750241 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:56.749874 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls podName:12bdc8fa-32f4-4a0d-85b9-2e5c19724e76 nodeName:}" failed. No retries permitted until 2026-04-21 04:25:12.749859035 +0000 UTC m=+64.745256638 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls") pod "dns-default-rnhvz" (UID: "12bdc8fa-32f4-4a0d-85b9-2e5c19724e76") : secret "dns-default-metrics-tls" not found Apr 21 04:24:56.850263 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:24:56.850220 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert\") pod \"ingress-canary-p27tc\" (UID: \"83f15b2a-86b0-4400-a5f1-ff037093ddcc\") " pod="openshift-ingress-canary/ingress-canary-p27tc" Apr 21 04:24:56.850416 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:56.850397 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:24:56.850526 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:24:56.850515 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert podName:83f15b2a-86b0-4400-a5f1-ff037093ddcc nodeName:}" failed. No retries permitted until 2026-04-21 04:25:12.850498176 +0000 UTC m=+64.845895781 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert") pod "ingress-canary-p27tc" (UID: "83f15b2a-86b0-4400-a5f1-ff037093ddcc") : secret "canary-serving-cert" not found Apr 21 04:25:07.801078 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:07.801049 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqlz8" Apr 21 04:25:12.763039 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:12.763000 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:25:12.763458 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:12.763161 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:25:12.763458 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:12.763230 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls podName:12bdc8fa-32f4-4a0d-85b9-2e5c19724e76 nodeName:}" failed. No retries permitted until 2026-04-21 04:25:44.763214878 +0000 UTC m=+96.758612480 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls") pod "dns-default-rnhvz" (UID: "12bdc8fa-32f4-4a0d-85b9-2e5c19724e76") : secret "dns-default-metrics-tls" not found Apr 21 04:25:12.863341 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:12.863310 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert\") pod \"ingress-canary-p27tc\" (UID: \"83f15b2a-86b0-4400-a5f1-ff037093ddcc\") " pod="openshift-ingress-canary/ingress-canary-p27tc" Apr 21 04:25:12.863523 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:12.863482 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:25:12.863575 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:12.863564 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert podName:83f15b2a-86b0-4400-a5f1-ff037093ddcc nodeName:}" failed. No retries permitted until 2026-04-21 04:25:44.863546399 +0000 UTC m=+96.858944002 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert") pod "ingress-canary-p27tc" (UID: "83f15b2a-86b0-4400-a5f1-ff037093ddcc") : secret "canary-serving-cert" not found Apr 21 04:25:13.266263 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:13.266233 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs\") pod \"network-metrics-daemon-lkbgz\" (UID: \"ebfdc6e1-781b-42a1-b442-ba40dfec626c\") " pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:25:13.268911 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:13.268893 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 04:25:13.276643 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:13.276626 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 04:25:13.276714 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:13.276704 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs podName:ebfdc6e1-781b-42a1-b442-ba40dfec626c nodeName:}" failed. No retries permitted until 2026-04-21 04:26:17.276684599 +0000 UTC m=+129.272082221 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs") pod "network-metrics-daemon-lkbgz" (UID: "ebfdc6e1-781b-42a1-b442-ba40dfec626c") : secret "metrics-daemon-secret" not found Apr 21 04:25:13.366599 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:13.366559 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8txf\" (UniqueName: \"kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf\") pod \"network-check-target-5zmm9\" (UID: \"82c4d960-14d9-4cfe-80d2-1d0751997a7d\") " pod="openshift-network-diagnostics/network-check-target-5zmm9" Apr 21 04:25:13.369088 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:13.369068 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 04:25:13.379122 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:13.379102 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 04:25:13.390957 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:13.390924 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8txf\" (UniqueName: \"kubernetes.io/projected/82c4d960-14d9-4cfe-80d2-1d0751997a7d-kube-api-access-z8txf\") pod \"network-check-target-5zmm9\" (UID: \"82c4d960-14d9-4cfe-80d2-1d0751997a7d\") " pod="openshift-network-diagnostics/network-check-target-5zmm9" Apr 21 04:25:13.623117 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:13.623086 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tdp6m\"" Apr 21 04:25:13.630933 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:13.630914 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zmm9" Apr 21 04:25:13.812011 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:13.811904 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5zmm9"] Apr 21 04:25:13.815775 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:25:13.815734 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82c4d960_14d9_4cfe_80d2_1d0751997a7d.slice/crio-a948ef35018510195256616d444cbb37dce15e5f6a00a04de69bc546891a29b8 WatchSource:0}: Error finding container a948ef35018510195256616d444cbb37dce15e5f6a00a04de69bc546891a29b8: Status 404 returned error can't find the container with id a948ef35018510195256616d444cbb37dce15e5f6a00a04de69bc546891a29b8 Apr 21 04:25:13.867458 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:13.867423 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5zmm9" event={"ID":"82c4d960-14d9-4cfe-80d2-1d0751997a7d","Type":"ContainerStarted","Data":"a948ef35018510195256616d444cbb37dce15e5f6a00a04de69bc546891a29b8"} Apr 21 04:25:16.875830 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:16.875792 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5zmm9" event={"ID":"82c4d960-14d9-4cfe-80d2-1d0751997a7d","Type":"ContainerStarted","Data":"5e7624663cb038d6ac07b6c80941bc710584de0bc006bca8ebf23db2ecf30bed"} Apr 21 04:25:16.876291 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:16.875955 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5zmm9" Apr 21 04:25:16.889956 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:16.889913 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5zmm9" 
podStartSLOduration=66.218382376 podStartE2EDuration="1m8.889899085s" podCreationTimestamp="2026-04-21 04:24:08 +0000 UTC" firstStartedPulling="2026-04-21 04:25:13.817506825 +0000 UTC m=+65.812904429" lastFinishedPulling="2026-04-21 04:25:16.489023532 +0000 UTC m=+68.484421138" observedRunningTime="2026-04-21 04:25:16.889326966 +0000 UTC m=+68.884724593" watchObservedRunningTime="2026-04-21 04:25:16.889899085 +0000 UTC m=+68.885296897" Apr 21 04:25:44.768824 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:44.768781 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz" Apr 21 04:25:44.769447 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:44.768957 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:25:44.769447 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:44.769062 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls podName:12bdc8fa-32f4-4a0d-85b9-2e5c19724e76 nodeName:}" failed. No retries permitted until 2026-04-21 04:26:48.769038984 +0000 UTC m=+160.764436594 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls") pod "dns-default-rnhvz" (UID: "12bdc8fa-32f4-4a0d-85b9-2e5c19724e76") : secret "dns-default-metrics-tls" not found
Apr 21 04:25:44.869968 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:44.869914 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert\") pod \"ingress-canary-p27tc\" (UID: \"83f15b2a-86b0-4400-a5f1-ff037093ddcc\") " pod="openshift-ingress-canary/ingress-canary-p27tc"
Apr 21 04:25:44.870169 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:44.870105 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:25:44.870209 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:44.870186 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert podName:83f15b2a-86b0-4400-a5f1-ff037093ddcc nodeName:}" failed. No retries permitted until 2026-04-21 04:26:48.870168333 +0000 UTC m=+160.865565935 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert") pod "ingress-canary-p27tc" (UID: "83f15b2a-86b0-4400-a5f1-ff037093ddcc") : secret "canary-serving-cert" not found
Apr 21 04:25:47.879507 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:47.879470 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5zmm9"
Apr 21 04:25:56.304837 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.304802 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"]
Apr 21 04:25:56.307404 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.307376 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"
Apr 21 04:25:56.307757 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.307728 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-f8d5bbc98-2ffvt"]
Apr 21 04:25:56.309670 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.309650 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 04:25:56.310383 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.310368 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.310976 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.310953 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 21 04:25:56.311109 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.311093 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-z2886\""
Apr 21 04:25:56.311161 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.311147 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 04:25:56.311341 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.311326 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 21 04:25:56.312657 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.312510 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 21 04:25:56.312657 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.312514 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 21 04:25:56.312657 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.312516 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 21 04:25:56.312657 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.312564 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9lrfv\""
Apr 21 04:25:56.312657 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.312581 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 21 04:25:56.312959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.312915 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 21 04:25:56.313149 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.313133 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 21 04:25:56.315592 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.315555 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"]
Apr 21 04:25:56.322776 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.322697 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-f8d5bbc98-2ffvt"]
Apr 21 04:25:56.347498 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.347474 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.347498 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.347501 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-stats-auth\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.347720 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.347523 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-default-certificate\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.347720 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.347551 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7gb6\" (UniqueName: \"kubernetes.io/projected/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-kube-api-access-r7gb6\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.347720 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.347647 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qbhw5\" (UID: \"0bfc0f36-e3dc-4add-97d2-eef31c8c896d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"
Apr 21 04:25:56.347872 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.347684 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znnks\" (UniqueName: \"kubernetes.io/projected/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-kube-api-access-znnks\") pod \"cluster-monitoring-operator-75587bd455-qbhw5\" (UID: \"0bfc0f36-e3dc-4add-97d2-eef31c8c896d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"
Apr 21 04:25:56.347872 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.347779 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qbhw5\" (UID: \"0bfc0f36-e3dc-4add-97d2-eef31c8c896d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"
Apr 21 04:25:56.347872 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.347833 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.416841 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.416806 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-f96bf"]
Apr 21 04:25:56.419530 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.419507 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-gl6cc"]
Apr 21 04:25:56.419736 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.419716 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.422178 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.422161 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:25:56.422341 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.422310 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 21 04:25:56.422910 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.422891 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 04:25:56.423018 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.422963 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-vxs29\""
Apr 21 04:25:56.423018 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.422999 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 04:25:56.423142 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.423000 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 21 04:25:56.424376 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.424359 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 21 04:25:56.424968 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.424948 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-cbt4d\""
Apr 21 04:25:56.425085 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.424974 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 21 04:25:56.425085 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.424951 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 21 04:25:56.425401 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.425343 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 21 04:25:56.428452 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.428436 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 21 04:25:56.430445 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.430426 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 21 04:25:56.430873 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.430854 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-gl6cc"]
Apr 21 04:25:56.431853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.431833 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-f96bf"]
Apr 21 04:25:56.449063 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449037 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qbhw5\" (UID: \"0bfc0f36-e3dc-4add-97d2-eef31c8c896d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"
Apr 21 04:25:56.449171 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449082 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znnks\" (UniqueName: \"kubernetes.io/projected/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-kube-api-access-znnks\") pod \"cluster-monitoring-operator-75587bd455-qbhw5\" (UID: \"0bfc0f36-e3dc-4add-97d2-eef31c8c896d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"
Apr 21 04:25:56.449171 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449115 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-serving-cert\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.449171 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:56.449143 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 04:25:56.449171 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449159 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qbhw5\" (UID: \"0bfc0f36-e3dc-4add-97d2-eef31c8c896d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"
Apr 21 04:25:56.449320 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449186 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-tmp\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.449320 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:56.449213 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls podName:0bfc0f36-e3dc-4add-97d2-eef31c8c896d nodeName:}" failed. No retries permitted until 2026-04-21 04:25:56.949193371 +0000 UTC m=+108.944590982 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qbhw5" (UID: "0bfc0f36-e3dc-4add-97d2-eef31c8c896d") : secret "cluster-monitoring-operator-tls" not found
Apr 21 04:25:56.449320 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449285 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.449481 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:56.449384 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 04:25:56.449481 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449400 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-snapshots\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.449481 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:56.449441 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs podName:e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5 nodeName:}" failed. No retries permitted until 2026-04-21 04:25:56.949423638 +0000 UTC m=+108.944821247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs") pod "router-default-f8d5bbc98-2ffvt" (UID: "e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5") : secret "router-metrics-certs-default" not found
Apr 21 04:25:56.449598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449487 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-service-ca-bundle\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.449598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449514 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f1153f0-d097-4947-a74c-4824967f6184-config\") pod \"console-operator-9d4b6777b-gl6cc\" (UID: \"0f1153f0-d097-4947-a74c-4824967f6184\") " pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:25:56.449598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449538 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f1153f0-d097-4947-a74c-4824967f6184-trusted-ca\") pod \"console-operator-9d4b6777b-gl6cc\" (UID: \"0f1153f0-d097-4947-a74c-4824967f6184\") " pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:25:56.449598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449591 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.449786 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449615 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-stats-auth\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.449786 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449632 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnfrf\" (UniqueName: \"kubernetes.io/projected/0f1153f0-d097-4947-a74c-4824967f6184-kube-api-access-lnfrf\") pod \"console-operator-9d4b6777b-gl6cc\" (UID: \"0f1153f0-d097-4947-a74c-4824967f6184\") " pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:25:56.449786 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449659 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f1153f0-d097-4947-a74c-4824967f6184-serving-cert\") pod \"console-operator-9d4b6777b-gl6cc\" (UID: \"0f1153f0-d097-4947-a74c-4824967f6184\") " pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:25:56.449786 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-default-certificate\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.449786 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449708 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.449786 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:56.449727 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle podName:e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5 nodeName:}" failed. No retries permitted until 2026-04-21 04:25:56.949713276 +0000 UTC m=+108.945110880 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle") pod "router-default-f8d5bbc98-2ffvt" (UID: "e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5") : configmap references non-existent config key: service-ca.crt
Apr 21 04:25:56.449786 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449777 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7gb6\" (UniqueName: \"kubernetes.io/projected/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-kube-api-access-r7gb6\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.450112 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449817 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfs5d\" (UniqueName: \"kubernetes.io/projected/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-kube-api-access-dfs5d\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.450112 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.449965 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qbhw5\" (UID: \"0bfc0f36-e3dc-4add-97d2-eef31c8c896d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"
Apr 21 04:25:56.452018 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.451998 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-default-certificate\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.452107 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.452099 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-stats-auth\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.458036 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.458013 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znnks\" (UniqueName: \"kubernetes.io/projected/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-kube-api-access-znnks\") pod \"cluster-monitoring-operator-75587bd455-qbhw5\" (UID: \"0bfc0f36-e3dc-4add-97d2-eef31c8c896d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"
Apr 21 04:25:56.458165 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.458150 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7gb6\" (UniqueName: \"kubernetes.io/projected/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-kube-api-access-r7gb6\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.550990 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.550953 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-snapshots\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.551166 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.551024 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-service-ca-bundle\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.551166 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.551041 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f1153f0-d097-4947-a74c-4824967f6184-config\") pod \"console-operator-9d4b6777b-gl6cc\" (UID: \"0f1153f0-d097-4947-a74c-4824967f6184\") " pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:25:56.551166 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.551057 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f1153f0-d097-4947-a74c-4824967f6184-trusted-ca\") pod \"console-operator-9d4b6777b-gl6cc\" (UID: \"0f1153f0-d097-4947-a74c-4824967f6184\") " pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:25:56.551166 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.551097 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnfrf\" (UniqueName: \"kubernetes.io/projected/0f1153f0-d097-4947-a74c-4824967f6184-kube-api-access-lnfrf\") pod \"console-operator-9d4b6777b-gl6cc\" (UID: \"0f1153f0-d097-4947-a74c-4824967f6184\") " pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:25:56.551166 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.551125 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f1153f0-d097-4947-a74c-4824967f6184-serving-cert\") pod \"console-operator-9d4b6777b-gl6cc\" (UID: \"0f1153f0-d097-4947-a74c-4824967f6184\") " pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:25:56.551166 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.551162 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.551429 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.551193 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfs5d\" (UniqueName: \"kubernetes.io/projected/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-kube-api-access-dfs5d\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.551429 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.551243 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-serving-cert\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.551429 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.551274 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-tmp\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.551887 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.551856 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-service-ca-bundle\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.551887 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.551868 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f1153f0-d097-4947-a74c-4824967f6184-config\") pod \"console-operator-9d4b6777b-gl6cc\" (UID: \"0f1153f0-d097-4947-a74c-4824967f6184\") " pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:25:56.552109 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.552098 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-tmp\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.552197 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.552178 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-snapshots\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.552255 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.552229 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f1153f0-d097-4947-a74c-4824967f6184-trusted-ca\") pod \"console-operator-9d4b6777b-gl6cc\" (UID: \"0f1153f0-d097-4947-a74c-4824967f6184\") " pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:25:56.552941 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.552917 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.554119 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.554100 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f1153f0-d097-4947-a74c-4824967f6184-serving-cert\") pod \"console-operator-9d4b6777b-gl6cc\" (UID: \"0f1153f0-d097-4947-a74c-4824967f6184\") " pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:25:56.554280 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.554262 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-serving-cert\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.559043 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.558959 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfs5d\" (UniqueName: \"kubernetes.io/projected/99b02ee4-b901-4a4f-9b7a-2ddd30492de6-kube-api-access-dfs5d\") pod \"insights-operator-585dfdc468-f96bf\" (UID: \"99b02ee4-b901-4a4f-9b7a-2ddd30492de6\") " pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.559120 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.559081 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnfrf\" (UniqueName: \"kubernetes.io/projected/0f1153f0-d097-4947-a74c-4824967f6184-kube-api-access-lnfrf\") pod \"console-operator-9d4b6777b-gl6cc\" (UID: \"0f1153f0-d097-4947-a74c-4824967f6184\") " pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:25:56.731215 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.731185 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-f96bf"
Apr 21 04:25:56.735955 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.735925 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:25:56.870759 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.870726 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-f96bf"]
Apr 21 04:25:56.873897 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:25:56.873851 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99b02ee4_b901_4a4f_9b7a_2ddd30492de6.slice/crio-118e5fb7fb24c75054170bd84e76b4253d62cb13204620e795c77b5581090c62 WatchSource:0}: Error finding container 118e5fb7fb24c75054170bd84e76b4253d62cb13204620e795c77b5581090c62: Status 404 returned error can't find the container with id 118e5fb7fb24c75054170bd84e76b4253d62cb13204620e795c77b5581090c62
Apr 21 04:25:56.887060 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.887032 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-gl6cc"]
Apr 21 04:25:56.890059 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:25:56.890035 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f1153f0_d097_4947_a74c_4824967f6184.slice/crio-40f626b9369d0db71c52bba0690cea925e3ed19d5a02b83b95fc141ed64c9942 WatchSource:0}: Error finding container 40f626b9369d0db71c52bba0690cea925e3ed19d5a02b83b95fc141ed64c9942: Status 404 returned error can't find the container with id 40f626b9369d0db71c52bba0690cea925e3ed19d5a02b83b95fc141ed64c9942
Apr 21 04:25:56.950313 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.950281 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc" event={"ID":"0f1153f0-d097-4947-a74c-4824967f6184","Type":"ContainerStarted","Data":"40f626b9369d0db71c52bba0690cea925e3ed19d5a02b83b95fc141ed64c9942"}
Apr 21 04:25:56.951210 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.951187 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-f96bf" event={"ID":"99b02ee4-b901-4a4f-9b7a-2ddd30492de6","Type":"ContainerStarted","Data":"118e5fb7fb24c75054170bd84e76b4253d62cb13204620e795c77b5581090c62"}
Apr 21 04:25:56.955541 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.955524 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qbhw5\" (UID: \"0bfc0f36-e3dc-4add-97d2-eef31c8c896d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"
Apr 21 04:25:56.955608 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.955574 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.955649 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:56.955612 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:25:56.955689 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:56.955661 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 04:25:56.955723 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:56.955714 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 04:25:56.955762 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:56.955716 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls podName:0bfc0f36-e3dc-4add-97d2-eef31c8c896d nodeName:}" failed. No retries permitted until 2026-04-21 04:25:57.955701566 +0000 UTC m=+109.951099168 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qbhw5" (UID: "0bfc0f36-e3dc-4add-97d2-eef31c8c896d") : secret "cluster-monitoring-operator-tls" not found
Apr 21 04:25:56.955805 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:56.955769 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs podName:e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5 nodeName:}" failed. No retries permitted until 2026-04-21 04:25:57.955756612 +0000 UTC m=+109.951154214 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs") pod "router-default-f8d5bbc98-2ffvt" (UID: "e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5") : secret "router-metrics-certs-default" not found
Apr 21 04:25:56.955805 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:56.955781 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle podName:e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5 nodeName:}" failed. No retries permitted until 2026-04-21 04:25:57.955775105 +0000 UTC m=+109.951172708 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle") pod "router-default-f8d5bbc98-2ffvt" (UID: "e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5") : configmap references non-existent config key: service-ca.crt Apr 21 04:25:57.964938 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:57.964878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qbhw5\" (UID: \"0bfc0f36-e3dc-4add-97d2-eef31c8c896d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5" Apr 21 04:25:57.965419 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:57.964994 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt" Apr 21 04:25:57.965419 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:57.965054 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt" Apr 21 04:25:57.965419 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:57.965074 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 04:25:57.965419 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:57.965139 2579 secret.go:189] Couldn't get secret 
openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 04:25:57.965419 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:57.965150 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls podName:0bfc0f36-e3dc-4add-97d2-eef31c8c896d nodeName:}" failed. No retries permitted until 2026-04-21 04:25:59.96512902 +0000 UTC m=+111.960526628 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qbhw5" (UID: "0bfc0f36-e3dc-4add-97d2-eef31c8c896d") : secret "cluster-monitoring-operator-tls" not found Apr 21 04:25:57.965419 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:57.965168 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle podName:e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5 nodeName:}" failed. No retries permitted until 2026-04-21 04:25:59.965159275 +0000 UTC m=+111.960556885 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle") pod "router-default-f8d5bbc98-2ffvt" (UID: "e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5") : configmap references non-existent config key: service-ca.crt Apr 21 04:25:57.965419 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:57.965185 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs podName:e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5 nodeName:}" failed. No retries permitted until 2026-04-21 04:25:59.965175857 +0000 UTC m=+111.960573461 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs") pod "router-default-f8d5bbc98-2ffvt" (UID: "e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5") : secret "router-metrics-certs-default" not found Apr 21 04:25:59.959710 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:59.959681 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/0.log" Apr 21 04:25:59.960151 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:59.959720 2579 generic.go:358] "Generic (PLEG): container finished" podID="0f1153f0-d097-4947-a74c-4824967f6184" containerID="e72b7aeee2ad193ba5cb67bab24bd7073d5861fdc27e673c4f2d4f1a690355fa" exitCode=255 Apr 21 04:25:59.960151 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:59.959777 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc" event={"ID":"0f1153f0-d097-4947-a74c-4824967f6184","Type":"ContainerDied","Data":"e72b7aeee2ad193ba5cb67bab24bd7073d5861fdc27e673c4f2d4f1a690355fa"} Apr 21 04:25:59.960151 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:59.960072 2579 scope.go:117] "RemoveContainer" containerID="e72b7aeee2ad193ba5cb67bab24bd7073d5861fdc27e673c4f2d4f1a690355fa" Apr 21 04:25:59.961174 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:59.961141 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-f96bf" event={"ID":"99b02ee4-b901-4a4f-9b7a-2ddd30492de6","Type":"ContainerStarted","Data":"e90fe192c78fbf9729342c994c38d0605742b699722ea36a5754876421d583da"} Apr 21 04:25:59.981712 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:59.981687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle\") 
pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt" Apr 21 04:25:59.981808 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:59.981734 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qbhw5\" (UID: \"0bfc0f36-e3dc-4add-97d2-eef31c8c896d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5" Apr 21 04:25:59.981844 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:59.981819 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 04:25:59.981886 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:59.981862 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls podName:0bfc0f36-e3dc-4add-97d2-eef31c8c896d nodeName:}" failed. No retries permitted until 2026-04-21 04:26:03.981849096 +0000 UTC m=+115.977246699 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qbhw5" (UID: "0bfc0f36-e3dc-4add-97d2-eef31c8c896d") : secret "cluster-monitoring-operator-tls" not found Apr 21 04:25:59.981886 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:59.981874 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 04:25:59.981966 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:59.981919 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs podName:e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5 nodeName:}" failed. No retries permitted until 2026-04-21 04:26:03.981908334 +0000 UTC m=+115.977305940 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs") pod "router-default-f8d5bbc98-2ffvt" (UID: "e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5") : secret "router-metrics-certs-default" not found Apr 21 04:25:59.981966 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:59.981819 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt" Apr 21 04:25:59.981966 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:25:59.981932 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle podName:e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5 nodeName:}" failed. 
No retries permitted until 2026-04-21 04:26:03.981926186 +0000 UTC m=+115.977323789 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle") pod "router-default-f8d5bbc98-2ffvt" (UID: "e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5") : configmap references non-existent config key: service-ca.crt Apr 21 04:25:59.986657 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:25:59.986622 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-f96bf" podStartSLOduration=1.6533252 podStartE2EDuration="3.986610428s" podCreationTimestamp="2026-04-21 04:25:56 +0000 UTC" firstStartedPulling="2026-04-21 04:25:56.875637187 +0000 UTC m=+108.871034789" lastFinishedPulling="2026-04-21 04:25:59.208922411 +0000 UTC m=+111.204320017" observedRunningTime="2026-04-21 04:25:59.985923151 +0000 UTC m=+111.981320782" watchObservedRunningTime="2026-04-21 04:25:59.986610428 +0000 UTC m=+111.982008052" Apr 21 04:26:00.967234 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:00.967203 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/1.log" Apr 21 04:26:00.967626 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:00.967572 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/0.log" Apr 21 04:26:00.967626 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:00.967602 2579 generic.go:358] "Generic (PLEG): container finished" podID="0f1153f0-d097-4947-a74c-4824967f6184" containerID="efc1b53c4a8b29a5353a6ea5e9631cc8542ed49f02cbdbaf1b2cc7b432ae9c34" exitCode=255 Apr 21 04:26:00.967725 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:00.967697 2579 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc" event={"ID":"0f1153f0-d097-4947-a74c-4824967f6184","Type":"ContainerDied","Data":"efc1b53c4a8b29a5353a6ea5e9631cc8542ed49f02cbdbaf1b2cc7b432ae9c34"} Apr 21 04:26:00.967773 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:00.967748 2579 scope.go:117] "RemoveContainer" containerID="e72b7aeee2ad193ba5cb67bab24bd7073d5861fdc27e673c4f2d4f1a690355fa" Apr 21 04:26:00.967915 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:00.967895 2579 scope.go:117] "RemoveContainer" containerID="efc1b53c4a8b29a5353a6ea5e9631cc8542ed49f02cbdbaf1b2cc7b432ae9c34" Apr 21 04:26:00.968149 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:00.968131 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gl6cc_openshift-console-operator(0f1153f0-d097-4947-a74c-4824967f6184)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc" podUID="0f1153f0-d097-4947-a74c-4824967f6184" Apr 21 04:26:01.764974 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:01.764945 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h5xvg_ee93154d-8192-4132-b610-52bffab2fc10/dns-node-resolver/0.log" Apr 21 04:26:01.971445 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:01.971419 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/1.log" Apr 21 04:26:01.971851 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:01.971804 2579 scope.go:117] "RemoveContainer" containerID="efc1b53c4a8b29a5353a6ea5e9631cc8542ed49f02cbdbaf1b2cc7b432ae9c34" Apr 21 04:26:01.972055 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:01.972032 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gl6cc_openshift-console-operator(0f1153f0-d097-4947-a74c-4824967f6184)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc" podUID="0f1153f0-d097-4947-a74c-4824967f6184" Apr 21 04:26:02.361926 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:02.361894 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cp8gk_f5fd5747-67eb-4782-b7d2-6b81f0d51528/node-ca/0.log" Apr 21 04:26:03.542223 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.542187 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-vq4nf"] Apr 21 04:26:03.546530 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.546505 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-vq4nf" Apr 21 04:26:03.548679 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.548648 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 21 04:26:03.548817 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.548771 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 21 04:26:03.548898 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.548853 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 21 04:26:03.549642 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.549626 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-jspr6\"" Apr 21 04:26:03.549714 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.549675 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 21 04:26:03.551952 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.551933 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-vq4nf"] Apr 21 04:26:03.612832 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.612793 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/687981a2-624b-4470-a2f3-ece60870fe11-signing-key\") pod \"service-ca-865cb79987-vq4nf\" (UID: \"687981a2-624b-4470-a2f3-ece60870fe11\") " pod="openshift-service-ca/service-ca-865cb79987-vq4nf" Apr 21 04:26:03.612832 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.612834 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7b7p\" (UniqueName: \"kubernetes.io/projected/687981a2-624b-4470-a2f3-ece60870fe11-kube-api-access-w7b7p\") pod \"service-ca-865cb79987-vq4nf\" (UID: \"687981a2-624b-4470-a2f3-ece60870fe11\") " pod="openshift-service-ca/service-ca-865cb79987-vq4nf" Apr 21 04:26:03.613115 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.612940 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/687981a2-624b-4470-a2f3-ece60870fe11-signing-cabundle\") pod \"service-ca-865cb79987-vq4nf\" (UID: \"687981a2-624b-4470-a2f3-ece60870fe11\") " pod="openshift-service-ca/service-ca-865cb79987-vq4nf" Apr 21 04:26:03.714161 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.714122 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/687981a2-624b-4470-a2f3-ece60870fe11-signing-key\") pod \"service-ca-865cb79987-vq4nf\" (UID: \"687981a2-624b-4470-a2f3-ece60870fe11\") " pod="openshift-service-ca/service-ca-865cb79987-vq4nf" Apr 21 
04:26:03.714161 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.714166 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7b7p\" (UniqueName: \"kubernetes.io/projected/687981a2-624b-4470-a2f3-ece60870fe11-kube-api-access-w7b7p\") pod \"service-ca-865cb79987-vq4nf\" (UID: \"687981a2-624b-4470-a2f3-ece60870fe11\") " pod="openshift-service-ca/service-ca-865cb79987-vq4nf" Apr 21 04:26:03.714408 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.714215 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/687981a2-624b-4470-a2f3-ece60870fe11-signing-cabundle\") pod \"service-ca-865cb79987-vq4nf\" (UID: \"687981a2-624b-4470-a2f3-ece60870fe11\") " pod="openshift-service-ca/service-ca-865cb79987-vq4nf" Apr 21 04:26:03.715679 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.715657 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/687981a2-624b-4470-a2f3-ece60870fe11-signing-cabundle\") pod \"service-ca-865cb79987-vq4nf\" (UID: \"687981a2-624b-4470-a2f3-ece60870fe11\") " pod="openshift-service-ca/service-ca-865cb79987-vq4nf" Apr 21 04:26:03.716436 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.716421 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/687981a2-624b-4470-a2f3-ece60870fe11-signing-key\") pod \"service-ca-865cb79987-vq4nf\" (UID: \"687981a2-624b-4470-a2f3-ece60870fe11\") " pod="openshift-service-ca/service-ca-865cb79987-vq4nf" Apr 21 04:26:03.724556 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.724533 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7b7p\" (UniqueName: \"kubernetes.io/projected/687981a2-624b-4470-a2f3-ece60870fe11-kube-api-access-w7b7p\") pod \"service-ca-865cb79987-vq4nf\" (UID: 
\"687981a2-624b-4470-a2f3-ece60870fe11\") " pod="openshift-service-ca/service-ca-865cb79987-vq4nf" Apr 21 04:26:03.856320 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.856284 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-vq4nf" Apr 21 04:26:03.972258 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:03.972225 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-vq4nf"] Apr 21 04:26:03.975909 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:26:03.975884 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687981a2_624b_4470_a2f3_ece60870fe11.slice/crio-c1bdc329066c190b125629f8d77ae773bdb19a7730f7e6a7037dd8f607534c13 WatchSource:0}: Error finding container c1bdc329066c190b125629f8d77ae773bdb19a7730f7e6a7037dd8f607534c13: Status 404 returned error can't find the container with id c1bdc329066c190b125629f8d77ae773bdb19a7730f7e6a7037dd8f607534c13 Apr 21 04:26:04.016344 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:04.016303 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt" Apr 21 04:26:04.016537 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:04.016360 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qbhw5\" (UID: \"0bfc0f36-e3dc-4add-97d2-eef31c8c896d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5" Apr 21 04:26:04.016537 
ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:04.016411 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt" Apr 21 04:26:04.016537 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:04.016461 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle podName:e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5 nodeName:}" failed. No retries permitted until 2026-04-21 04:26:12.01644314 +0000 UTC m=+124.011840743 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle") pod "router-default-f8d5bbc98-2ffvt" (UID: "e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5") : configmap references non-existent config key: service-ca.crt Apr 21 04:26:04.016537 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:04.016504 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 04:26:04.016537 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:04.016511 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 04:26:04.016726 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:04.016548 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs podName:e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5 nodeName:}" failed. No retries permitted until 2026-04-21 04:26:12.016534583 +0000 UTC m=+124.011932187 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs") pod "router-default-f8d5bbc98-2ffvt" (UID: "e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5") : secret "router-metrics-certs-default" not found Apr 21 04:26:04.016726 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:04.016563 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls podName:0bfc0f36-e3dc-4add-97d2-eef31c8c896d nodeName:}" failed. No retries permitted until 2026-04-21 04:26:12.016556989 +0000 UTC m=+124.011954592 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qbhw5" (UID: "0bfc0f36-e3dc-4add-97d2-eef31c8c896d") : secret "cluster-monitoring-operator-tls" not found Apr 21 04:26:04.978862 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:04.978827 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-vq4nf" event={"ID":"687981a2-624b-4470-a2f3-ece60870fe11","Type":"ContainerStarted","Data":"c1bdc329066c190b125629f8d77ae773bdb19a7730f7e6a7037dd8f607534c13"} Apr 21 04:26:05.981797 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:05.981756 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-vq4nf" event={"ID":"687981a2-624b-4470-a2f3-ece60870fe11","Type":"ContainerStarted","Data":"43bfa34ebcb257ee81343c16a735235ca744441f875938a1819009d05f0acb25"} Apr 21 04:26:05.997244 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:05.997201 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-vq4nf" podStartSLOduration=1.4406229480000001 podStartE2EDuration="2.997186814s" 
podCreationTimestamp="2026-04-21 04:26:03 +0000 UTC" firstStartedPulling="2026-04-21 04:26:03.978117581 +0000 UTC m=+115.973515185" lastFinishedPulling="2026-04-21 04:26:05.534681444 +0000 UTC m=+117.530079051" observedRunningTime="2026-04-21 04:26:05.996549487 +0000 UTC m=+117.991947113" watchObservedRunningTime="2026-04-21 04:26:05.997186814 +0000 UTC m=+117.992584438" Apr 21 04:26:06.736361 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:06.736316 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc" Apr 21 04:26:06.736361 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:06.736366 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc" Apr 21 04:26:06.736745 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:06.736733 2579 scope.go:117] "RemoveContainer" containerID="efc1b53c4a8b29a5353a6ea5e9631cc8542ed49f02cbdbaf1b2cc7b432ae9c34" Apr 21 04:26:06.736918 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:06.736900 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gl6cc_openshift-console-operator(0f1153f0-d097-4947-a74c-4824967f6184)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc" podUID="0f1153f0-d097-4947-a74c-4824967f6184" Apr 21 04:26:12.082710 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:12.082672 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qbhw5\" (UID: \"0bfc0f36-e3dc-4add-97d2-eef31c8c896d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5" 
Apr 21 04:26:12.083177 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:12.082767 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt" Apr 21 04:26:12.083177 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:12.082828 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt" Apr 21 04:26:12.083177 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:12.082863 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 04:26:12.083177 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:12.082884 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 04:26:12.083177 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:12.082949 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls podName:0bfc0f36-e3dc-4add-97d2-eef31c8c896d nodeName:}" failed. No retries permitted until 2026-04-21 04:26:28.082927993 +0000 UTC m=+140.078325595 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qbhw5" (UID: "0bfc0f36-e3dc-4add-97d2-eef31c8c896d") : secret "cluster-monitoring-operator-tls" not found Apr 21 04:26:12.083177 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:12.082967 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs podName:e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5 nodeName:}" failed. No retries permitted until 2026-04-21 04:26:28.082958476 +0000 UTC m=+140.078356079 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs") pod "router-default-f8d5bbc98-2ffvt" (UID: "e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5") : secret "router-metrics-certs-default" not found Apr 21 04:26:12.083177 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:12.083018 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle podName:e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5 nodeName:}" failed. No retries permitted until 2026-04-21 04:26:28.082999994 +0000 UTC m=+140.078397614 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle") pod "router-default-f8d5bbc98-2ffvt" (UID: "e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5") : configmap references non-existent config key: service-ca.crt Apr 21 04:26:17.325783 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:17.325740 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs\") pod \"network-metrics-daemon-lkbgz\" (UID: \"ebfdc6e1-781b-42a1-b442-ba40dfec626c\") " pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:26:17.328361 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:17.328330 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebfdc6e1-781b-42a1-b442-ba40dfec626c-metrics-certs\") pod \"network-metrics-daemon-lkbgz\" (UID: \"ebfdc6e1-781b-42a1-b442-ba40dfec626c\") " pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:26:17.528281 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:17.528250 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9c9js\"" Apr 21 04:26:17.536188 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:17.536165 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkbgz" Apr 21 04:26:17.655409 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:17.655375 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lkbgz"] Apr 21 04:26:17.658090 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:26:17.658059 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebfdc6e1_781b_42a1_b442_ba40dfec626c.slice/crio-b70a0ef5eb9b8f767804c3ac67b210946329b9586b91fff70b390a2236e4d13f WatchSource:0}: Error finding container b70a0ef5eb9b8f767804c3ac67b210946329b9586b91fff70b390a2236e4d13f: Status 404 returned error can't find the container with id b70a0ef5eb9b8f767804c3ac67b210946329b9586b91fff70b390a2236e4d13f Apr 21 04:26:18.006286 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:18.006252 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lkbgz" event={"ID":"ebfdc6e1-781b-42a1-b442-ba40dfec626c","Type":"ContainerStarted","Data":"b70a0ef5eb9b8f767804c3ac67b210946329b9586b91fff70b390a2236e4d13f"} Apr 21 04:26:19.011730 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:19.011685 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lkbgz" event={"ID":"ebfdc6e1-781b-42a1-b442-ba40dfec626c","Type":"ContainerStarted","Data":"014ede4db359251c302fdd13be7bb8f877c429295aeee1fa9d64da079c478c33"} Apr 21 04:26:20.015958 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:20.015929 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lkbgz" event={"ID":"ebfdc6e1-781b-42a1-b442-ba40dfec626c","Type":"ContainerStarted","Data":"6214f05fab87aa488804907c4459a5a6d1a1dec254481059f22d6c84c00a8f90"} Apr 21 04:26:20.032236 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:20.032189 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-lkbgz" podStartSLOduration=130.860169369 podStartE2EDuration="2m12.032173509s" podCreationTimestamp="2026-04-21 04:24:08 +0000 UTC" firstStartedPulling="2026-04-21 04:26:17.659680773 +0000 UTC m=+129.655078375" lastFinishedPulling="2026-04-21 04:26:18.831684899 +0000 UTC m=+130.827082515" observedRunningTime="2026-04-21 04:26:20.030661674 +0000 UTC m=+132.026059300" watchObservedRunningTime="2026-04-21 04:26:20.032173509 +0000 UTC m=+132.027571133" Apr 21 04:26:20.508900 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:20.508869 2579 scope.go:117] "RemoveContainer" containerID="efc1b53c4a8b29a5353a6ea5e9631cc8542ed49f02cbdbaf1b2cc7b432ae9c34" Apr 21 04:26:21.019742 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:21.019712 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/2.log" Apr 21 04:26:21.020165 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:21.020064 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/1.log" Apr 21 04:26:21.020165 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:21.020095 2579 generic.go:358] "Generic (PLEG): container finished" podID="0f1153f0-d097-4947-a74c-4824967f6184" containerID="b7559fadd1294a659df5e5037679e169853176873fbc34d6b9cddea55a7abb8f" exitCode=255 Apr 21 04:26:21.020268 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:21.020174 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc" event={"ID":"0f1153f0-d097-4947-a74c-4824967f6184","Type":"ContainerDied","Data":"b7559fadd1294a659df5e5037679e169853176873fbc34d6b9cddea55a7abb8f"} Apr 21 04:26:21.020268 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:21.020218 2579 scope.go:117] "RemoveContainer" 
containerID="efc1b53c4a8b29a5353a6ea5e9631cc8542ed49f02cbdbaf1b2cc7b432ae9c34" Apr 21 04:26:21.020671 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:21.020653 2579 scope.go:117] "RemoveContainer" containerID="b7559fadd1294a659df5e5037679e169853176873fbc34d6b9cddea55a7abb8f" Apr 21 04:26:21.020848 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:21.020823 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-gl6cc_openshift-console-operator(0f1153f0-d097-4947-a74c-4824967f6184)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc" podUID="0f1153f0-d097-4947-a74c-4824967f6184" Apr 21 04:26:22.024734 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:22.024671 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/2.log" Apr 21 04:26:23.428614 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.428567 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-mmwxg"] Apr 21 04:26:23.441964 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.441935 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.442898 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.442873 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mmwxg"] Apr 21 04:26:23.444441 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.444419 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qm4pr\"" Apr 21 04:26:23.444555 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.444444 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 04:26:23.445162 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.445144 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 04:26:23.455465 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.455446 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7fd975c7bf-tx9ck"] Apr 21 04:26:23.482097 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.482075 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fd975c7bf-tx9ck"] Apr 21 04:26:23.482217 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.482180 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.484519 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.484495 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v6cxl\"" Apr 21 04:26:23.484519 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.484517 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 04:26:23.484686 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.484520 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 04:26:23.484686 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.484637 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 04:26:23.491320 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.491302 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 04:26:23.575973 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.575936 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b78187ed-6b4c-4aa8-8ae4-326e14f2342e-data-volume\") pod \"insights-runtime-extractor-mmwxg\" (UID: \"b78187ed-6b4c-4aa8-8ae4-326e14f2342e\") " pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.575973 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.575973 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-image-registry-private-configuration\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: 
\"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.576194 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.576010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b78187ed-6b4c-4aa8-8ae4-326e14f2342e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mmwxg\" (UID: \"b78187ed-6b4c-4aa8-8ae4-326e14f2342e\") " pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.576194 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.576026 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s2bq\" (UniqueName: \"kubernetes.io/projected/b78187ed-6b4c-4aa8-8ae4-326e14f2342e-kube-api-access-5s2bq\") pod \"insights-runtime-extractor-mmwxg\" (UID: \"b78187ed-6b4c-4aa8-8ae4-326e14f2342e\") " pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.576194 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.576096 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-bound-sa-token\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.576194 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.576132 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-installation-pull-secrets\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.576194 ip-10-0-134-45 kubenswrapper[2579]: I0421 
04:26:23.576162 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bscj5\" (UniqueName: \"kubernetes.io/projected/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-kube-api-access-bscj5\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.576347 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.576201 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-trusted-ca\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.576347 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.576222 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b78187ed-6b4c-4aa8-8ae4-326e14f2342e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mmwxg\" (UID: \"b78187ed-6b4c-4aa8-8ae4-326e14f2342e\") " pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.576347 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.576265 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-registry-tls\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.576347 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.576309 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/b78187ed-6b4c-4aa8-8ae4-326e14f2342e-crio-socket\") pod \"insights-runtime-extractor-mmwxg\" (UID: \"b78187ed-6b4c-4aa8-8ae4-326e14f2342e\") " pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.576347 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.576339 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-ca-trust-extracted\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.576499 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.576370 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-registry-certificates\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.676758 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.676723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b78187ed-6b4c-4aa8-8ae4-326e14f2342e-data-volume\") pod \"insights-runtime-extractor-mmwxg\" (UID: \"b78187ed-6b4c-4aa8-8ae4-326e14f2342e\") " pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.676758 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.676758 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-image-registry-private-configuration\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " 
pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.676966 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.676778 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b78187ed-6b4c-4aa8-8ae4-326e14f2342e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mmwxg\" (UID: \"b78187ed-6b4c-4aa8-8ae4-326e14f2342e\") " pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.677070 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.677049 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2bq\" (UniqueName: \"kubernetes.io/projected/b78187ed-6b4c-4aa8-8ae4-326e14f2342e-kube-api-access-5s2bq\") pod \"insights-runtime-extractor-mmwxg\" (UID: \"b78187ed-6b4c-4aa8-8ae4-326e14f2342e\") " pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.677124 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.677090 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-bound-sa-token\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.677179 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.677130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-installation-pull-secrets\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.677179 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.677164 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bscj5\" 
(UniqueName: \"kubernetes.io/projected/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-kube-api-access-bscj5\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.677263 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.677170 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b78187ed-6b4c-4aa8-8ae4-326e14f2342e-data-volume\") pod \"insights-runtime-extractor-mmwxg\" (UID: \"b78187ed-6b4c-4aa8-8ae4-326e14f2342e\") " pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.677263 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.677199 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-trusted-ca\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.677263 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.677232 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b78187ed-6b4c-4aa8-8ae4-326e14f2342e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mmwxg\" (UID: \"b78187ed-6b4c-4aa8-8ae4-326e14f2342e\") " pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.677461 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.677292 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-registry-tls\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.677461 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:26:23.677355 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b78187ed-6b4c-4aa8-8ae4-326e14f2342e-crio-socket\") pod \"insights-runtime-extractor-mmwxg\" (UID: \"b78187ed-6b4c-4aa8-8ae4-326e14f2342e\") " pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.677461 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.677400 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-ca-trust-extracted\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.677461 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.677446 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-registry-certificates\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.677654 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.677548 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b78187ed-6b4c-4aa8-8ae4-326e14f2342e-crio-socket\") pod \"insights-runtime-extractor-mmwxg\" (UID: \"b78187ed-6b4c-4aa8-8ae4-326e14f2342e\") " pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.677708 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.677693 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b78187ed-6b4c-4aa8-8ae4-326e14f2342e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mmwxg\" (UID: 
\"b78187ed-6b4c-4aa8-8ae4-326e14f2342e\") " pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.678100 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.678034 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-ca-trust-extracted\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.678370 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.678342 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-registry-certificates\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.678895 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.678843 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-trusted-ca\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.679751 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.679731 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-image-registry-private-configuration\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.679833 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.679795 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b78187ed-6b4c-4aa8-8ae4-326e14f2342e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mmwxg\" (UID: \"b78187ed-6b4c-4aa8-8ae4-326e14f2342e\") " pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.679833 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.679813 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-installation-pull-secrets\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.680105 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.680086 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-registry-tls\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.685635 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.685611 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bscj5\" (UniqueName: \"kubernetes.io/projected/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-kube-api-access-bscj5\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.689021 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.688999 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b40fb2d-b3c9-4470-9ef8-62415d18d8f3-bound-sa-token\") pod \"image-registry-7fd975c7bf-tx9ck\" (UID: \"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3\") " pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" 
Apr 21 04:26:23.689632 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.689613 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s2bq\" (UniqueName: \"kubernetes.io/projected/b78187ed-6b4c-4aa8-8ae4-326e14f2342e-kube-api-access-5s2bq\") pod \"insights-runtime-extractor-mmwxg\" (UID: \"b78187ed-6b4c-4aa8-8ae4-326e14f2342e\") " pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.751230 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.751196 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mmwxg" Apr 21 04:26:23.791305 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.791272 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" Apr 21 04:26:23.874296 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.874242 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mmwxg"] Apr 21 04:26:23.878768 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:26:23.878735 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb78187ed_6b4c_4aa8_8ae4_326e14f2342e.slice/crio-9aec78870ce245e92702684705aaa0d63d8c98d1f82e71ef99a1b45e86cada77 WatchSource:0}: Error finding container 9aec78870ce245e92702684705aaa0d63d8c98d1f82e71ef99a1b45e86cada77: Status 404 returned error can't find the container with id 9aec78870ce245e92702684705aaa0d63d8c98d1f82e71ef99a1b45e86cada77 Apr 21 04:26:23.926433 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:23.926398 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fd975c7bf-tx9ck"] Apr 21 04:26:23.930022 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:26:23.929993 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b40fb2d_b3c9_4470_9ef8_62415d18d8f3.slice/crio-6bb650e46182cedf2f084e5881234bf37c3f6e7f4b268f44df414de01611e935 WatchSource:0}: Error finding container 6bb650e46182cedf2f084e5881234bf37c3f6e7f4b268f44df414de01611e935: Status 404 returned error can't find the container with id 6bb650e46182cedf2f084e5881234bf37c3f6e7f4b268f44df414de01611e935
Apr 21 04:26:24.030343 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:24.030316 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mmwxg" event={"ID":"b78187ed-6b4c-4aa8-8ae4-326e14f2342e","Type":"ContainerStarted","Data":"b7105c493c8fd99cd983f3ffb65eaf0075658868ffb5e22e1641d1f1e794cc27"}
Apr 21 04:26:24.030460 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:24.030353 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mmwxg" event={"ID":"b78187ed-6b4c-4aa8-8ae4-326e14f2342e","Type":"ContainerStarted","Data":"9aec78870ce245e92702684705aaa0d63d8c98d1f82e71ef99a1b45e86cada77"}
Apr 21 04:26:24.031161 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:24.031134 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" event={"ID":"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3","Type":"ContainerStarted","Data":"6bb650e46182cedf2f084e5881234bf37c3f6e7f4b268f44df414de01611e935"}
Apr 21 04:26:25.034925 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:25.034834 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mmwxg" event={"ID":"b78187ed-6b4c-4aa8-8ae4-326e14f2342e","Type":"ContainerStarted","Data":"a2820382cc1ccfc130a2d393b4884fc90588620fe66712223e2c884b0104e66e"}
Apr 21 04:26:25.035951 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:25.035930 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" event={"ID":"8b40fb2d-b3c9-4470-9ef8-62415d18d8f3","Type":"ContainerStarted","Data":"55987adb03bc96010b1601b1a19d459041e3338f7b8f5cbe6604aa33788db935"}
Apr 21 04:26:25.036105 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:25.036093 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck"
Apr 21 04:26:25.055029 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:25.054964 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" podStartSLOduration=2.05495237 podStartE2EDuration="2.05495237s" podCreationTimestamp="2026-04-21 04:26:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:26:25.053597482 +0000 UTC m=+137.048995106" watchObservedRunningTime="2026-04-21 04:26:25.05495237 +0000 UTC m=+137.050349995"
Apr 21 04:26:26.736769 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:26.736733 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:26:26.736769 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:26.736767 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:26:26.737196 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:26.737157 2579 scope.go:117] "RemoveContainer" containerID="b7559fadd1294a659df5e5037679e169853176873fbc34d6b9cddea55a7abb8f"
Apr 21 04:26:26.737359 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:26.737341 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-gl6cc_openshift-console-operator(0f1153f0-d097-4947-a74c-4824967f6184)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc" podUID="0f1153f0-d097-4947-a74c-4824967f6184"
Apr 21 04:26:27.044489 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:27.044404 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mmwxg" event={"ID":"b78187ed-6b4c-4aa8-8ae4-326e14f2342e","Type":"ContainerStarted","Data":"74e5a03a38b60328b90f1571b99dc22dfa4bd222bdb71ff3a1e116cd1ae263d5"}
Apr 21 04:26:27.063522 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:27.063463 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-mmwxg" podStartSLOduration=1.877491435 podStartE2EDuration="4.063444195s" podCreationTimestamp="2026-04-21 04:26:23 +0000 UTC" firstStartedPulling="2026-04-21 04:26:24.022697072 +0000 UTC m=+136.018094682" lastFinishedPulling="2026-04-21 04:26:26.208649838 +0000 UTC m=+138.204047442" observedRunningTime="2026-04-21 04:26:27.061777222 +0000 UTC m=+139.057174848" watchObservedRunningTime="2026-04-21 04:26:27.063444195 +0000 UTC m=+139.058841822"
Apr 21 04:26:28.117872 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:28.117831 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:26:28.118355 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:28.117911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qbhw5\" (UID: \"0bfc0f36-e3dc-4add-97d2-eef31c8c896d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"
Apr 21 04:26:28.118355 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:28.117961 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:26:28.121172 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:28.121078 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-metrics-certs\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:26:28.121886 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:28.121861 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5-service-ca-bundle\") pod \"router-default-f8d5bbc98-2ffvt\" (UID: \"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5\") " pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:26:28.123465 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:28.123444 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0bfc0f36-e3dc-4add-97d2-eef31c8c896d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qbhw5\" (UID: \"0bfc0f36-e3dc-4add-97d2-eef31c8c896d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"
Apr 21 04:26:28.127654 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:28.127635 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9lrfv\""
Apr 21 04:26:28.136405 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:28.136386 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:26:28.253354 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:28.253322 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-f8d5bbc98-2ffvt"]
Apr 21 04:26:28.256851 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:26:28.256822 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode407ea03_e7e2_4851_9eb4_f4d38ed0e3b5.slice/crio-39d5dba809854f3ba8bd75be10ace9dab9dce1bbfe0c47ba29ad3b780902f2ca WatchSource:0}: Error finding container 39d5dba809854f3ba8bd75be10ace9dab9dce1bbfe0c47ba29ad3b780902f2ca: Status 404 returned error can't find the container with id 39d5dba809854f3ba8bd75be10ace9dab9dce1bbfe0c47ba29ad3b780902f2ca
Apr 21 04:26:28.421265 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:28.421179 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-z2886\""
Apr 21 04:26:28.429116 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:28.429096 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"
Apr 21 04:26:28.562704 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:28.562682 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5"]
Apr 21 04:26:28.564572 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:26:28.564547 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bfc0f36_e3dc_4add_97d2_eef31c8c896d.slice/crio-7fbc801c0c1998cc7fec6e145980d913fc78c20908574a97efa3c917ed336bb3 WatchSource:0}: Error finding container 7fbc801c0c1998cc7fec6e145980d913fc78c20908574a97efa3c917ed336bb3: Status 404 returned error can't find the container with id 7fbc801c0c1998cc7fec6e145980d913fc78c20908574a97efa3c917ed336bb3
Apr 21 04:26:29.050912 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:29.050878 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5" event={"ID":"0bfc0f36-e3dc-4add-97d2-eef31c8c896d","Type":"ContainerStarted","Data":"7fbc801c0c1998cc7fec6e145980d913fc78c20908574a97efa3c917ed336bb3"}
Apr 21 04:26:29.052150 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:29.052114 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-f8d5bbc98-2ffvt" event={"ID":"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5","Type":"ContainerStarted","Data":"f37f298abd5dba9885f606f7e790d499a246f30276c5e0612acf8e472e234af1"}
Apr 21 04:26:29.052150 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:29.052150 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-f8d5bbc98-2ffvt" event={"ID":"e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5","Type":"ContainerStarted","Data":"39d5dba809854f3ba8bd75be10ace9dab9dce1bbfe0c47ba29ad3b780902f2ca"}
Apr 21 04:26:29.071631 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:29.071579 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-f8d5bbc98-2ffvt" podStartSLOduration=33.071564377 podStartE2EDuration="33.071564377s" podCreationTimestamp="2026-04-21 04:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:26:29.070398153 +0000 UTC m=+141.065795777" watchObservedRunningTime="2026-04-21 04:26:29.071564377 +0000 UTC m=+141.066962003"
Apr 21 04:26:29.137087 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:29.137056 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:26:29.139615 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:29.139593 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:26:30.054700 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:30.054669 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:26:30.056200 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:30.056177 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-f8d5bbc98-2ffvt"
Apr 21 04:26:31.058324 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:31.058280 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5" event={"ID":"0bfc0f36-e3dc-4add-97d2-eef31c8c896d","Type":"ContainerStarted","Data":"19a03d2a85047e56f9b4538b0e152616227753a078b1d69d30cd41bd76cd1b55"}
Apr 21 04:26:31.075200 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:31.075149 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qbhw5" podStartSLOduration=33.253079068 podStartE2EDuration="35.075134756s" podCreationTimestamp="2026-04-21 04:25:56 +0000 UTC" firstStartedPulling="2026-04-21 04:26:28.566343563 +0000 UTC m=+140.561741169" lastFinishedPulling="2026-04-21 04:26:30.388399251 +0000 UTC m=+142.383796857" observedRunningTime="2026-04-21 04:26:31.07416041 +0000 UTC m=+143.069558036" watchObservedRunningTime="2026-04-21 04:26:31.075134756 +0000 UTC m=+143.070532383"
Apr 21 04:26:38.275309 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.275273 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"]
Apr 21 04:26:38.279344 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.279320 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-mnh2b"]
Apr 21 04:26:38.279527 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.279505 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"
Apr 21 04:26:38.282620 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.282597 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.282896 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.282872 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 21 04:26:38.283117 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.283095 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 04:26:38.283403 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.283384 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 21 04:26:38.283513 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.283385 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-6qbjc\""
Apr 21 04:26:38.285104 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.285081 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 04:26:38.285337 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.285318 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-bm2fq\""
Apr 21 04:26:38.285417 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.285340 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 04:26:38.285672 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.285651 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 04:26:38.289812 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.289779 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"]
Apr 21 04:26:38.300779 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.300758 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93066b03-f6c2-4051-8789-c5995156fed3-root\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.300883 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.300787 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/344d432b-3faa-4a09-831b-42861da22f34-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5kxmv\" (UID: \"344d432b-3faa-4a09-831b-42861da22f34\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"
Apr 21 04:26:38.300883 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.300810 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qqcg\" (UniqueName: \"kubernetes.io/projected/344d432b-3faa-4a09-831b-42861da22f34-kube-api-access-5qqcg\") pod \"openshift-state-metrics-9d44df66c-5kxmv\" (UID: \"344d432b-3faa-4a09-831b-42861da22f34\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"
Apr 21 04:26:38.300883 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.300831 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx6rz\" (UniqueName: \"kubernetes.io/projected/93066b03-f6c2-4051-8789-c5995156fed3-kube-api-access-rx6rz\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.300883 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.300875 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-accelerators-collector-config\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.301108 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.300936 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/344d432b-3faa-4a09-831b-42861da22f34-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5kxmv\" (UID: \"344d432b-3faa-4a09-831b-42861da22f34\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"
Apr 21 04:26:38.301108 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.300955 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-tls\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.301108 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.301010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/344d432b-3faa-4a09-831b-42861da22f34-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5kxmv\" (UID: \"344d432b-3faa-4a09-831b-42861da22f34\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"
Apr 21 04:26:38.301108 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.301049 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-wtmp\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.301108 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.301082 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93066b03-f6c2-4051-8789-c5995156fed3-sys\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.301108 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.301108 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-textfile\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.301377 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.301132 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.301377 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.301180 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93066b03-f6c2-4051-8789-c5995156fed3-metrics-client-ca\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.402315 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.402279 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qqcg\" (UniqueName: \"kubernetes.io/projected/344d432b-3faa-4a09-831b-42861da22f34-kube-api-access-5qqcg\") pod \"openshift-state-metrics-9d44df66c-5kxmv\" (UID: \"344d432b-3faa-4a09-831b-42861da22f34\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"
Apr 21 04:26:38.402489 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.402320 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rx6rz\" (UniqueName: \"kubernetes.io/projected/93066b03-f6c2-4051-8789-c5995156fed3-kube-api-access-rx6rz\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.402569 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.402547 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-accelerators-collector-config\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.402657 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.402641 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/344d432b-3faa-4a09-831b-42861da22f34-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5kxmv\" (UID: \"344d432b-3faa-4a09-831b-42861da22f34\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"
Apr 21 04:26:38.402693 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.402668 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-tls\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.402743 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.402696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/344d432b-3faa-4a09-831b-42861da22f34-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5kxmv\" (UID: \"344d432b-3faa-4a09-831b-42861da22f34\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"
Apr 21 04:26:38.402743 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.402715 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-wtmp\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.402843 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.402743 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93066b03-f6c2-4051-8789-c5995156fed3-sys\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.402843 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.402777 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-textfile\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.402843 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.402810 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.403007 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.402850 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93066b03-f6c2-4051-8789-c5995156fed3-metrics-client-ca\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.403007 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:38.402861 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 21 04:26:38.403007 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.402885 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93066b03-f6c2-4051-8789-c5995156fed3-root\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.403007 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.402909 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/344d432b-3faa-4a09-831b-42861da22f34-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5kxmv\" (UID: \"344d432b-3faa-4a09-831b-42861da22f34\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"
Apr 21 04:26:38.403007 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:38.402940 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-tls podName:93066b03-f6c2-4051-8789-c5995156fed3 nodeName:}" failed. No retries permitted until 2026-04-21 04:26:38.902916303 +0000 UTC m=+150.898313908 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-tls") pod "node-exporter-mnh2b" (UID: "93066b03-f6c2-4051-8789-c5995156fed3") : secret "node-exporter-tls" not found
Apr 21 04:26:38.403308 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.403284 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-wtmp\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.403360 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.403290 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-accelerators-collector-config\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.403466 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.403439 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-textfile\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.404283 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.403490 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/344d432b-3faa-4a09-831b-42861da22f34-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5kxmv\" (UID: \"344d432b-3faa-4a09-831b-42861da22f34\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"
Apr 21 04:26:38.404283 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.403560 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93066b03-f6c2-4051-8789-c5995156fed3-root\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.404283 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.403566 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93066b03-f6c2-4051-8789-c5995156fed3-sys\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.404283 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.403831 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93066b03-f6c2-4051-8789-c5995156fed3-metrics-client-ca\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.405558 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.405534 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/344d432b-3faa-4a09-831b-42861da22f34-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5kxmv\" (UID: \"344d432b-3faa-4a09-831b-42861da22f34\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"
Apr 21 04:26:38.405558 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.405543 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/344d432b-3faa-4a09-831b-42861da22f34-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5kxmv\" (UID: \"344d432b-3faa-4a09-831b-42861da22f34\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"
Apr 21 04:26:38.405683 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.405567 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.412817 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.412793 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx6rz\" (UniqueName: \"kubernetes.io/projected/93066b03-f6c2-4051-8789-c5995156fed3-kube-api-access-rx6rz\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.413312 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.413291 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qqcg\" (UniqueName: \"kubernetes.io/projected/344d432b-3faa-4a09-831b-42861da22f34-kube-api-access-5qqcg\") pod \"openshift-state-metrics-9d44df66c-5kxmv\" (UID: \"344d432b-3faa-4a09-831b-42861da22f34\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"
Apr 21 04:26:38.510434 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.510404 2579 scope.go:117] "RemoveContainer" containerID="b7559fadd1294a659df5e5037679e169853176873fbc34d6b9cddea55a7abb8f"
Apr 21 04:26:38.510620 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:38.510607 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-gl6cc_openshift-console-operator(0f1153f0-d097-4947-a74c-4824967f6184)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc" podUID="0f1153f0-d097-4947-a74c-4824967f6184"
Apr 21 04:26:38.592964 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.592936 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"
Apr 21 04:26:38.722909 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.722826 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv"]
Apr 21 04:26:38.725447 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:26:38.725418 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod344d432b_3faa_4a09_831b_42861da22f34.slice/crio-dce3e6912e57ed90623ba33bcadcb31311f419717a2f847c39b7af87f03b38c1 WatchSource:0}: Error finding container dce3e6912e57ed90623ba33bcadcb31311f419717a2f847c39b7af87f03b38c1: Status 404 returned error can't find the container with id dce3e6912e57ed90623ba33bcadcb31311f419717a2f847c39b7af87f03b38c1
Apr 21 04:26:38.907454 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.907357 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-tls\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:38.909608 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:38.909588 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93066b03-f6c2-4051-8789-c5995156fed3-node-exporter-tls\") pod \"node-exporter-mnh2b\" (UID: \"93066b03-f6c2-4051-8789-c5995156fed3\") " pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:39.080712 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.080671 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv" event={"ID":"344d432b-3faa-4a09-831b-42861da22f34","Type":"ContainerStarted","Data":"e1497ee490ecb683d5508b89e017dfc4870909164e0938906fe2378144f7845a"}
Apr 21 04:26:39.080712 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.080711 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv" event={"ID":"344d432b-3faa-4a09-831b-42861da22f34","Type":"ContainerStarted","Data":"efcb03849f1827d0e037ddf8bbe78500e57bbfca9dc45822cd3423f89ab7c623"}
Apr 21 04:26:39.080712 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.080721 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv" event={"ID":"344d432b-3faa-4a09-831b-42861da22f34","Type":"ContainerStarted","Data":"dce3e6912e57ed90623ba33bcadcb31311f419717a2f847c39b7af87f03b38c1"}
Apr 21 04:26:39.199674 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.199591 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mnh2b"
Apr 21 04:26:39.209108 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:26:39.209073 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93066b03_f6c2_4051_8789_c5995156fed3.slice/crio-5ea312148613eaa6b838305b994d93674979798ca5dd7fcdcda7130cbf408186 WatchSource:0}: Error finding container 5ea312148613eaa6b838305b994d93674979798ca5dd7fcdcda7130cbf408186: Status 404 returned error can't find the container with id 5ea312148613eaa6b838305b994d93674979798ca5dd7fcdcda7130cbf408186
Apr 21 04:26:39.335470 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.335432 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 04:26:39.340198 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.340176 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:26:39.342422 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.342398 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 21 04:26:39.343070 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.342791 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 21 04:26:39.343070 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.342801 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 21 04:26:39.343070 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.342841 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-cml9v\""
Apr 21 04:26:39.343070 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.342904 2579
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 04:26:39.343070 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.342911 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 04:26:39.343070 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.342792 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 04:26:39.343359 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.343259 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 04:26:39.343359 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.343261 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 04:26:39.343629 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.343609 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 04:26:39.352717 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.352682 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:26:39.411522 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.411480 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.411695 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.411531 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.411695 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.411611 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.411815 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.411729 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.411815 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.411765 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-config-volume\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.411815 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.411794 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-alertmanager-trusted-ca-bundle\") pod 
\"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.411961 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.411833 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.411961 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.411863 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.411961 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.411894 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.411961 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.411924 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.411961 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.411951 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-config-out\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.412196 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.411995 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-web-config\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.412196 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.412023 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbz75\" (UniqueName: \"kubernetes.io/projected/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-kube-api-access-zbz75\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.513304 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.513215 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.513304 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.513267 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-config-volume\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.513304 ip-10-0-134-45 kubenswrapper[2579]: I0421 
04:26:39.513289 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.513670 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.513325 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.513670 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.513355 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.513670 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.513387 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.513670 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.513416 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.513670 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.513445 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-config-out\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.513670 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.513478 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-web-config\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.513670 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.513502 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbz75\" (UniqueName: \"kubernetes.io/projected/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-kube-api-access-zbz75\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.513670 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.513525 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.513670 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.513554 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.513670 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.513605 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.515559 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.515003 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.515559 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.515224 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.515559 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.515261 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.515559 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:39.515334 2579 
secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 21 04:26:39.515559 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:39.515400 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-main-tls podName:1703f6b0-c1fc-4050-93a7-2ba10c81efc0 nodeName:}" failed. No retries permitted until 2026-04-21 04:26:40.015379328 +0000 UTC m=+152.010776948 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "1703f6b0-c1fc-4050-93a7-2ba10c81efc0") : secret "alertmanager-main-tls" not found Apr 21 04:26:39.518968 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.518941 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.519095 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.519050 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-web-config\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.519519 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.519483 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.520189 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.520136 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-config-volume\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.520289 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.520230 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.520346 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.520329 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.521217 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.521182 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.521602 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.521582 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbz75\" (UniqueName: \"kubernetes.io/projected/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-kube-api-access-zbz75\") pod \"alertmanager-main-0\" 
(UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:39.522828 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:39.522805 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-config-out\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:40.019144 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:40.019094 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:40.021563 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:40.021530 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:40.084376 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:40.084344 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mnh2b" event={"ID":"93066b03-f6c2-4051-8789-c5995156fed3","Type":"ContainerStarted","Data":"5ea312148613eaa6b838305b994d93674979798ca5dd7fcdcda7130cbf408186"} Apr 21 04:26:40.254461 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:40.254427 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:26:40.403790 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:40.403756 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:26:40.404861 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:26:40.404831 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1703f6b0_c1fc_4050_93a7_2ba10c81efc0.slice/crio-493ba14c9d2d29ae34831b2e12b7a58201f4252d6769392f20e7c094750c0ebd WatchSource:0}: Error finding container 493ba14c9d2d29ae34831b2e12b7a58201f4252d6769392f20e7c094750c0ebd: Status 404 returned error can't find the container with id 493ba14c9d2d29ae34831b2e12b7a58201f4252d6769392f20e7c094750c0ebd Apr 21 04:26:41.089256 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.089218 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv" event={"ID":"344d432b-3faa-4a09-831b-42861da22f34","Type":"ContainerStarted","Data":"2bf9b39b1cf3732eba445f3b71c2be053b57ff0895e0d0fa050fdde5bcf0785a"} Apr 21 04:26:41.090319 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.090287 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1703f6b0-c1fc-4050-93a7-2ba10c81efc0","Type":"ContainerStarted","Data":"493ba14c9d2d29ae34831b2e12b7a58201f4252d6769392f20e7c094750c0ebd"} Apr 21 04:26:41.091754 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.091723 2579 generic.go:358] "Generic (PLEG): container finished" podID="93066b03-f6c2-4051-8789-c5995156fed3" containerID="6345a2a66e3ade049c211dc438b64fb47dbc1e01dfbf42d062e44a46bb21ce98" exitCode=0 Apr 21 04:26:41.091843 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.091768 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mnh2b" 
event={"ID":"93066b03-f6c2-4051-8789-c5995156fed3","Type":"ContainerDied","Data":"6345a2a66e3ade049c211dc438b64fb47dbc1e01dfbf42d062e44a46bb21ce98"} Apr 21 04:26:41.105960 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.105916 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5kxmv" podStartSLOduration=1.913077973 podStartE2EDuration="3.105899282s" podCreationTimestamp="2026-04-21 04:26:38 +0000 UTC" firstStartedPulling="2026-04-21 04:26:38.839800367 +0000 UTC m=+150.835197970" lastFinishedPulling="2026-04-21 04:26:40.032621677 +0000 UTC m=+152.028019279" observedRunningTime="2026-04-21 04:26:41.103898114 +0000 UTC m=+153.099295741" watchObservedRunningTime="2026-04-21 04:26:41.105899282 +0000 UTC m=+153.101296909" Apr 21 04:26:41.347189 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.347086 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-d576574b4-xrm2c"] Apr 21 04:26:41.351045 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.351017 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c" Apr 21 04:26:41.353374 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.353344 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 21 04:26:41.353515 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.353415 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 21 04:26:41.353515 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.353346 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 21 04:26:41.353628 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.353603 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 21 04:26:41.353686 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.353641 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 21 04:26:41.353744 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.353696 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-8bmx5\"" Apr 21 04:26:41.353814 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.353792 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-4mpc21qvlf6p\"" Apr 21 04:26:41.361832 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.361790 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-d576574b4-xrm2c"] Apr 21 04:26:41.438739 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.438707 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c" Apr 21 04:26:41.439186 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.438746 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-grpc-tls\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c" Apr 21 04:26:41.439186 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.438776 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c" Apr 21 04:26:41.439186 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.438922 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-thanos-querier-tls\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c" Apr 21 04:26:41.439186 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.439041 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.439186 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.439072 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l56rl\" (UniqueName: \"kubernetes.io/projected/183f0560-9cb9-4beb-97c2-c870a1526a12-kube-api-access-l56rl\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.439186 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.439144 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/183f0560-9cb9-4beb-97c2-c870a1526a12-metrics-client-ca\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.439423 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.439188 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.539778 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.539743 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-thanos-querier-tls\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.540084 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.539833 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.540084 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.539864 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l56rl\" (UniqueName: \"kubernetes.io/projected/183f0560-9cb9-4beb-97c2-c870a1526a12-kube-api-access-l56rl\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.540084 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.539919 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/183f0560-9cb9-4beb-97c2-c870a1526a12-metrics-client-ca\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.540084 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.539948 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.540084 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.539996 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.540405 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.540351 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-grpc-tls\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.540405 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.540399 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.541205 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.541178 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/183f0560-9cb9-4beb-97c2-c870a1526a12-metrics-client-ca\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.543217 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.543170 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-thanos-querier-tls\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.543725 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.543674 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-grpc-tls\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.544112 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.544085 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.544997 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.544962 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.545394 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.545367 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.545497 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.545476 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/183f0560-9cb9-4beb-97c2-c870a1526a12-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.552003 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.551963 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l56rl\" (UniqueName: \"kubernetes.io/projected/183f0560-9cb9-4beb-97c2-c870a1526a12-kube-api-access-l56rl\") pod \"thanos-querier-d576574b4-xrm2c\" (UID: \"183f0560-9cb9-4beb-97c2-c870a1526a12\") " pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.663162 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.663067 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:41.796537 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:41.796503 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-d576574b4-xrm2c"]
Apr 21 04:26:41.799792 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:26:41.799762 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod183f0560_9cb9_4beb_97c2_c870a1526a12.slice/crio-5cc57a85fc8c475dfbf072986292dd042dceb99e655b891976c8dcc98a759ea0 WatchSource:0}: Error finding container 5cc57a85fc8c475dfbf072986292dd042dceb99e655b891976c8dcc98a759ea0: Status 404 returned error can't find the container with id 5cc57a85fc8c475dfbf072986292dd042dceb99e655b891976c8dcc98a759ea0
Apr 21 04:26:42.095908 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:42.095875 2579 generic.go:358] "Generic (PLEG): container finished" podID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerID="22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8" exitCode=0
Apr 21 04:26:42.096069 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:42.095943 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1703f6b0-c1fc-4050-93a7-2ba10c81efc0","Type":"ContainerDied","Data":"22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8"}
Apr 21 04:26:42.097906 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:42.097885 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mnh2b" event={"ID":"93066b03-f6c2-4051-8789-c5995156fed3","Type":"ContainerStarted","Data":"7243c2224212de696864ac5e2efbf1dd7458bac54746e8458b021b94f990207a"}
Apr 21 04:26:42.098010 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:42.097911 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mnh2b" event={"ID":"93066b03-f6c2-4051-8789-c5995156fed3","Type":"ContainerStarted","Data":"f8a2588de899e9fd29784caee9a1abd30c2e0e187ad6099b3d2692c3de8061ac"}
Apr 21 04:26:42.098936 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:42.098890 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c" event={"ID":"183f0560-9cb9-4beb-97c2-c870a1526a12","Type":"ContainerStarted","Data":"5cc57a85fc8c475dfbf072986292dd042dceb99e655b891976c8dcc98a759ea0"}
Apr 21 04:26:42.140638 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:42.140588 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-mnh2b" podStartSLOduration=3.3162834500000002 podStartE2EDuration="4.140574393s" podCreationTimestamp="2026-04-21 04:26:38 +0000 UTC" firstStartedPulling="2026-04-21 04:26:39.210831518 +0000 UTC m=+151.206229122" lastFinishedPulling="2026-04-21 04:26:40.035122462 +0000 UTC m=+152.030520065" observedRunningTime="2026-04-21 04:26:42.139523935 +0000 UTC m=+154.134921561" watchObservedRunningTime="2026-04-21 04:26:42.140574393 +0000 UTC m=+154.135972017"
Apr 21 04:26:43.795484 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:43.795446 2579 patch_prober.go:28] interesting pod/image-registry-7fd975c7bf-tx9ck container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 21 04:26:43.795850 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:43.795509 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck" podUID="8b40fb2d-b3c9-4470-9ef8-62415d18d8f3" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:26:43.992497 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:43.992449 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-rnhvz" podUID="12bdc8fa-32f4-4a0d-85b9-2e5c19724e76"
Apr 21 04:26:44.007286 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:26:44.007254 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-p27tc" podUID="83f15b2a-86b0-4400-a5f1-ff037093ddcc"
Apr 21 04:26:44.106799 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:44.106744 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c" event={"ID":"183f0560-9cb9-4beb-97c2-c870a1526a12","Type":"ContainerStarted","Data":"5ab5e1c24e53364dd6275bf297db681cffa0c5c9974c0a5c96efd27ea7fed0af"}
Apr 21 04:26:44.109132 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:44.109096 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1703f6b0-c1fc-4050-93a7-2ba10c81efc0","Type":"ContainerStarted","Data":"97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937"}
Apr 21 04:26:44.109132 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:44.109122 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rnhvz"
Apr 21 04:26:45.113049 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:45.113015 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c" event={"ID":"183f0560-9cb9-4beb-97c2-c870a1526a12","Type":"ContainerStarted","Data":"b34a42c1b5c0ec45f8fa8584d0ea259d96642ac987549739d5572041922a643f"}
Apr 21 04:26:45.113387 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:45.113059 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c" event={"ID":"183f0560-9cb9-4beb-97c2-c870a1526a12","Type":"ContainerStarted","Data":"bb5fb1511d02fc22c1d8b85ba792e6a821ec54c8114d411d9dec64436bfa84f3"}
Apr 21 04:26:45.115388 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:45.115361 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1703f6b0-c1fc-4050-93a7-2ba10c81efc0","Type":"ContainerStarted","Data":"ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7"}
Apr 21 04:26:45.115488 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:45.115394 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1703f6b0-c1fc-4050-93a7-2ba10c81efc0","Type":"ContainerStarted","Data":"56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a"}
Apr 21 04:26:45.115488 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:45.115409 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1703f6b0-c1fc-4050-93a7-2ba10c81efc0","Type":"ContainerStarted","Data":"0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d"}
Apr 21 04:26:45.115488 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:45.115420 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1703f6b0-c1fc-4050-93a7-2ba10c81efc0","Type":"ContainerStarted","Data":"1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9"}
Apr 21 04:26:46.043122 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:46.043088 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7fd975c7bf-tx9ck"
Apr 21 04:26:46.120549 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:46.120507 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1703f6b0-c1fc-4050-93a7-2ba10c81efc0","Type":"ContainerStarted","Data":"b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e"}
Apr 21 04:26:46.123054 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:46.123029 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c" event={"ID":"183f0560-9cb9-4beb-97c2-c870a1526a12","Type":"ContainerStarted","Data":"42beb809696c2d13f416526077a1418f740f908c42f4d242ce6737f00a9c7ee7"}
Apr 21 04:26:46.123054 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:46.123058 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c" event={"ID":"183f0560-9cb9-4beb-97c2-c870a1526a12","Type":"ContainerStarted","Data":"03249f5dd7e2da8a492cbb912980176fdfcb07b88dbb7156a6f52caec48ca336"}
Apr 21 04:26:46.123219 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:46.123067 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c" event={"ID":"183f0560-9cb9-4beb-97c2-c870a1526a12","Type":"ContainerStarted","Data":"2b35735d4e62a9d524da736f7954ba7d3767632b0569fd69ad91a127351ea0ad"}
Apr 21 04:26:46.123219 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:46.123207 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:46.147560 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:46.147512 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.4774808999999998 podStartE2EDuration="7.147497836s" podCreationTimestamp="2026-04-21 04:26:39 +0000 UTC" firstStartedPulling="2026-04-21 04:26:40.406837378 +0000 UTC m=+152.402234996" lastFinishedPulling="2026-04-21 04:26:45.076854326 +0000 UTC m=+157.072251932" observedRunningTime="2026-04-21 04:26:46.146451533 +0000 UTC m=+158.141849158" watchObservedRunningTime="2026-04-21 04:26:46.147497836 +0000 UTC m=+158.142895460"
Apr 21 04:26:46.166458 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:46.166397 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c" podStartSLOduration=1.8893870179999999 podStartE2EDuration="5.166369487s" podCreationTimestamp="2026-04-21 04:26:41 +0000 UTC" firstStartedPulling="2026-04-21 04:26:41.801601648 +0000 UTC m=+153.796999255" lastFinishedPulling="2026-04-21 04:26:45.078584118 +0000 UTC m=+157.073981724" observedRunningTime="2026-04-21 04:26:46.16536434 +0000 UTC m=+158.160761964" watchObservedRunningTime="2026-04-21 04:26:46.166369487 +0000 UTC m=+158.161767112"
Apr 21 04:26:48.810885 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:48.810843 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz"
Apr 21 04:26:48.813253 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:48.813232 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12bdc8fa-32f4-4a0d-85b9-2e5c19724e76-metrics-tls\") pod \"dns-default-rnhvz\" (UID: \"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76\") " pod="openshift-dns/dns-default-rnhvz"
Apr 21 04:26:48.912021 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:48.911968 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vk74d\""
Apr 21 04:26:48.912021 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:48.912008 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert\") pod \"ingress-canary-p27tc\" (UID: \"83f15b2a-86b0-4400-a5f1-ff037093ddcc\") " pod="openshift-ingress-canary/ingress-canary-p27tc"
Apr 21 04:26:48.914325 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:48.914305 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83f15b2a-86b0-4400-a5f1-ff037093ddcc-cert\") pod \"ingress-canary-p27tc\" (UID: \"83f15b2a-86b0-4400-a5f1-ff037093ddcc\") " pod="openshift-ingress-canary/ingress-canary-p27tc"
Apr 21 04:26:48.920205 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:48.920185 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rnhvz"
Apr 21 04:26:49.038866 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:49.038829 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rnhvz"]
Apr 21 04:26:49.041798 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:26:49.041763 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12bdc8fa_32f4_4a0d_85b9_2e5c19724e76.slice/crio-1759239689d61caf8ef3c7ae4aad995f9ddaaeb17391e23322b674f487e671db WatchSource:0}: Error finding container 1759239689d61caf8ef3c7ae4aad995f9ddaaeb17391e23322b674f487e671db: Status 404 returned error can't find the container with id 1759239689d61caf8ef3c7ae4aad995f9ddaaeb17391e23322b674f487e671db
Apr 21 04:26:49.132035 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:49.132004 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rnhvz" event={"ID":"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76","Type":"ContainerStarted","Data":"1759239689d61caf8ef3c7ae4aad995f9ddaaeb17391e23322b674f487e671db"}
Apr 21 04:26:49.508468 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:49.508389 2579 scope.go:117] "RemoveContainer" containerID="b7559fadd1294a659df5e5037679e169853176873fbc34d6b9cddea55a7abb8f"
Apr 21 04:26:49.974967 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:49.974909 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-l2wqz"]
Apr 21 04:26:49.978631 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:49.978606 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-l2wqz"
Apr 21 04:26:49.980890 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:49.980864 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 21 04:26:49.981060 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:49.980927 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-d4dxd\""
Apr 21 04:26:49.981060 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:49.980960 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 21 04:26:49.987262 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:49.987042 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-l2wqz"]
Apr 21 04:26:50.023941 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:50.023811 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96dfk\" (UniqueName: \"kubernetes.io/projected/3f33fd64-8497-4fb4-8b9b-6b34fd8564fa-kube-api-access-96dfk\") pod \"downloads-6bcc868b7-l2wqz\" (UID: \"3f33fd64-8497-4fb4-8b9b-6b34fd8564fa\") " pod="openshift-console/downloads-6bcc868b7-l2wqz"
Apr 21 04:26:50.125039 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:50.124975 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96dfk\" (UniqueName: \"kubernetes.io/projected/3f33fd64-8497-4fb4-8b9b-6b34fd8564fa-kube-api-access-96dfk\") pod \"downloads-6bcc868b7-l2wqz\" (UID: \"3f33fd64-8497-4fb4-8b9b-6b34fd8564fa\") " pod="openshift-console/downloads-6bcc868b7-l2wqz"
Apr 21 04:26:50.133638 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:50.133607 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96dfk\" (UniqueName: \"kubernetes.io/projected/3f33fd64-8497-4fb4-8b9b-6b34fd8564fa-kube-api-access-96dfk\") pod \"downloads-6bcc868b7-l2wqz\" (UID: \"3f33fd64-8497-4fb4-8b9b-6b34fd8564fa\") " pod="openshift-console/downloads-6bcc868b7-l2wqz"
Apr 21 04:26:50.137072 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:50.137050 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/2.log"
Apr 21 04:26:50.137218 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:50.137126 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc" event={"ID":"0f1153f0-d097-4947-a74c-4824967f6184","Type":"ContainerStarted","Data":"71b81aa44a4cdce22e20de107921188240675d93e47c3a7b60fad44c41f73ad1"}
Apr 21 04:26:50.137613 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:50.137591 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:26:50.143221 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:50.143198 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc"
Apr 21 04:26:50.153563 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:50.153495 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-gl6cc" podStartSLOduration=51.838329687 podStartE2EDuration="54.153473737s" podCreationTimestamp="2026-04-21 04:25:56 +0000 UTC" firstStartedPulling="2026-04-21 04:25:56.891781795 +0000 UTC m=+108.887179411" lastFinishedPulling="2026-04-21 04:25:59.206925858 +0000 UTC m=+111.202323461" observedRunningTime="2026-04-21 04:26:50.152504409 +0000 UTC m=+162.147902036" watchObservedRunningTime="2026-04-21 04:26:50.153473737 +0000 UTC m=+162.148871362"
Apr 21 04:26:50.290266 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:50.290175 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-l2wqz"
Apr 21 04:26:50.444726 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:50.444699 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-l2wqz"]
Apr 21 04:26:50.446830 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:26:50.446796 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f33fd64_8497_4fb4_8b9b_6b34fd8564fa.slice/crio-28972f94885113c9b5af110557f5a85636a7f95b7ba139e60c693ecf47164a6e WatchSource:0}: Error finding container 28972f94885113c9b5af110557f5a85636a7f95b7ba139e60c693ecf47164a6e: Status 404 returned error can't find the container with id 28972f94885113c9b5af110557f5a85636a7f95b7ba139e60c693ecf47164a6e
Apr 21 04:26:51.142239 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:51.142198 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rnhvz" event={"ID":"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76","Type":"ContainerStarted","Data":"c338ba05076fe5393224fb509c56584fbe686b6b79d9123b575beeff351c5539"}
Apr 21 04:26:51.142239 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:51.142235 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rnhvz" event={"ID":"12bdc8fa-32f4-4a0d-85b9-2e5c19724e76","Type":"ContainerStarted","Data":"9004c041826aa3c1b70385a1c354b3bf4b5c97474389e27322392f893e56259f"}
Apr 21 04:26:51.142733 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:51.142332 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rnhvz"
Apr 21 04:26:51.143622 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:51.143590 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-l2wqz" event={"ID":"3f33fd64-8497-4fb4-8b9b-6b34fd8564fa","Type":"ContainerStarted","Data":"28972f94885113c9b5af110557f5a85636a7f95b7ba139e60c693ecf47164a6e"}
Apr 21 04:26:51.160178 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:51.160118 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rnhvz" podStartSLOduration=129.82999249 podStartE2EDuration="2m11.160102914s" podCreationTimestamp="2026-04-21 04:24:40 +0000 UTC" firstStartedPulling="2026-04-21 04:26:49.043721513 +0000 UTC m=+161.039119116" lastFinishedPulling="2026-04-21 04:26:50.373831933 +0000 UTC m=+162.369229540" observedRunningTime="2026-04-21 04:26:51.15796353 +0000 UTC m=+163.153361156" watchObservedRunningTime="2026-04-21 04:26:51.160102914 +0000 UTC m=+163.155500538"
Apr 21 04:26:52.133353 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:52.133320 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-d576574b4-xrm2c"
Apr 21 04:26:59.508284 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:59.508190 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p27tc"
Apr 21 04:26:59.511211 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:59.511186 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l78gs\""
Apr 21 04:26:59.519253 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:26:59.519229 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p27tc"
Apr 21 04:27:00.089312 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.089279 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b584c4879-5d68q"]
Apr 21 04:27:00.094305 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.094280 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.096890 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.096863 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-nrbhv\""
Apr 21 04:27:00.097030 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.096865 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 21 04:27:00.097030 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.096911 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 21 04:27:00.097765 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.097746 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 21 04:27:00.098079 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.098061 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 21 04:27:00.098197 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.098174 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 21 04:27:00.102049 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.102025 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b584c4879-5d68q"]
Apr 21 04:27:00.219190 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.219145 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-serving-cert\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.219190 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.219198 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-oauth-serving-cert\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.219440 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.219259 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-config\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.219440 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.219378 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-oauth-config\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.219440 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.219403 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-service-ca\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.219596 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.219458 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8qvp\" (UniqueName: \"kubernetes.io/projected/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-kube-api-access-w8qvp\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.320767 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.320723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-oauth-config\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.320972 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.320780 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-service-ca\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.321080 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.320964 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8qvp\" (UniqueName: \"kubernetes.io/projected/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-kube-api-access-w8qvp\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.321080 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.321054 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-serving-cert\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.321186 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.321084 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-oauth-serving-cert\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.321186 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.321109 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-config\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.321573 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.321542 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-service-ca\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.321871 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.321851 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-oauth-serving-cert\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.322191 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.322168 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-config\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q"
Apr 21 04:27:00.323782 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.323759 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\"
(UniqueName: \"kubernetes.io/secret/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-oauth-config\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q" Apr 21 04:27:00.323895 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.323760 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-serving-cert\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q" Apr 21 04:27:00.329297 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.329270 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8qvp\" (UniqueName: \"kubernetes.io/projected/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-kube-api-access-w8qvp\") pod \"console-6b584c4879-5d68q\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " pod="openshift-console/console-6b584c4879-5d68q" Apr 21 04:27:00.406809 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:00.406719 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b584c4879-5d68q" Apr 21 04:27:01.149551 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:01.149518 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rnhvz" Apr 21 04:27:05.884732 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:05.884482 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b584c4879-5d68q"] Apr 21 04:27:05.886729 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:27:05.886698 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1eb0263_8ebc_469b_aef0_bb6a5c86cec8.slice/crio-2c834073a5b5c1d0c80051ad46d68bd2a85a2479d02b40e0aeba8727aaea1944 WatchSource:0}: Error finding container 2c834073a5b5c1d0c80051ad46d68bd2a85a2479d02b40e0aeba8727aaea1944: Status 404 returned error can't find the container with id 2c834073a5b5c1d0c80051ad46d68bd2a85a2479d02b40e0aeba8727aaea1944 Apr 21 04:27:05.904685 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:05.904656 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p27tc"] Apr 21 04:27:05.910108 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:27:05.910080 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83f15b2a_86b0_4400_a5f1_ff037093ddcc.slice/crio-319330d40eff5272d27135bcb74c5c1fad9db7dd5d530e1610143792987b9ada WatchSource:0}: Error finding container 319330d40eff5272d27135bcb74c5c1fad9db7dd5d530e1610143792987b9ada: Status 404 returned error can't find the container with id 319330d40eff5272d27135bcb74c5c1fad9db7dd5d530e1610143792987b9ada Apr 21 04:27:06.194977 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:06.194940 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-l2wqz" 
event={"ID":"3f33fd64-8497-4fb4-8b9b-6b34fd8564fa","Type":"ContainerStarted","Data":"bb0a7088d4d4eb10f7959e3713133b92822077644daabc2fdfe0e96ed61ad949"} Apr 21 04:27:06.195223 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:06.195202 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-l2wqz" Apr 21 04:27:06.196299 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:06.196273 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b584c4879-5d68q" event={"ID":"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8","Type":"ContainerStarted","Data":"2c834073a5b5c1d0c80051ad46d68bd2a85a2479d02b40e0aeba8727aaea1944"} Apr 21 04:27:06.197500 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:06.197472 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p27tc" event={"ID":"83f15b2a-86b0-4400-a5f1-ff037093ddcc","Type":"ContainerStarted","Data":"319330d40eff5272d27135bcb74c5c1fad9db7dd5d530e1610143792987b9ada"} Apr 21 04:27:06.211842 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:06.211811 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-l2wqz" Apr 21 04:27:06.213531 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:06.213474 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-l2wqz" podStartSLOduration=1.818260706 podStartE2EDuration="17.213459613s" podCreationTimestamp="2026-04-21 04:26:49 +0000 UTC" firstStartedPulling="2026-04-21 04:26:50.449501151 +0000 UTC m=+162.444898768" lastFinishedPulling="2026-04-21 04:27:05.844700052 +0000 UTC m=+177.840097675" observedRunningTime="2026-04-21 04:27:06.21104653 +0000 UTC m=+178.206444158" watchObservedRunningTime="2026-04-21 04:27:06.213459613 +0000 UTC m=+178.208857282" Apr 21 04:27:08.876505 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:08.876463 2579 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-console/console-5876849554-xldwj"] Apr 21 04:27:08.913339 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:08.913304 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5876849554-xldwj"] Apr 21 04:27:08.913519 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:08.913497 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:08.921262 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:08.921202 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 04:27:09.010760 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.010725 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0847388-26b8-46fe-a68b-b593a7bbac48-console-serving-cert\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.010760 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.010761 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qgh9\" (UniqueName: \"kubernetes.io/projected/d0847388-26b8-46fe-a68b-b593a7bbac48-kube-api-access-5qgh9\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.010972 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.010794 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-oauth-serving-cert\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 
04:27:09.010972 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.010877 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-console-config\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.010972 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.010924 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-trusted-ca-bundle\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.010972 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.010945 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0847388-26b8-46fe-a68b-b593a7bbac48-console-oauth-config\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.010972 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.010965 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-service-ca\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.111478 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.111435 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-console-config\") pod 
\"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.111681 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.111510 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-trusted-ca-bundle\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.111681 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.111536 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0847388-26b8-46fe-a68b-b593a7bbac48-console-oauth-config\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.111681 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.111558 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-service-ca\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.111681 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.111610 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0847388-26b8-46fe-a68b-b593a7bbac48-console-serving-cert\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.111681 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.111633 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qgh9\" (UniqueName: 
\"kubernetes.io/projected/d0847388-26b8-46fe-a68b-b593a7bbac48-kube-api-access-5qgh9\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.111681 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.111672 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-oauth-serving-cert\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.112500 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.112432 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-oauth-serving-cert\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.114078 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.113327 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-console-config\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.114078 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.113658 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-service-ca\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.114078 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.114038 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-trusted-ca-bundle\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.115065 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.115042 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0847388-26b8-46fe-a68b-b593a7bbac48-console-oauth-config\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.116585 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.116560 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0847388-26b8-46fe-a68b-b593a7bbac48-console-serving-cert\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.129504 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.129331 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qgh9\" (UniqueName: \"kubernetes.io/projected/d0847388-26b8-46fe-a68b-b593a7bbac48-kube-api-access-5qgh9\") pod \"console-5876849554-xldwj\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") " pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:09.227347 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:09.227198 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:10.005532 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:10.005480 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5876849554-xldwj"] Apr 21 04:27:10.010161 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:27:10.010131 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0847388_26b8_46fe_a68b_b593a7bbac48.slice/crio-b75a05e4d4bf6181aea4188e59e21815541b1061a8a28667d4ddc38ca9b48f43 WatchSource:0}: Error finding container b75a05e4d4bf6181aea4188e59e21815541b1061a8a28667d4ddc38ca9b48f43: Status 404 returned error can't find the container with id b75a05e4d4bf6181aea4188e59e21815541b1061a8a28667d4ddc38ca9b48f43 Apr 21 04:27:10.214639 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:10.214588 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5876849554-xldwj" event={"ID":"d0847388-26b8-46fe-a68b-b593a7bbac48","Type":"ContainerStarted","Data":"eb4b5863499130db75f28a4b415e58bfc16dd7ca9581da139221b85bccf9d74c"} Apr 21 04:27:10.214639 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:10.214634 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5876849554-xldwj" event={"ID":"d0847388-26b8-46fe-a68b-b593a7bbac48","Type":"ContainerStarted","Data":"b75a05e4d4bf6181aea4188e59e21815541b1061a8a28667d4ddc38ca9b48f43"} Apr 21 04:27:10.216152 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:10.216128 2579 generic.go:358] "Generic (PLEG): container finished" podID="99b02ee4-b901-4a4f-9b7a-2ddd30492de6" containerID="e90fe192c78fbf9729342c994c38d0605742b699722ea36a5754876421d583da" exitCode=0 Apr 21 04:27:10.216274 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:10.216204 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-f96bf" 
event={"ID":"99b02ee4-b901-4a4f-9b7a-2ddd30492de6","Type":"ContainerDied","Data":"e90fe192c78fbf9729342c994c38d0605742b699722ea36a5754876421d583da"} Apr 21 04:27:10.216581 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:10.216556 2579 scope.go:117] "RemoveContainer" containerID="e90fe192c78fbf9729342c994c38d0605742b699722ea36a5754876421d583da" Apr 21 04:27:10.217892 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:10.217865 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b584c4879-5d68q" event={"ID":"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8","Type":"ContainerStarted","Data":"44a5f71f196a90a4d77554cd0bdf4c134cb28e8def0621dd8c6d4f31c6161be4"} Apr 21 04:27:10.219611 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:10.219593 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p27tc" event={"ID":"83f15b2a-86b0-4400-a5f1-ff037093ddcc","Type":"ContainerStarted","Data":"0f6a18d84eeeb2baf01c401af77733a88fc938fab7d84e3e7ed4c0b65bb197b3"} Apr 21 04:27:10.238722 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:10.238669 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5876849554-xldwj" podStartSLOduration=2.238651136 podStartE2EDuration="2.238651136s" podCreationTimestamp="2026-04-21 04:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:27:10.23593973 +0000 UTC m=+182.231337380" watchObservedRunningTime="2026-04-21 04:27:10.238651136 +0000 UTC m=+182.234048762" Apr 21 04:27:10.250851 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:10.250805 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p27tc" podStartSLOduration=146.302177297 podStartE2EDuration="2m30.250791547s" podCreationTimestamp="2026-04-21 04:24:40 +0000 UTC" firstStartedPulling="2026-04-21 04:27:05.911970853 +0000 
UTC m=+177.907368457" lastFinishedPulling="2026-04-21 04:27:09.860585099 +0000 UTC m=+181.855982707" observedRunningTime="2026-04-21 04:27:10.250385705 +0000 UTC m=+182.245783331" watchObservedRunningTime="2026-04-21 04:27:10.250791547 +0000 UTC m=+182.246189172" Apr 21 04:27:10.282523 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:10.282472 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b584c4879-5d68q" podStartSLOduration=6.304416784 podStartE2EDuration="10.28245322s" podCreationTimestamp="2026-04-21 04:27:00 +0000 UTC" firstStartedPulling="2026-04-21 04:27:05.888903276 +0000 UTC m=+177.884300879" lastFinishedPulling="2026-04-21 04:27:09.866939711 +0000 UTC m=+181.862337315" observedRunningTime="2026-04-21 04:27:10.282115751 +0000 UTC m=+182.277513374" watchObservedRunningTime="2026-04-21 04:27:10.28245322 +0000 UTC m=+182.277850826" Apr 21 04:27:10.407783 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:10.407690 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b584c4879-5d68q" Apr 21 04:27:10.407783 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:10.407750 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b584c4879-5d68q" Apr 21 04:27:10.413667 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:10.413642 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b584c4879-5d68q" Apr 21 04:27:11.227720 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:11.227003 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-f96bf" event={"ID":"99b02ee4-b901-4a4f-9b7a-2ddd30492de6","Type":"ContainerStarted","Data":"746781e1be3bfd1de8489d2b616c4e1361f5f98773cd0a29f7fb8e89cf0d9541"} Apr 21 04:27:11.238624 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:11.238588 2579 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console/console-6b584c4879-5d68q" Apr 21 04:27:11.327797 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:11.327746 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-f8d5bbc98-2ffvt_e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5/router/0.log" Apr 21 04:27:11.333102 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:11.333076 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-p27tc_83f15b2a-86b0-4400-a5f1-ff037093ddcc/serve-healthcheck-canary/0.log" Apr 21 04:27:19.227664 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:19.227619 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:19.228303 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:19.227786 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:19.232370 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:19.232351 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:19.255612 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:19.255585 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5876849554-xldwj" Apr 21 04:27:19.305393 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:19.305361 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b584c4879-5d68q"] Apr 21 04:27:44.325489 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.325430 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b584c4879-5d68q" podUID="e1eb0263-8ebc-469b-aef0-bb6a5c86cec8" containerName="console" containerID="cri-o://44a5f71f196a90a4d77554cd0bdf4c134cb28e8def0621dd8c6d4f31c6161be4" gracePeriod=15 Apr 21 04:27:44.595145 
ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.595124 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b584c4879-5d68q_e1eb0263-8ebc-469b-aef0-bb6a5c86cec8/console/0.log" Apr 21 04:27:44.595257 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.595195 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b584c4879-5d68q" Apr 21 04:27:44.724415 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.724379 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-oauth-config\") pod \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " Apr 21 04:27:44.724578 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.724422 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-service-ca\") pod \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " Apr 21 04:27:44.724578 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.724463 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8qvp\" (UniqueName: \"kubernetes.io/projected/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-kube-api-access-w8qvp\") pod \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " Apr 21 04:27:44.724578 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.724507 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-config\") pod \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " Apr 21 04:27:44.724578 ip-10-0-134-45 kubenswrapper[2579]: I0421 
04:27:44.724534 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-oauth-serving-cert\") pod \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " Apr 21 04:27:44.724578 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.724558 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-serving-cert\") pod \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\" (UID: \"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8\") " Apr 21 04:27:44.724942 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.724910 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-config" (OuterVolumeSpecName: "console-config") pod "e1eb0263-8ebc-469b-aef0-bb6a5c86cec8" (UID: "e1eb0263-8ebc-469b-aef0-bb6a5c86cec8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:27:44.724942 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.724927 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-service-ca" (OuterVolumeSpecName: "service-ca") pod "e1eb0263-8ebc-469b-aef0-bb6a5c86cec8" (UID: "e1eb0263-8ebc-469b-aef0-bb6a5c86cec8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:27:44.725072 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.724930 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e1eb0263-8ebc-469b-aef0-bb6a5c86cec8" (UID: "e1eb0263-8ebc-469b-aef0-bb6a5c86cec8"). 
InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:27:44.726720 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.726685 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e1eb0263-8ebc-469b-aef0-bb6a5c86cec8" (UID: "e1eb0263-8ebc-469b-aef0-bb6a5c86cec8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:27:44.726816 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.726700 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e1eb0263-8ebc-469b-aef0-bb6a5c86cec8" (UID: "e1eb0263-8ebc-469b-aef0-bb6a5c86cec8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:27:44.726816 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.726767 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-kube-api-access-w8qvp" (OuterVolumeSpecName: "kube-api-access-w8qvp") pod "e1eb0263-8ebc-469b-aef0-bb6a5c86cec8" (UID: "e1eb0263-8ebc-469b-aef0-bb6a5c86cec8"). InnerVolumeSpecName "kube-api-access-w8qvp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:27:44.825471 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.825438 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-config\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:44.825471 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.825468 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-oauth-serving-cert\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:44.825471 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.825479 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-serving-cert\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:44.825710 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.825488 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-console-oauth-config\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:44.825710 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.825498 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-service-ca\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:44.825710 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:44.825506 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w8qvp\" (UniqueName: \"kubernetes.io/projected/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8-kube-api-access-w8qvp\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:45.328937 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:27:45.328911 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b584c4879-5d68q_e1eb0263-8ebc-469b-aef0-bb6a5c86cec8/console/0.log" Apr 21 04:27:45.329437 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:45.328953 2579 generic.go:358] "Generic (PLEG): container finished" podID="e1eb0263-8ebc-469b-aef0-bb6a5c86cec8" containerID="44a5f71f196a90a4d77554cd0bdf4c134cb28e8def0621dd8c6d4f31c6161be4" exitCode=2 Apr 21 04:27:45.329437 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:45.329000 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b584c4879-5d68q" event={"ID":"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8","Type":"ContainerDied","Data":"44a5f71f196a90a4d77554cd0bdf4c134cb28e8def0621dd8c6d4f31c6161be4"} Apr 21 04:27:45.329437 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:45.329046 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b584c4879-5d68q" event={"ID":"e1eb0263-8ebc-469b-aef0-bb6a5c86cec8","Type":"ContainerDied","Data":"2c834073a5b5c1d0c80051ad46d68bd2a85a2479d02b40e0aeba8727aaea1944"} Apr 21 04:27:45.329437 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:45.329062 2579 scope.go:117] "RemoveContainer" containerID="44a5f71f196a90a4d77554cd0bdf4c134cb28e8def0621dd8c6d4f31c6161be4" Apr 21 04:27:45.329437 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:45.329079 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b584c4879-5d68q" Apr 21 04:27:45.337590 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:45.337575 2579 scope.go:117] "RemoveContainer" containerID="44a5f71f196a90a4d77554cd0bdf4c134cb28e8def0621dd8c6d4f31c6161be4" Apr 21 04:27:45.337883 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:27:45.337862 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44a5f71f196a90a4d77554cd0bdf4c134cb28e8def0621dd8c6d4f31c6161be4\": container with ID starting with 44a5f71f196a90a4d77554cd0bdf4c134cb28e8def0621dd8c6d4f31c6161be4 not found: ID does not exist" containerID="44a5f71f196a90a4d77554cd0bdf4c134cb28e8def0621dd8c6d4f31c6161be4" Apr 21 04:27:45.337972 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:45.337890 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44a5f71f196a90a4d77554cd0bdf4c134cb28e8def0621dd8c6d4f31c6161be4"} err="failed to get container status \"44a5f71f196a90a4d77554cd0bdf4c134cb28e8def0621dd8c6d4f31c6161be4\": rpc error: code = NotFound desc = could not find container \"44a5f71f196a90a4d77554cd0bdf4c134cb28e8def0621dd8c6d4f31c6161be4\": container with ID starting with 44a5f71f196a90a4d77554cd0bdf4c134cb28e8def0621dd8c6d4f31c6161be4 not found: ID does not exist" Apr 21 04:27:45.349296 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:45.349271 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b584c4879-5d68q"] Apr 21 04:27:45.351905 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:45.351885 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b584c4879-5d68q"] Apr 21 04:27:46.513129 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:46.513093 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1eb0263-8ebc-469b-aef0-bb6a5c86cec8" 
path="/var/lib/kubelet/pods/e1eb0263-8ebc-469b-aef0-bb6a5c86cec8/volumes" Apr 21 04:27:58.513367 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:58.513332 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:27:58.513956 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:58.513743 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="alertmanager" containerID="cri-o://97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937" gracePeriod=120 Apr 21 04:27:58.513956 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:58.513812 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="kube-rbac-proxy-metric" containerID="cri-o://ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7" gracePeriod=120 Apr 21 04:27:58.513956 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:58.513858 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="kube-rbac-proxy-web" containerID="cri-o://0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d" gracePeriod=120 Apr 21 04:27:58.513956 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:58.513840 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="config-reloader" containerID="cri-o://1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9" gracePeriod=120 Apr 21 04:27:58.513956 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:58.513879 2579 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="kube-rbac-proxy" containerID="cri-o://56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a" gracePeriod=120 Apr 21 04:27:58.514281 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:58.513967 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="prom-label-proxy" containerID="cri-o://b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e" gracePeriod=120 Apr 21 04:27:59.379751 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.379714 2579 generic.go:358] "Generic (PLEG): container finished" podID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerID="b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e" exitCode=0 Apr 21 04:27:59.379751 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.379741 2579 generic.go:358] "Generic (PLEG): container finished" podID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerID="56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a" exitCode=0 Apr 21 04:27:59.379751 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.379748 2579 generic.go:358] "Generic (PLEG): container finished" podID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerID="1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9" exitCode=0 Apr 21 04:27:59.379751 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.379754 2579 generic.go:358] "Generic (PLEG): container finished" podID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerID="97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937" exitCode=0 Apr 21 04:27:59.380079 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.379787 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"1703f6b0-c1fc-4050-93a7-2ba10c81efc0","Type":"ContainerDied","Data":"b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e"} Apr 21 04:27:59.380079 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.379825 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1703f6b0-c1fc-4050-93a7-2ba10c81efc0","Type":"ContainerDied","Data":"56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a"} Apr 21 04:27:59.380079 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.379839 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1703f6b0-c1fc-4050-93a7-2ba10c81efc0","Type":"ContainerDied","Data":"1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9"} Apr 21 04:27:59.380079 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.379850 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1703f6b0-c1fc-4050-93a7-2ba10c81efc0","Type":"ContainerDied","Data":"97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937"} Apr 21 04:27:59.747768 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.747748 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:27:59.759558 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.759533 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-metrics-client-ca\") pod \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " Apr 21 04:27:59.759676 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.759570 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-cluster-tls-config\") pod \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " Apr 21 04:27:59.759676 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.759629 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " Apr 21 04:27:59.759676 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.759652 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy\") pod \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " Apr 21 04:27:59.759826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.759679 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-config-out\") pod \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\" (UID: 
\"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " Apr 21 04:27:59.759826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.759709 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-alertmanager-main-db\") pod \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " Apr 21 04:27:59.759826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.759741 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-alertmanager-trusted-ca-bundle\") pod \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " Apr 21 04:27:59.759826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.759770 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-web-config\") pod \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " Apr 21 04:27:59.759826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.759802 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-config-volume\") pod \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " Apr 21 04:27:59.759826 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.759825 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-main-tls\") pod \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " Apr 21 04:27:59.760160 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:27:59.759903 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy-web\") pod \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " Apr 21 04:27:59.760160 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.759927 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbz75\" (UniqueName: \"kubernetes.io/projected/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-kube-api-access-zbz75\") pod \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " Apr 21 04:27:59.760160 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.759946 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "1703f6b0-c1fc-4050-93a7-2ba10c81efc0" (UID: "1703f6b0-c1fc-4050-93a7-2ba10c81efc0"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:27:59.760160 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.759958 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-tls-assets\") pod \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\" (UID: \"1703f6b0-c1fc-4050-93a7-2ba10c81efc0\") " Apr 21 04:27:59.760710 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.760417 2579 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-metrics-client-ca\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:59.760710 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.760472 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "1703f6b0-c1fc-4050-93a7-2ba10c81efc0" (UID: "1703f6b0-c1fc-4050-93a7-2ba10c81efc0"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:27:59.760710 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.760653 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "1703f6b0-c1fc-4050-93a7-2ba10c81efc0" (UID: "1703f6b0-c1fc-4050-93a7-2ba10c81efc0"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:27:59.763153 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.763119 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-config-volume" (OuterVolumeSpecName: "config-volume") pod "1703f6b0-c1fc-4050-93a7-2ba10c81efc0" (UID: "1703f6b0-c1fc-4050-93a7-2ba10c81efc0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:27:59.763270 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.763199 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-config-out" (OuterVolumeSpecName: "config-out") pod "1703f6b0-c1fc-4050-93a7-2ba10c81efc0" (UID: "1703f6b0-c1fc-4050-93a7-2ba10c81efc0"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:27:59.763270 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.763239 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "1703f6b0-c1fc-4050-93a7-2ba10c81efc0" (UID: "1703f6b0-c1fc-4050-93a7-2ba10c81efc0"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:27:59.763377 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.763271 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1703f6b0-c1fc-4050-93a7-2ba10c81efc0" (UID: "1703f6b0-c1fc-4050-93a7-2ba10c81efc0"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:27:59.763699 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.763669 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "1703f6b0-c1fc-4050-93a7-2ba10c81efc0" (UID: "1703f6b0-c1fc-4050-93a7-2ba10c81efc0"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:27:59.763781 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.763756 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "1703f6b0-c1fc-4050-93a7-2ba10c81efc0" (UID: "1703f6b0-c1fc-4050-93a7-2ba10c81efc0"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:27:59.764598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.764570 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "1703f6b0-c1fc-4050-93a7-2ba10c81efc0" (UID: "1703f6b0-c1fc-4050-93a7-2ba10c81efc0"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:27:59.764925 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.764905 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-kube-api-access-zbz75" (OuterVolumeSpecName: "kube-api-access-zbz75") pod "1703f6b0-c1fc-4050-93a7-2ba10c81efc0" (UID: "1703f6b0-c1fc-4050-93a7-2ba10c81efc0"). 
InnerVolumeSpecName "kube-api-access-zbz75". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:27:59.766917 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.766888 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "1703f6b0-c1fc-4050-93a7-2ba10c81efc0" (UID: "1703f6b0-c1fc-4050-93a7-2ba10c81efc0"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:27:59.773319 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.773290 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-web-config" (OuterVolumeSpecName: "web-config") pod "1703f6b0-c1fc-4050-93a7-2ba10c81efc0" (UID: "1703f6b0-c1fc-4050-93a7-2ba10c81efc0"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:27:59.861186 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.861153 2579 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-cluster-tls-config\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:59.861186 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.861180 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:59.861186 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.861192 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy\") on node 
\"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:59.861416 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.861201 2579 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-config-out\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:59.861416 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.861212 2579 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-alertmanager-main-db\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:59.861416 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.861221 2579 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:59.861416 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.861230 2579 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-web-config\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:59.861416 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.861239 2579 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-config-volume\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:59.861416 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.861247 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-main-tls\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:59.861416 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:27:59.861256 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:59.861416 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.861266 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zbz75\" (UniqueName: \"kubernetes.io/projected/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-kube-api-access-zbz75\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:27:59.861416 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:27:59.861275 2579 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1703f6b0-c1fc-4050-93a7-2ba10c81efc0-tls-assets\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:28:00.385711 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.385677 2579 generic.go:358] "Generic (PLEG): container finished" podID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerID="ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7" exitCode=0 Apr 21 04:28:00.385711 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.385703 2579 generic.go:358] "Generic (PLEG): container finished" podID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerID="0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d" exitCode=0 Apr 21 04:28:00.385976 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.385796 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1703f6b0-c1fc-4050-93a7-2ba10c81efc0","Type":"ContainerDied","Data":"ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7"} Apr 21 04:28:00.385976 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.385802 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.385976 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.385830 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1703f6b0-c1fc-4050-93a7-2ba10c81efc0","Type":"ContainerDied","Data":"0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d"} Apr 21 04:28:00.385976 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.385843 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1703f6b0-c1fc-4050-93a7-2ba10c81efc0","Type":"ContainerDied","Data":"493ba14c9d2d29ae34831b2e12b7a58201f4252d6769392f20e7c094750c0ebd"} Apr 21 04:28:00.385976 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.385858 2579 scope.go:117] "RemoveContainer" containerID="b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e" Apr 21 04:28:00.393472 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.393453 2579 scope.go:117] "RemoveContainer" containerID="ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7" Apr 21 04:28:00.400707 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.400686 2579 scope.go:117] "RemoveContainer" containerID="56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a" Apr 21 04:28:00.407398 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.407375 2579 scope.go:117] "RemoveContainer" containerID="0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d" Apr 21 04:28:00.409396 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.409373 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:28:00.413497 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.413472 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:28:00.414803 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.414780 2579 scope.go:117] 
"RemoveContainer" containerID="1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9" Apr 21 04:28:00.421313 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.421297 2579 scope.go:117] "RemoveContainer" containerID="97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937" Apr 21 04:28:00.427760 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.427746 2579 scope.go:117] "RemoveContainer" containerID="22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8" Apr 21 04:28:00.434230 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.434214 2579 scope.go:117] "RemoveContainer" containerID="b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e" Apr 21 04:28:00.434457 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:28:00.434439 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e\": container with ID starting with b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e not found: ID does not exist" containerID="b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e" Apr 21 04:28:00.434507 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.434472 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e"} err="failed to get container status \"b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e\": rpc error: code = NotFound desc = could not find container \"b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e\": container with ID starting with b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e not found: ID does not exist" Apr 21 04:28:00.434507 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.434490 2579 scope.go:117] "RemoveContainer" containerID="ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7" Apr 21 
04:28:00.434743 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:28:00.434719 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7\": container with ID starting with ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7 not found: ID does not exist" containerID="ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7" Apr 21 04:28:00.434784 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.434755 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7"} err="failed to get container status \"ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7\": rpc error: code = NotFound desc = could not find container \"ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7\": container with ID starting with ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7 not found: ID does not exist" Apr 21 04:28:00.434784 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.434780 2579 scope.go:117] "RemoveContainer" containerID="56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a" Apr 21 04:28:00.435124 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:28:00.435092 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a\": container with ID starting with 56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a not found: ID does not exist" containerID="56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a" Apr 21 04:28:00.435197 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.435122 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a"} err="failed to get container status \"56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a\": rpc error: code = NotFound desc = could not find container \"56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a\": container with ID starting with 56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a not found: ID does not exist" Apr 21 04:28:00.435197 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.435142 2579 scope.go:117] "RemoveContainer" containerID="0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d" Apr 21 04:28:00.435516 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:28:00.435492 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d\": container with ID starting with 0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d not found: ID does not exist" containerID="0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d" Apr 21 04:28:00.435590 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.435521 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d"} err="failed to get container status \"0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d\": rpc error: code = NotFound desc = could not find container \"0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d\": container with ID starting with 0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d not found: ID does not exist" Apr 21 04:28:00.435590 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.435541 2579 scope.go:117] "RemoveContainer" containerID="1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9" Apr 21 04:28:00.435831 ip-10-0-134-45 
kubenswrapper[2579]: E0421 04:28:00.435811 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9\": container with ID starting with 1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9 not found: ID does not exist" containerID="1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9" Apr 21 04:28:00.435913 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.435840 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9"} err="failed to get container status \"1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9\": rpc error: code = NotFound desc = could not find container \"1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9\": container with ID starting with 1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9 not found: ID does not exist" Apr 21 04:28:00.435913 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.435863 2579 scope.go:117] "RemoveContainer" containerID="97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937" Apr 21 04:28:00.436371 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:28:00.436347 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937\": container with ID starting with 97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937 not found: ID does not exist" containerID="97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937" Apr 21 04:28:00.436471 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.436378 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937"} err="failed to 
get container status \"97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937\": rpc error: code = NotFound desc = could not find container \"97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937\": container with ID starting with 97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937 not found: ID does not exist" Apr 21 04:28:00.436471 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.436397 2579 scope.go:117] "RemoveContainer" containerID="22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8" Apr 21 04:28:00.436740 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:28:00.436721 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8\": container with ID starting with 22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8 not found: ID does not exist" containerID="22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8" Apr 21 04:28:00.436816 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.436749 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8"} err="failed to get container status \"22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8\": rpc error: code = NotFound desc = could not find container \"22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8\": container with ID starting with 22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8 not found: ID does not exist" Apr 21 04:28:00.436816 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.436767 2579 scope.go:117] "RemoveContainer" containerID="b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e" Apr 21 04:28:00.437037 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437017 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e"} err="failed to get container status \"b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e\": rpc error: code = NotFound desc = could not find container \"b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e\": container with ID starting with b97811537cf38c57ac99440a2c669e3ff9d7d08a4b952d8e3a24f9916e6f597e not found: ID does not exist" Apr 21 04:28:00.437122 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437039 2579 scope.go:117] "RemoveContainer" containerID="ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7" Apr 21 04:28:00.437198 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437178 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:28:00.437284 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437263 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7"} err="failed to get container status \"ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7\": rpc error: code = NotFound desc = could not find container \"ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7\": container with ID starting with ace46e33d78e803017dbd176f51a869a4cbc9e36ebad3227a9a77e8e2054bbd7 not found: ID does not exist" Apr 21 04:28:00.437334 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437287 2579 scope.go:117] "RemoveContainer" containerID="56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a" Apr 21 04:28:00.437487 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437474 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1eb0263-8ebc-469b-aef0-bb6a5c86cec8" containerName="console" Apr 21 04:28:00.437543 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437489 2579 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e1eb0263-8ebc-469b-aef0-bb6a5c86cec8" containerName="console" Apr 21 04:28:00.437543 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437500 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="alertmanager" Apr 21 04:28:00.437543 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437507 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="alertmanager" Apr 21 04:28:00.437543 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437516 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="prom-label-proxy" Apr 21 04:28:00.437543 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437522 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="prom-label-proxy" Apr 21 04:28:00.437543 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437518 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a"} err="failed to get container status \"56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a\": rpc error: code = NotFound desc = could not find container \"56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a\": container with ID starting with 56faf0f99f41ce74fd6e5f83634ab548ed41bfe3826504d1b002b32081b4145a not found: ID does not exist" Apr 21 04:28:00.437543 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437529 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="kube-rbac-proxy" Apr 21 04:28:00.437543 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437535 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" 
containerName="kube-rbac-proxy" Apr 21 04:28:00.437543 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437534 2579 scope.go:117] "RemoveContainer" containerID="0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437546 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="init-config-reloader" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437598 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="init-config-reloader" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437609 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="config-reloader" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437614 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="config-reloader" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437637 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="kube-rbac-proxy-web" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437644 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="kube-rbac-proxy-web" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437656 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="kube-rbac-proxy-metric" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437662 2579 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="kube-rbac-proxy-metric" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437753 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="kube-rbac-proxy-web" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437765 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1eb0263-8ebc-469b-aef0-bb6a5c86cec8" containerName="console" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437776 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="prom-label-proxy" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437784 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="alertmanager" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437792 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="config-reloader" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437799 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="kube-rbac-proxy-metric" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437795 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d"} err="failed to get container status \"0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d\": rpc error: code = NotFound desc = could not find container \"0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d\": container with ID starting with 0e60b394bfe3b87715c92ad177475df3a6cc62da929d4491769273d444d1898d not 
found: ID does not exist" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437814 2579 scope.go:117] "RemoveContainer" containerID="1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9" Apr 21 04:28:00.437853 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.437805 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" containerName="kube-rbac-proxy" Apr 21 04:28:00.438473 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.438102 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9"} err="failed to get container status \"1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9\": rpc error: code = NotFound desc = could not find container \"1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9\": container with ID starting with 1e6c415f96e8f8f3053f49168f711dc4fddc7016913b88c053de862ee12257a9 not found: ID does not exist" Apr 21 04:28:00.438473 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.438127 2579 scope.go:117] "RemoveContainer" containerID="97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937" Apr 21 04:28:00.438473 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.438363 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937"} err="failed to get container status \"97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937\": rpc error: code = NotFound desc = could not find container \"97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937\": container with ID starting with 97805e1bd951581f15bb1488b02de95c4872f41323da8e6790f98e39450d0937 not found: ID does not exist" Apr 21 04:28:00.438473 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.438387 2579 scope.go:117] "RemoveContainer" 
containerID="22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8" Apr 21 04:28:00.438630 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.438604 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8"} err="failed to get container status \"22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8\": rpc error: code = NotFound desc = could not find container \"22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8\": container with ID starting with 22269110a7c5e55e14ae789a9d443f0cf1b7999e2c77a263c1bfa56f6eab84a8 not found: ID does not exist" Apr 21 04:28:00.443146 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.443126 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.445417 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.445394 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 04:28:00.445510 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.445429 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 04:28:00.445510 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.445431 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 04:28:00.445649 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.445635 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 04:28:00.445805 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.445778 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 
04:28:00.445902 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.445788 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 04:28:00.445902 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.445816 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 04:28:00.446248 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.446227 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-cml9v\"" Apr 21 04:28:00.448160 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.446596 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 04:28:00.454457 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.454425 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 04:28:00.455935 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.455912 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:28:00.466316 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.466291 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/883ea867-3996-4c55-b3de-5f56c61c1997-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.466399 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.466328 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.466399 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.466371 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-config-volume\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.466399 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.466396 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.466501 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.466416 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.466501 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.466442 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gctst\" (UniqueName: \"kubernetes.io/projected/883ea867-3996-4c55-b3de-5f56c61c1997-kube-api-access-gctst\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.466501 
ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.466459 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.466501 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.466481 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.466501 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.466496 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/883ea867-3996-4c55-b3de-5f56c61c1997-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.466715 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.466521 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/883ea867-3996-4c55-b3de-5f56c61c1997-tls-assets\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.466715 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.466591 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/883ea867-3996-4c55-b3de-5f56c61c1997-config-out\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.466715 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.466610 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-web-config\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.466715 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.466628 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/883ea867-3996-4c55-b3de-5f56c61c1997-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.512424 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.512391 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1703f6b0-c1fc-4050-93a7-2ba10c81efc0" path="/var/lib/kubelet/pods/1703f6b0-c1fc-4050-93a7-2ba10c81efc0/volumes" Apr 21 04:28:00.567997 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.567954 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/883ea867-3996-4c55-b3de-5f56c61c1997-config-out\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.568178 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.568024 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-web-config\") pod \"alertmanager-main-0\" (UID: 
\"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.568178 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.568053 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/883ea867-3996-4c55-b3de-5f56c61c1997-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.568178 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.568105 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/883ea867-3996-4c55-b3de-5f56c61c1997-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.568178 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.568132 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.568178 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.568162 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-config-volume\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.568449 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.568408 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.568508 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.568461 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.568508 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.568474 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/883ea867-3996-4c55-b3de-5f56c61c1997-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.568508 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.568505 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gctst\" (UniqueName: \"kubernetes.io/projected/883ea867-3996-4c55-b3de-5f56c61c1997-kube-api-access-gctst\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.568654 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.568544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.568654 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.568579 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.568654 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.568602 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/883ea867-3996-4c55-b3de-5f56c61c1997-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.568654 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.568644 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/883ea867-3996-4c55-b3de-5f56c61c1997-tls-assets\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.569282 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.569251 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/883ea867-3996-4c55-b3de-5f56c61c1997-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.569577 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.569555 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/883ea867-3996-4c55-b3de-5f56c61c1997-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.571017 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:28:00.570968 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/883ea867-3996-4c55-b3de-5f56c61c1997-config-out\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.571155 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.571131 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-web-config\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.571260 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.571131 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.571610 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.571590 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.571858 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.571834 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 
04:28:00.571936 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.571922 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/883ea867-3996-4c55-b3de-5f56c61c1997-tls-assets\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.572064 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.572044 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-config-volume\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.572660 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.572645 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.572927 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.572912 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/883ea867-3996-4c55-b3de-5f56c61c1997-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.580621 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.580600 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gctst\" (UniqueName: \"kubernetes.io/projected/883ea867-3996-4c55-b3de-5f56c61c1997-kube-api-access-gctst\") pod \"alertmanager-main-0\" (UID: \"883ea867-3996-4c55-b3de-5f56c61c1997\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.759109 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.759012 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:28:00.890154 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:00.890128 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:28:00.892256 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:28:00.892217 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod883ea867_3996_4c55_b3de_5f56c61c1997.slice/crio-3c2c36e363309c7ce02abfe9916d0ab38349a769085db7c67c3424cc8363fdf7 WatchSource:0}: Error finding container 3c2c36e363309c7ce02abfe9916d0ab38349a769085db7c67c3424cc8363fdf7: Status 404 returned error can't find the container with id 3c2c36e363309c7ce02abfe9916d0ab38349a769085db7c67c3424cc8363fdf7 Apr 21 04:28:01.390556 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:01.390515 2579 generic.go:358] "Generic (PLEG): container finished" podID="883ea867-3996-4c55-b3de-5f56c61c1997" containerID="50cc5db0c6b7849d17c03588d09f3c99a807906cb5542373c217f85588aca80c" exitCode=0 Apr 21 04:28:01.390728 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:01.390602 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"883ea867-3996-4c55-b3de-5f56c61c1997","Type":"ContainerDied","Data":"50cc5db0c6b7849d17c03588d09f3c99a807906cb5542373c217f85588aca80c"} Apr 21 04:28:01.390728 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:01.390637 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"883ea867-3996-4c55-b3de-5f56c61c1997","Type":"ContainerStarted","Data":"3c2c36e363309c7ce02abfe9916d0ab38349a769085db7c67c3424cc8363fdf7"} Apr 21 04:28:02.397701 ip-10-0-134-45 kubenswrapper[2579]: I0421 
04:28:02.397670 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"883ea867-3996-4c55-b3de-5f56c61c1997","Type":"ContainerStarted","Data":"4e4da6f9b88b7c2e6092534d03aed41f44c0bf4d0121893224ed07c05b0666af"} Apr 21 04:28:02.397701 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.397703 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"883ea867-3996-4c55-b3de-5f56c61c1997","Type":"ContainerStarted","Data":"e9a6a52b42aa850c4f2ffee5f260401ad5cf09ac6d5ec0773222a72fb7869966"} Apr 21 04:28:02.398121 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.397714 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"883ea867-3996-4c55-b3de-5f56c61c1997","Type":"ContainerStarted","Data":"f3130865005b055aeacf97bc5478975fb735501288218559f6b79dd07f6b3b4e"} Apr 21 04:28:02.398121 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.397722 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"883ea867-3996-4c55-b3de-5f56c61c1997","Type":"ContainerStarted","Data":"2871cf4662cd0e833c3be73e75c108291bf22dc7102a34f24ec1cb13d23d5144"} Apr 21 04:28:02.398121 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.397730 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"883ea867-3996-4c55-b3de-5f56c61c1997","Type":"ContainerStarted","Data":"aeb7afeecd21110d7abff363d060b8e98ac4c5eebfda2f5dd219695d06a6135d"} Apr 21 04:28:02.398121 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.397738 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"883ea867-3996-4c55-b3de-5f56c61c1997","Type":"ContainerStarted","Data":"ad16ddf3e50256d0332bd2d8a978d2d7e3a13b38167dc000dba04c2cf1e330e6"} Apr 21 04:28:02.422521 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:28:02.422467 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.422450642 podStartE2EDuration="2.422450642s" podCreationTimestamp="2026-04-21 04:28:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:28:02.420390272 +0000 UTC m=+234.415787896" watchObservedRunningTime="2026-04-21 04:28:02.422450642 +0000 UTC m=+234.417848267" Apr 21 04:28:02.545962 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.545923 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg"] Apr 21 04:28:02.549489 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.549470 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.553328 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.553291 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 21 04:28:02.553441 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.553383 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 21 04:28:02.553441 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.553410 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 21 04:28:02.553683 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.553667 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 21 04:28:02.553683 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.553674 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-8r4wx\"" Apr 21 04:28:02.553803 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.553717 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 21 04:28:02.559779 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.559755 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 21 04:28:02.564342 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.564318 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg"] Apr 21 04:28:02.588344 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.588307 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j5fr\" (UniqueName: \"kubernetes.io/projected/9327c526-d54d-4f72-83db-46a507db3823-kube-api-access-4j5fr\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.588344 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.588343 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/9327c526-d54d-4f72-83db-46a507db3823-telemeter-client-tls\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.588546 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.588420 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9327c526-d54d-4f72-83db-46a507db3823-metrics-client-ca\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: 
\"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.588546 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.588446 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9327c526-d54d-4f72-83db-46a507db3823-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.588546 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.588487 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/9327c526-d54d-4f72-83db-46a507db3823-federate-client-tls\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.588698 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.588558 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9327c526-d54d-4f72-83db-46a507db3823-serving-certs-ca-bundle\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.588698 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.588583 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/9327c526-d54d-4f72-83db-46a507db3823-secret-telemeter-client\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 
21 04:28:02.588698 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.588609 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9327c526-d54d-4f72-83db-46a507db3823-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.689700 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.689602 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9327c526-d54d-4f72-83db-46a507db3823-metrics-client-ca\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.689700 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.689661 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9327c526-d54d-4f72-83db-46a507db3823-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.689904 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.689769 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/9327c526-d54d-4f72-83db-46a507db3823-federate-client-tls\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.689904 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.689814 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9327c526-d54d-4f72-83db-46a507db3823-serving-certs-ca-bundle\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.689904 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.689847 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/9327c526-d54d-4f72-83db-46a507db3823-secret-telemeter-client\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.689904 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.689870 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9327c526-d54d-4f72-83db-46a507db3823-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.690142 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.689915 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4j5fr\" (UniqueName: \"kubernetes.io/projected/9327c526-d54d-4f72-83db-46a507db3823-kube-api-access-4j5fr\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.690142 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.689941 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/9327c526-d54d-4f72-83db-46a507db3823-telemeter-client-tls\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " 
pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.690553 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.690522 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9327c526-d54d-4f72-83db-46a507db3823-metrics-client-ca\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.690657 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.690534 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9327c526-d54d-4f72-83db-46a507db3823-serving-certs-ca-bundle\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.691200 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.691176 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9327c526-d54d-4f72-83db-46a507db3823-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.692398 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.692378 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9327c526-d54d-4f72-83db-46a507db3823-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.692522 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.692506 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/9327c526-d54d-4f72-83db-46a507db3823-telemeter-client-tls\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.692617 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.692602 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/9327c526-d54d-4f72-83db-46a507db3823-secret-telemeter-client\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.692732 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.692712 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/9327c526-d54d-4f72-83db-46a507db3823-federate-client-tls\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.697480 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.697458 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j5fr\" (UniqueName: \"kubernetes.io/projected/9327c526-d54d-4f72-83db-46a507db3823-kube-api-access-4j5fr\") pod \"telemeter-client-7cbbf7d57f-m6rcg\" (UID: \"9327c526-d54d-4f72-83db-46a507db3823\") " pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.859943 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.859900 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" Apr 21 04:28:02.983129 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:02.983065 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg"] Apr 21 04:28:02.985749 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:28:02.985714 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9327c526_d54d_4f72_83db_46a507db3823.slice/crio-db5bde1db4ff19599a530b5e9548b38951c17d68d9b62e4e7900e6f94019ace4 WatchSource:0}: Error finding container db5bde1db4ff19599a530b5e9548b38951c17d68d9b62e4e7900e6f94019ace4: Status 404 returned error can't find the container with id db5bde1db4ff19599a530b5e9548b38951c17d68d9b62e4e7900e6f94019ace4 Apr 21 04:28:03.402044 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:03.402005 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" event={"ID":"9327c526-d54d-4f72-83db-46a507db3823","Type":"ContainerStarted","Data":"db5bde1db4ff19599a530b5e9548b38951c17d68d9b62e4e7900e6f94019ace4"} Apr 21 04:28:05.410993 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:05.410954 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" event={"ID":"9327c526-d54d-4f72-83db-46a507db3823","Type":"ContainerStarted","Data":"1f1c7e96e85f48e3ed017d191df3b2018b1c46537f9b4a954bec8fa81fcfca08"} Apr 21 04:28:05.411415 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:05.411014 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" event={"ID":"9327c526-d54d-4f72-83db-46a507db3823","Type":"ContainerStarted","Data":"1532983b47007830b3d3c369e5a37e3ed6cb601a8e6a0fb7318053034d4aa591"} Apr 21 04:28:05.411415 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:05.411028 2579 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" event={"ID":"9327c526-d54d-4f72-83db-46a507db3823","Type":"ContainerStarted","Data":"c72933999b2f360c68468ed4c47130bf3239d026d7a8f3409563bfd9282a5891"} Apr 21 04:28:05.437485 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:05.437435 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7cbbf7d57f-m6rcg" podStartSLOduration=1.555032893 podStartE2EDuration="3.43742131s" podCreationTimestamp="2026-04-21 04:28:02 +0000 UTC" firstStartedPulling="2026-04-21 04:28:02.987650178 +0000 UTC m=+234.983047784" lastFinishedPulling="2026-04-21 04:28:04.87003859 +0000 UTC m=+236.865436201" observedRunningTime="2026-04-21 04:28:05.436085243 +0000 UTC m=+237.431482865" watchObservedRunningTime="2026-04-21 04:28:05.43742131 +0000 UTC m=+237.432818936" Apr 21 04:28:05.977239 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:05.977202 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67ff68cbb8-nzrwv"] Apr 21 04:28:05.981317 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:05.981292 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:05.991721 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:05.991695 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67ff68cbb8-nzrwv"] Apr 21 04:28:06.021305 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.021274 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-oauth-config\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.021463 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.021310 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-config\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.021463 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.021439 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-oauth-serving-cert\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.021558 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.021468 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkgtv\" (UniqueName: \"kubernetes.io/projected/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-kube-api-access-rkgtv\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 
21 04:28:06.021558 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.021487 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-serving-cert\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.021558 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.021507 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-trusted-ca-bundle\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.021661 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.021592 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-service-ca\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.122444 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.122407 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-oauth-serving-cert\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.122444 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.122443 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkgtv\" (UniqueName: \"kubernetes.io/projected/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-kube-api-access-rkgtv\") pod 
\"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.122697 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.122461 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-serving-cert\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.122697 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.122481 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-trusted-ca-bundle\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.122697 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.122530 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-service-ca\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.122697 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.122553 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-oauth-config\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.122697 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.122579 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-config\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.123347 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.123324 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-oauth-serving-cert\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.123462 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.123354 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-service-ca\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.123462 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.123410 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-config\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.123542 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.123530 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-trusted-ca-bundle\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.125436 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.125413 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-oauth-config\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.125554 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.125534 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-serving-cert\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.133742 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.133718 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkgtv\" (UniqueName: \"kubernetes.io/projected/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-kube-api-access-rkgtv\") pod \"console-67ff68cbb8-nzrwv\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:28:06.292490 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.292403 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67ff68cbb8-nzrwv"
Apr 21 04:28:06.414411 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:06.414388 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67ff68cbb8-nzrwv"]
Apr 21 04:28:06.416887 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:28:06.416859 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3b11cd2_474f_4e57_8afd_17f92a90d4f4.slice/crio-ae3447c45280395a72c3293607439d3e64fcd85accb80e9b6ec190c955661185 WatchSource:0}: Error finding container ae3447c45280395a72c3293607439d3e64fcd85accb80e9b6ec190c955661185: Status 404 returned error can't find the container with id ae3447c45280395a72c3293607439d3e64fcd85accb80e9b6ec190c955661185
Apr 21 04:28:07.419364 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:07.419322 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67ff68cbb8-nzrwv" event={"ID":"f3b11cd2-474f-4e57-8afd-17f92a90d4f4","Type":"ContainerStarted","Data":"ec0a53131575a467c8f1875b350ff6fd958821f1bde059ef9ec97b4e27b313b2"}
Apr 21 04:28:07.419364 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:07.419364 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67ff68cbb8-nzrwv" event={"ID":"f3b11cd2-474f-4e57-8afd-17f92a90d4f4","Type":"ContainerStarted","Data":"ae3447c45280395a72c3293607439d3e64fcd85accb80e9b6ec190c955661185"}
Apr 21 04:28:07.436907 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:07.436837 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67ff68cbb8-nzrwv" podStartSLOduration=2.4368198039999998 podStartE2EDuration="2.436819804s" podCreationTimestamp="2026-04-21 04:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:28:07.434534729 +0000 UTC m=+239.429932366" watchObservedRunningTime="2026-04-21 04:28:07.436819804 +0000 UTC m=+239.432217429"
Apr 21 04:28:16.292580 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:16.292530 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67ff68cbb8-nzrwv"
Apr 21 04:28:16.292580 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:16.292579 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67ff68cbb8-nzrwv"
Apr 21 04:28:16.297484 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:16.297461 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67ff68cbb8-nzrwv"
Apr 21 04:28:16.452685 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:16.452656 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67ff68cbb8-nzrwv"
Apr 21 04:28:16.496259 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:16.496223 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5876849554-xldwj"]
Apr 21 04:28:41.521030 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.520946 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5876849554-xldwj" podUID="d0847388-26b8-46fe-a68b-b593a7bbac48" containerName="console" containerID="cri-o://eb4b5863499130db75f28a4b415e58bfc16dd7ca9581da139221b85bccf9d74c" gracePeriod=15
Apr 21 04:28:41.765601 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.765579 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5876849554-xldwj_d0847388-26b8-46fe-a68b-b593a7bbac48/console/0.log"
Apr 21 04:28:41.765702 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.765641 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5876849554-xldwj"
Apr 21 04:28:41.842820 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.842795 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-trusted-ca-bundle\") pod \"d0847388-26b8-46fe-a68b-b593a7bbac48\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") "
Apr 21 04:28:41.843021 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.842839 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-service-ca\") pod \"d0847388-26b8-46fe-a68b-b593a7bbac48\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") "
Apr 21 04:28:41.843021 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.842873 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0847388-26b8-46fe-a68b-b593a7bbac48-console-oauth-config\") pod \"d0847388-26b8-46fe-a68b-b593a7bbac48\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") "
Apr 21 04:28:41.843106 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.843027 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0847388-26b8-46fe-a68b-b593a7bbac48-console-serving-cert\") pod \"d0847388-26b8-46fe-a68b-b593a7bbac48\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") "
Apr 21 04:28:41.843106 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.843071 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qgh9\" (UniqueName: \"kubernetes.io/projected/d0847388-26b8-46fe-a68b-b593a7bbac48-kube-api-access-5qgh9\") pod \"d0847388-26b8-46fe-a68b-b593a7bbac48\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") "
Apr 21 04:28:41.843106 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.843101 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-oauth-serving-cert\") pod \"d0847388-26b8-46fe-a68b-b593a7bbac48\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") "
Apr 21 04:28:41.843247 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.843160 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-console-config\") pod \"d0847388-26b8-46fe-a68b-b593a7bbac48\" (UID: \"d0847388-26b8-46fe-a68b-b593a7bbac48\") "
Apr 21 04:28:41.843402 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.843361 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-service-ca" (OuterVolumeSpecName: "service-ca") pod "d0847388-26b8-46fe-a68b-b593a7bbac48" (UID: "d0847388-26b8-46fe-a68b-b593a7bbac48"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:28:41.843539 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.843495 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d0847388-26b8-46fe-a68b-b593a7bbac48" (UID: "d0847388-26b8-46fe-a68b-b593a7bbac48"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:28:41.843539 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.843513 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-console-config" (OuterVolumeSpecName: "console-config") pod "d0847388-26b8-46fe-a68b-b593a7bbac48" (UID: "d0847388-26b8-46fe-a68b-b593a7bbac48"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:28:41.843658 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.843530 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d0847388-26b8-46fe-a68b-b593a7bbac48" (UID: "d0847388-26b8-46fe-a68b-b593a7bbac48"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:28:41.843763 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.843746 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-oauth-serving-cert\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:28:41.843824 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.843766 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-console-config\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:28:41.843824 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.843776 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-trusted-ca-bundle\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:28:41.843824 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.843786 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0847388-26b8-46fe-a68b-b593a7bbac48-service-ca\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:28:41.845232 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.845198 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0847388-26b8-46fe-a68b-b593a7bbac48-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d0847388-26b8-46fe-a68b-b593a7bbac48" (UID: "d0847388-26b8-46fe-a68b-b593a7bbac48"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:28:41.845349 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.845304 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0847388-26b8-46fe-a68b-b593a7bbac48-kube-api-access-5qgh9" (OuterVolumeSpecName: "kube-api-access-5qgh9") pod "d0847388-26b8-46fe-a68b-b593a7bbac48" (UID: "d0847388-26b8-46fe-a68b-b593a7bbac48"). InnerVolumeSpecName "kube-api-access-5qgh9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:28:41.845349 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.845321 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0847388-26b8-46fe-a68b-b593a7bbac48-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d0847388-26b8-46fe-a68b-b593a7bbac48" (UID: "d0847388-26b8-46fe-a68b-b593a7bbac48"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:28:41.944913 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.944874 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0847388-26b8-46fe-a68b-b593a7bbac48-console-serving-cert\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:28:41.944913 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.944905 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5qgh9\" (UniqueName: \"kubernetes.io/projected/d0847388-26b8-46fe-a68b-b593a7bbac48-kube-api-access-5qgh9\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:28:41.944913 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:41.944918 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0847388-26b8-46fe-a68b-b593a7bbac48-console-oauth-config\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:28:42.528269 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:42.528241 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5876849554-xldwj_d0847388-26b8-46fe-a68b-b593a7bbac48/console/0.log"
Apr 21 04:28:42.528592 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:42.528280 2579 generic.go:358] "Generic (PLEG): container finished" podID="d0847388-26b8-46fe-a68b-b593a7bbac48" containerID="eb4b5863499130db75f28a4b415e58bfc16dd7ca9581da139221b85bccf9d74c" exitCode=2
Apr 21 04:28:42.528592 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:42.528310 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5876849554-xldwj" event={"ID":"d0847388-26b8-46fe-a68b-b593a7bbac48","Type":"ContainerDied","Data":"eb4b5863499130db75f28a4b415e58bfc16dd7ca9581da139221b85bccf9d74c"}
Apr 21 04:28:42.528592 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:42.528331 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5876849554-xldwj" event={"ID":"d0847388-26b8-46fe-a68b-b593a7bbac48","Type":"ContainerDied","Data":"b75a05e4d4bf6181aea4188e59e21815541b1061a8a28667d4ddc38ca9b48f43"}
Apr 21 04:28:42.528592 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:42.528341 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5876849554-xldwj"
Apr 21 04:28:42.528592 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:42.528346 2579 scope.go:117] "RemoveContainer" containerID="eb4b5863499130db75f28a4b415e58bfc16dd7ca9581da139221b85bccf9d74c"
Apr 21 04:28:42.536900 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:42.536881 2579 scope.go:117] "RemoveContainer" containerID="eb4b5863499130db75f28a4b415e58bfc16dd7ca9581da139221b85bccf9d74c"
Apr 21 04:28:42.537163 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:28:42.537144 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb4b5863499130db75f28a4b415e58bfc16dd7ca9581da139221b85bccf9d74c\": container with ID starting with eb4b5863499130db75f28a4b415e58bfc16dd7ca9581da139221b85bccf9d74c not found: ID does not exist" containerID="eb4b5863499130db75f28a4b415e58bfc16dd7ca9581da139221b85bccf9d74c"
Apr 21 04:28:42.537211 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:42.537172 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb4b5863499130db75f28a4b415e58bfc16dd7ca9581da139221b85bccf9d74c"} err="failed to get container status \"eb4b5863499130db75f28a4b415e58bfc16dd7ca9581da139221b85bccf9d74c\": rpc error: code = NotFound desc = could not find container \"eb4b5863499130db75f28a4b415e58bfc16dd7ca9581da139221b85bccf9d74c\": container with ID starting with eb4b5863499130db75f28a4b415e58bfc16dd7ca9581da139221b85bccf9d74c not found: ID does not exist"
Apr 21 04:28:42.546029 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:42.546004 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5876849554-xldwj"]
Apr 21 04:28:42.549682 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:42.549660 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5876849554-xldwj"]
Apr 21 04:28:44.517034 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:28:44.516971 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0847388-26b8-46fe-a68b-b593a7bbac48" path="/var/lib/kubelet/pods/d0847388-26b8-46fe-a68b-b593a7bbac48/volumes"
Apr 21 04:29:08.405963 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:08.405935 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/2.log"
Apr 21 04:29:08.406468 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:08.406065 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/2.log"
Apr 21 04:29:08.409862 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:08.409839 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/ovn-acl-logging/0.log"
Apr 21 04:29:08.410065 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:08.410046 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/ovn-acl-logging/0.log"
Apr 21 04:29:08.415735 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:08.415712 2579 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 04:29:28.560511 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.560480 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-569bf775d7-5kphr"]
Apr 21 04:29:28.562917 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.560785 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0847388-26b8-46fe-a68b-b593a7bbac48" containerName="console"
Apr 21 04:29:28.562917 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.560796 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0847388-26b8-46fe-a68b-b593a7bbac48" containerName="console"
Apr 21 04:29:28.562917 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.560859 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0847388-26b8-46fe-a68b-b593a7bbac48" containerName="console"
Apr 21 04:29:28.563844 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.563824 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.573655 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.573632 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-569bf775d7-5kphr"]
Apr 21 04:29:28.739851 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.739806 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-service-ca\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.740070 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.739864 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-config\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.740070 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.739936 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-trusted-ca-bundle\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.740070 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.739961 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-serving-cert\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.740070 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.739976 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-oauth-serving-cert\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.740070 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.740058 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpgkx\" (UniqueName: \"kubernetes.io/projected/efcb6cee-ac1d-413a-a627-9420a7b72b5b-kube-api-access-qpgkx\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.740253 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.740097 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-oauth-config\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.841442 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.841410 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-config\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.841581 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.841465 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-trusted-ca-bundle\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.841581 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.841484 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-serving-cert\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.841581 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.841507 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-oauth-serving-cert\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.841703 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.841621 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpgkx\" (UniqueName: \"kubernetes.io/projected/efcb6cee-ac1d-413a-a627-9420a7b72b5b-kube-api-access-qpgkx\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.841792 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.841773 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-oauth-config\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.841860 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.841844 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-service-ca\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.842169 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.842141 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-config\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.842363 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.842207 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-oauth-serving-cert\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.842363 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.842324 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-trusted-ca-bundle\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.842446 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.842432 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-service-ca\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.844047 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.844014 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-serving-cert\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.844149 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.844132 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-oauth-config\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.849891 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.849870 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpgkx\" (UniqueName: \"kubernetes.io/projected/efcb6cee-ac1d-413a-a627-9420a7b72b5b-kube-api-access-qpgkx\") pod \"console-569bf775d7-5kphr\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.874134 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.874101 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:28.998344 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:28.998318 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-569bf775d7-5kphr"]
Apr 21 04:29:29.000921 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:29:29.000890 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefcb6cee_ac1d_413a_a627_9420a7b72b5b.slice/crio-389554b53e078364cdd2011b35b55aef5089c2e32a9a69b33f56a37ede1e989c WatchSource:0}: Error finding container 389554b53e078364cdd2011b35b55aef5089c2e32a9a69b33f56a37ede1e989c: Status 404 returned error can't find the container with id 389554b53e078364cdd2011b35b55aef5089c2e32a9a69b33f56a37ede1e989c
Apr 21 04:29:29.002737 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:29.002722 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 04:29:29.665287 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:29.665254 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-569bf775d7-5kphr" event={"ID":"efcb6cee-ac1d-413a-a627-9420a7b72b5b","Type":"ContainerStarted","Data":"529989a1c6fe215da95e258852092da3fd10b8aaad40e309de7900da6175a3ef"}
Apr 21 04:29:29.665287 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:29.665287 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-569bf775d7-5kphr" event={"ID":"efcb6cee-ac1d-413a-a627-9420a7b72b5b","Type":"ContainerStarted","Data":"389554b53e078364cdd2011b35b55aef5089c2e32a9a69b33f56a37ede1e989c"}
Apr 21 04:29:29.683154 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:29.683089 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-569bf775d7-5kphr" podStartSLOduration=1.68307313 podStartE2EDuration="1.68307313s" podCreationTimestamp="2026-04-21 04:29:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:29:29.68146006 +0000 UTC m=+321.676857684" watchObservedRunningTime="2026-04-21 04:29:29.68307313 +0000 UTC m=+321.678470754"
Apr 21 04:29:38.874335 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:38.874289 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:38.874335 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:38.874331 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:38.879100 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:38.879075 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:39.701477 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:39.701447 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:29:39.744360 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:39.744322 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67ff68cbb8-nzrwv"]
Apr 21 04:29:46.934475 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:46.934435 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-brbpw"]
Apr 21 04:29:46.937961 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:46.937938 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brbpw"
Apr 21 04:29:46.940207 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:46.940182 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 04:29:46.944271 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:46.944240 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-brbpw"]
Apr 21 04:29:46.991178 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:46.991141 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/80c0f40e-55cf-4d32-b714-e175b15308f4-dbus\") pod \"global-pull-secret-syncer-brbpw\" (UID: \"80c0f40e-55cf-4d32-b714-e175b15308f4\") " pod="kube-system/global-pull-secret-syncer-brbpw"
Apr 21 04:29:46.991353 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:46.991205 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/80c0f40e-55cf-4d32-b714-e175b15308f4-original-pull-secret\") pod \"global-pull-secret-syncer-brbpw\" (UID: \"80c0f40e-55cf-4d32-b714-e175b15308f4\") " pod="kube-system/global-pull-secret-syncer-brbpw"
Apr 21 04:29:46.991353 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:46.991241 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/80c0f40e-55cf-4d32-b714-e175b15308f4-kubelet-config\") pod \"global-pull-secret-syncer-brbpw\" (UID: \"80c0f40e-55cf-4d32-b714-e175b15308f4\") " pod="kube-system/global-pull-secret-syncer-brbpw"
Apr 21 04:29:47.092490 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:47.092451 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/80c0f40e-55cf-4d32-b714-e175b15308f4-original-pull-secret\") pod \"global-pull-secret-syncer-brbpw\" (UID: \"80c0f40e-55cf-4d32-b714-e175b15308f4\") " pod="kube-system/global-pull-secret-syncer-brbpw"
Apr 21 04:29:47.092638 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:47.092514 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/80c0f40e-55cf-4d32-b714-e175b15308f4-kubelet-config\") pod \"global-pull-secret-syncer-brbpw\" (UID: \"80c0f40e-55cf-4d32-b714-e175b15308f4\") " pod="kube-system/global-pull-secret-syncer-brbpw"
Apr 21 04:29:47.092638 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:47.092594 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/80c0f40e-55cf-4d32-b714-e175b15308f4-dbus\") pod \"global-pull-secret-syncer-brbpw\" (UID: \"80c0f40e-55cf-4d32-b714-e175b15308f4\") " pod="kube-system/global-pull-secret-syncer-brbpw"
Apr 21 04:29:47.092717 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:47.092665 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/80c0f40e-55cf-4d32-b714-e175b15308f4-kubelet-config\") pod \"global-pull-secret-syncer-brbpw\" (UID: \"80c0f40e-55cf-4d32-b714-e175b15308f4\") " pod="kube-system/global-pull-secret-syncer-brbpw"
Apr 21 04:29:47.092783 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:47.092766 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/80c0f40e-55cf-4d32-b714-e175b15308f4-dbus\") pod \"global-pull-secret-syncer-brbpw\" (UID: \"80c0f40e-55cf-4d32-b714-e175b15308f4\") " pod="kube-system/global-pull-secret-syncer-brbpw"
Apr 21 04:29:47.094755 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:47.094739 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/80c0f40e-55cf-4d32-b714-e175b15308f4-original-pull-secret\") pod \"global-pull-secret-syncer-brbpw\" (UID: \"80c0f40e-55cf-4d32-b714-e175b15308f4\") " pod="kube-system/global-pull-secret-syncer-brbpw"
Apr 21 04:29:47.248437 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:47.248349 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brbpw"
Apr 21 04:29:47.369631 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:47.369431 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-brbpw"]
Apr 21 04:29:47.371938 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:29:47.371897 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80c0f40e_55cf_4d32_b714_e175b15308f4.slice/crio-c15da36bd1e6ccfc81e34e3352c22b8b0203a530981e0c414092ff00109afe6f WatchSource:0}: Error finding container c15da36bd1e6ccfc81e34e3352c22b8b0203a530981e0c414092ff00109afe6f: Status 404 returned error can't find the container with id c15da36bd1e6ccfc81e34e3352c22b8b0203a530981e0c414092ff00109afe6f
Apr 21 04:29:47.723294 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:47.723256 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-brbpw" event={"ID":"80c0f40e-55cf-4d32-b714-e175b15308f4","Type":"ContainerStarted","Data":"c15da36bd1e6ccfc81e34e3352c22b8b0203a530981e0c414092ff00109afe6f"}
Apr 21 04:29:52.739815 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:52.739775 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-brbpw" event={"ID":"80c0f40e-55cf-4d32-b714-e175b15308f4","Type":"ContainerStarted","Data":"b0a7d654eda1986b25041d4c4401bef0029ea1ed71b44d54544f4c0713fe8572"}
Apr 21 04:29:52.754564 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:29:52.754516 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-brbpw" podStartSLOduration=2.150400961 podStartE2EDuration="6.754499253s" podCreationTimestamp="2026-04-21 04:29:46 +0000 UTC" firstStartedPulling="2026-04-21 04:29:47.373590667 +0000 UTC m=+339.368988270" lastFinishedPulling="2026-04-21 04:29:51.977688958 +0000 UTC m=+343.973086562" observedRunningTime="2026-04-21 04:29:52.753374945 +0000 UTC m=+344.748772571" watchObservedRunningTime="2026-04-21 04:29:52.754499253 +0000 UTC m=+344.749896989"
Apr 21 04:30:04.766360 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:04.766271 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-67ff68cbb8-nzrwv" podUID="f3b11cd2-474f-4e57-8afd-17f92a90d4f4" containerName="console" containerID="cri-o://ec0a53131575a467c8f1875b350ff6fd958821f1bde059ef9ec97b4e27b313b2" gracePeriod=15
Apr 21 04:30:05.002193 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.002169 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67ff68cbb8-nzrwv_f3b11cd2-474f-4e57-8afd-17f92a90d4f4/console/0.log"
Apr 21 04:30:05.002331 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.002233 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67ff68cbb8-nzrwv"
Apr 21 04:30:05.158646 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.158617 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkgtv\" (UniqueName: \"kubernetes.io/projected/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-kube-api-access-rkgtv\") pod \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") "
Apr 21 04:30:05.158813 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.158673 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-oauth-config\") pod \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") "
Apr 21 04:30:05.158813 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.158697 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-service-ca\") pod \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") "
Apr 21 04:30:05.158813 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.158750 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-oauth-serving-cert\") pod \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") "
Apr 21 04:30:05.158813 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.158806 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-serving-cert\") pod \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") "
Apr 21 04:30:05.158976
ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.158830 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-trusted-ca-bundle\") pod \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " Apr 21 04:30:05.158976 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.158893 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-config\") pod \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\" (UID: \"f3b11cd2-474f-4e57-8afd-17f92a90d4f4\") " Apr 21 04:30:05.159163 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.159129 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-service-ca" (OuterVolumeSpecName: "service-ca") pod "f3b11cd2-474f-4e57-8afd-17f92a90d4f4" (UID: "f3b11cd2-474f-4e57-8afd-17f92a90d4f4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:30:05.159289 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.159218 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f3b11cd2-474f-4e57-8afd-17f92a90d4f4" (UID: "f3b11cd2-474f-4e57-8afd-17f92a90d4f4"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:30:05.159289 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.159249 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-service-ca\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:30:05.159426 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.159397 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f3b11cd2-474f-4e57-8afd-17f92a90d4f4" (UID: "f3b11cd2-474f-4e57-8afd-17f92a90d4f4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:30:05.159490 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.159417 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-config" (OuterVolumeSpecName: "console-config") pod "f3b11cd2-474f-4e57-8afd-17f92a90d4f4" (UID: "f3b11cd2-474f-4e57-8afd-17f92a90d4f4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:30:05.160953 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.160923 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-kube-api-access-rkgtv" (OuterVolumeSpecName: "kube-api-access-rkgtv") pod "f3b11cd2-474f-4e57-8afd-17f92a90d4f4" (UID: "f3b11cd2-474f-4e57-8afd-17f92a90d4f4"). InnerVolumeSpecName "kube-api-access-rkgtv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:30:05.161228 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.161197 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f3b11cd2-474f-4e57-8afd-17f92a90d4f4" (UID: "f3b11cd2-474f-4e57-8afd-17f92a90d4f4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:30:05.161332 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.161312 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f3b11cd2-474f-4e57-8afd-17f92a90d4f4" (UID: "f3b11cd2-474f-4e57-8afd-17f92a90d4f4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:30:05.260612 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.260566 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rkgtv\" (UniqueName: \"kubernetes.io/projected/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-kube-api-access-rkgtv\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:30:05.260612 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.260607 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-oauth-config\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:30:05.260612 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.260620 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-oauth-serving-cert\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:30:05.260860 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:30:05.260633 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-serving-cert\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:30:05.260860 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.260646 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-trusted-ca-bundle\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:30:05.260860 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.260658 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3b11cd2-474f-4e57-8afd-17f92a90d4f4-console-config\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:30:05.781057 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.781025 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67ff68cbb8-nzrwv_f3b11cd2-474f-4e57-8afd-17f92a90d4f4/console/0.log" Apr 21 04:30:05.781512 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.781072 2579 generic.go:358] "Generic (PLEG): container finished" podID="f3b11cd2-474f-4e57-8afd-17f92a90d4f4" containerID="ec0a53131575a467c8f1875b350ff6fd958821f1bde059ef9ec97b4e27b313b2" exitCode=2 Apr 21 04:30:05.781512 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.781143 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67ff68cbb8-nzrwv" Apr 21 04:30:05.781512 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.781145 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67ff68cbb8-nzrwv" event={"ID":"f3b11cd2-474f-4e57-8afd-17f92a90d4f4","Type":"ContainerDied","Data":"ec0a53131575a467c8f1875b350ff6fd958821f1bde059ef9ec97b4e27b313b2"} Apr 21 04:30:05.781512 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.781251 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67ff68cbb8-nzrwv" event={"ID":"f3b11cd2-474f-4e57-8afd-17f92a90d4f4","Type":"ContainerDied","Data":"ae3447c45280395a72c3293607439d3e64fcd85accb80e9b6ec190c955661185"} Apr 21 04:30:05.781512 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.781272 2579 scope.go:117] "RemoveContainer" containerID="ec0a53131575a467c8f1875b350ff6fd958821f1bde059ef9ec97b4e27b313b2" Apr 21 04:30:05.789814 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.789791 2579 scope.go:117] "RemoveContainer" containerID="ec0a53131575a467c8f1875b350ff6fd958821f1bde059ef9ec97b4e27b313b2" Apr 21 04:30:05.790108 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:30:05.790088 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec0a53131575a467c8f1875b350ff6fd958821f1bde059ef9ec97b4e27b313b2\": container with ID starting with ec0a53131575a467c8f1875b350ff6fd958821f1bde059ef9ec97b4e27b313b2 not found: ID does not exist" containerID="ec0a53131575a467c8f1875b350ff6fd958821f1bde059ef9ec97b4e27b313b2" Apr 21 04:30:05.790168 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.790115 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0a53131575a467c8f1875b350ff6fd958821f1bde059ef9ec97b4e27b313b2"} err="failed to get container status \"ec0a53131575a467c8f1875b350ff6fd958821f1bde059ef9ec97b4e27b313b2\": rpc error: code = 
NotFound desc = could not find container \"ec0a53131575a467c8f1875b350ff6fd958821f1bde059ef9ec97b4e27b313b2\": container with ID starting with ec0a53131575a467c8f1875b350ff6fd958821f1bde059ef9ec97b4e27b313b2 not found: ID does not exist" Apr 21 04:30:05.803154 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.803123 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67ff68cbb8-nzrwv"] Apr 21 04:30:05.808630 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:05.808603 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-67ff68cbb8-nzrwv"] Apr 21 04:30:06.512629 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:06.512594 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3b11cd2-474f-4e57-8afd-17f92a90d4f4" path="/var/lib/kubelet/pods/f3b11cd2-474f-4e57-8afd-17f92a90d4f4/volumes" Apr 21 04:30:26.577047 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.577008 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-lgsjn"] Apr 21 04:30:26.577617 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.577482 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3b11cd2-474f-4e57-8afd-17f92a90d4f4" containerName="console" Apr 21 04:30:26.577617 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.577502 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3b11cd2-474f-4e57-8afd-17f92a90d4f4" containerName="console" Apr 21 04:30:26.577617 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.577618 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3b11cd2-474f-4e57-8afd-17f92a90d4f4" containerName="console" Apr 21 04:30:26.580307 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.580285 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-lgsjn" Apr 21 04:30:26.582496 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.582467 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:30:26.582704 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.582687 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-slvrx\"" Apr 21 04:30:26.582760 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.582702 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 21 04:30:26.591293 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.591270 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-lgsjn"] Apr 21 04:30:26.634279 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.634242 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj7td\" (UniqueName: \"kubernetes.io/projected/25af276e-8b50-4330-bb08-67231e70bb4d-kube-api-access-kj7td\") pod \"cert-manager-operator-controller-manager-54b9655956-lgsjn\" (UID: \"25af276e-8b50-4330-bb08-67231e70bb4d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-lgsjn" Apr 21 04:30:26.634448 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.634306 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/25af276e-8b50-4330-bb08-67231e70bb4d-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-lgsjn\" (UID: \"25af276e-8b50-4330-bb08-67231e70bb4d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-lgsjn" Apr 21 
04:30:26.734722 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.734685 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kj7td\" (UniqueName: \"kubernetes.io/projected/25af276e-8b50-4330-bb08-67231e70bb4d-kube-api-access-kj7td\") pod \"cert-manager-operator-controller-manager-54b9655956-lgsjn\" (UID: \"25af276e-8b50-4330-bb08-67231e70bb4d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-lgsjn" Apr 21 04:30:26.734900 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.734822 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/25af276e-8b50-4330-bb08-67231e70bb4d-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-lgsjn\" (UID: \"25af276e-8b50-4330-bb08-67231e70bb4d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-lgsjn" Apr 21 04:30:26.735232 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.735215 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/25af276e-8b50-4330-bb08-67231e70bb4d-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-lgsjn\" (UID: \"25af276e-8b50-4330-bb08-67231e70bb4d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-lgsjn" Apr 21 04:30:26.743259 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.743231 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj7td\" (UniqueName: \"kubernetes.io/projected/25af276e-8b50-4330-bb08-67231e70bb4d-kube-api-access-kj7td\") pod \"cert-manager-operator-controller-manager-54b9655956-lgsjn\" (UID: \"25af276e-8b50-4330-bb08-67231e70bb4d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-lgsjn" Apr 21 04:30:26.890333 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:26.890298 2579 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-lgsjn" Apr 21 04:30:27.013341 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:27.013308 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-lgsjn"] Apr 21 04:30:27.017436 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:30:27.017404 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25af276e_8b50_4330_bb08_67231e70bb4d.slice/crio-438b3b1cb3cbee83b0c5604b9f9c44fa3129bc17bcbc7b97a18181c486c5ecb3 WatchSource:0}: Error finding container 438b3b1cb3cbee83b0c5604b9f9c44fa3129bc17bcbc7b97a18181c486c5ecb3: Status 404 returned error can't find the container with id 438b3b1cb3cbee83b0c5604b9f9c44fa3129bc17bcbc7b97a18181c486c5ecb3 Apr 21 04:30:27.855598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:27.855554 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-lgsjn" event={"ID":"25af276e-8b50-4330-bb08-67231e70bb4d","Type":"ContainerStarted","Data":"438b3b1cb3cbee83b0c5604b9f9c44fa3129bc17bcbc7b97a18181c486c5ecb3"} Apr 21 04:30:29.864443 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:29.864407 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-lgsjn" event={"ID":"25af276e-8b50-4330-bb08-67231e70bb4d","Type":"ContainerStarted","Data":"4c21bc2b7c6ed1ffef55d18512dce4624e5198c918c8ee9a627243f8777a72a6"} Apr 21 04:30:29.884661 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:29.884595 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-lgsjn" podStartSLOduration=1.383547994 podStartE2EDuration="3.884577005s" podCreationTimestamp="2026-04-21 04:30:26 
+0000 UTC" firstStartedPulling="2026-04-21 04:30:27.020070066 +0000 UTC m=+379.015467669" lastFinishedPulling="2026-04-21 04:30:29.52109906 +0000 UTC m=+381.516496680" observedRunningTime="2026-04-21 04:30:29.882335726 +0000 UTC m=+381.877733353" watchObservedRunningTime="2026-04-21 04:30:29.884577005 +0000 UTC m=+381.879974633" Apr 21 04:30:31.945579 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:31.945541 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-bnnk7"] Apr 21 04:30:31.949199 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:31.949178 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-bnnk7" Apr 21 04:30:31.951645 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:31.951619 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 04:30:31.952446 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:31.952430 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-f5nhz\"" Apr 21 04:30:31.952560 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:31.952442 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 04:30:31.955954 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:31.955924 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-bnnk7"] Apr 21 04:30:31.983991 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:31.983949 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0cc0418-0f77-4108-9554-819ce2143df5-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-bnnk7\" (UID: \"a0cc0418-0f77-4108-9554-819ce2143df5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-bnnk7" 
Apr 21 04:30:31.984147 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:31.984029 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79sqp\" (UniqueName: \"kubernetes.io/projected/a0cc0418-0f77-4108-9554-819ce2143df5-kube-api-access-79sqp\") pod \"cert-manager-webhook-587ccfb98-bnnk7\" (UID: \"a0cc0418-0f77-4108-9554-819ce2143df5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-bnnk7" Apr 21 04:30:32.084645 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:32.084598 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79sqp\" (UniqueName: \"kubernetes.io/projected/a0cc0418-0f77-4108-9554-819ce2143df5-kube-api-access-79sqp\") pod \"cert-manager-webhook-587ccfb98-bnnk7\" (UID: \"a0cc0418-0f77-4108-9554-819ce2143df5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-bnnk7" Apr 21 04:30:32.084900 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:32.084688 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0cc0418-0f77-4108-9554-819ce2143df5-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-bnnk7\" (UID: \"a0cc0418-0f77-4108-9554-819ce2143df5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-bnnk7" Apr 21 04:30:32.092226 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:32.092198 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0cc0418-0f77-4108-9554-819ce2143df5-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-bnnk7\" (UID: \"a0cc0418-0f77-4108-9554-819ce2143df5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-bnnk7" Apr 21 04:30:32.092362 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:32.092282 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79sqp\" (UniqueName: 
\"kubernetes.io/projected/a0cc0418-0f77-4108-9554-819ce2143df5-kube-api-access-79sqp\") pod \"cert-manager-webhook-587ccfb98-bnnk7\" (UID: \"a0cc0418-0f77-4108-9554-819ce2143df5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-bnnk7" Apr 21 04:30:32.271895 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:32.271797 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-bnnk7" Apr 21 04:30:32.408303 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:32.408267 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-bnnk7"] Apr 21 04:30:32.411405 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:30:32.411368 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0cc0418_0f77_4108_9554_819ce2143df5.slice/crio-d67a9f1683705a41bae830b59e0a54d05a77770da11d7e0ae0aa35f6bc02d294 WatchSource:0}: Error finding container d67a9f1683705a41bae830b59e0a54d05a77770da11d7e0ae0aa35f6bc02d294: Status 404 returned error can't find the container with id d67a9f1683705a41bae830b59e0a54d05a77770da11d7e0ae0aa35f6bc02d294 Apr 21 04:30:32.875221 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:32.875182 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-bnnk7" event={"ID":"a0cc0418-0f77-4108-9554-819ce2143df5","Type":"ContainerStarted","Data":"d67a9f1683705a41bae830b59e0a54d05a77770da11d7e0ae0aa35f6bc02d294"} Apr 21 04:30:34.160704 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:34.160671 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-8x6vx"] Apr 21 04:30:34.164455 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:34.164430 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-8x6vx" Apr 21 04:30:34.166642 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:34.166607 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-f8nd2\"" Apr 21 04:30:34.175090 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:34.174645 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-8x6vx"] Apr 21 04:30:34.203669 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:34.203636 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62259759-860d-4d37-ad9b-a5962d632b33-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-8x6vx\" (UID: \"62259759-860d-4d37-ad9b-a5962d632b33\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8x6vx" Apr 21 04:30:34.203861 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:34.203700 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ft6q\" (UniqueName: \"kubernetes.io/projected/62259759-860d-4d37-ad9b-a5962d632b33-kube-api-access-8ft6q\") pod \"cert-manager-cainjector-68b757865b-8x6vx\" (UID: \"62259759-860d-4d37-ad9b-a5962d632b33\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8x6vx" Apr 21 04:30:34.304995 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:34.304940 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62259759-860d-4d37-ad9b-a5962d632b33-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-8x6vx\" (UID: \"62259759-860d-4d37-ad9b-a5962d632b33\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8x6vx" Apr 21 04:30:34.305190 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:34.305073 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-8ft6q\" (UniqueName: \"kubernetes.io/projected/62259759-860d-4d37-ad9b-a5962d632b33-kube-api-access-8ft6q\") pod \"cert-manager-cainjector-68b757865b-8x6vx\" (UID: \"62259759-860d-4d37-ad9b-a5962d632b33\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8x6vx" Apr 21 04:30:34.312904 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:34.312877 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62259759-860d-4d37-ad9b-a5962d632b33-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-8x6vx\" (UID: \"62259759-860d-4d37-ad9b-a5962d632b33\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8x6vx" Apr 21 04:30:34.313172 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:34.313148 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ft6q\" (UniqueName: \"kubernetes.io/projected/62259759-860d-4d37-ad9b-a5962d632b33-kube-api-access-8ft6q\") pod \"cert-manager-cainjector-68b757865b-8x6vx\" (UID: \"62259759-860d-4d37-ad9b-a5962d632b33\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8x6vx" Apr 21 04:30:34.477358 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:34.477276 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-8x6vx" Apr 21 04:30:34.980682 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:34.980659 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-8x6vx"] Apr 21 04:30:34.982657 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:30:34.982630 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62259759_860d_4d37_ad9b_a5962d632b33.slice/crio-1c653d638e39f24acc4060f6facbec846708762b7232dabf811c5a921e521f38 WatchSource:0}: Error finding container 1c653d638e39f24acc4060f6facbec846708762b7232dabf811c5a921e521f38: Status 404 returned error can't find the container with id 1c653d638e39f24acc4060f6facbec846708762b7232dabf811c5a921e521f38 Apr 21 04:30:35.886894 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:35.886856 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-bnnk7" event={"ID":"a0cc0418-0f77-4108-9554-819ce2143df5","Type":"ContainerStarted","Data":"a53f8a7cad214f338aa79280d3f6c0f2dfab09e1a17abe0322142aa23ddbf3e6"} Apr 21 04:30:35.887345 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:35.886925 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-bnnk7" Apr 21 04:30:35.888243 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:35.888222 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-8x6vx" event={"ID":"62259759-860d-4d37-ad9b-a5962d632b33","Type":"ContainerStarted","Data":"bdb92963604b86227a631a8bf2a8403b5464ba2f3cd4d2537f10d278633442da"} Apr 21 04:30:35.888339 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:35.888248 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-8x6vx" 
event={"ID":"62259759-860d-4d37-ad9b-a5962d632b33","Type":"ContainerStarted","Data":"1c653d638e39f24acc4060f6facbec846708762b7232dabf811c5a921e521f38"} Apr 21 04:30:35.907745 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:35.907693 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-bnnk7" podStartSLOduration=2.3965778 podStartE2EDuration="4.907681068s" podCreationTimestamp="2026-04-21 04:30:31 +0000 UTC" firstStartedPulling="2026-04-21 04:30:32.413330014 +0000 UTC m=+384.408727617" lastFinishedPulling="2026-04-21 04:30:34.924433282 +0000 UTC m=+386.919830885" observedRunningTime="2026-04-21 04:30:35.907509515 +0000 UTC m=+387.902907140" watchObservedRunningTime="2026-04-21 04:30:35.907681068 +0000 UTC m=+387.903078693" Apr 21 04:30:35.927889 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:35.927845 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-8x6vx" podStartSLOduration=1.927832363 podStartE2EDuration="1.927832363s" podCreationTimestamp="2026-04-21 04:30:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:30:35.926564238 +0000 UTC m=+387.921961876" watchObservedRunningTime="2026-04-21 04:30:35.927832363 +0000 UTC m=+387.923229988" Apr 21 04:30:41.892976 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:30:41.892942 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-bnnk7" Apr 21 04:31:07.822500 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:07.822467 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2"] Apr 21 04:31:07.829273 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:07.829251 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" Apr 21 04:31:07.833216 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:07.833194 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 21 04:31:07.833528 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:07.833508 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-zqtks\"" Apr 21 04:31:07.833915 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:07.833900 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 04:31:07.834022 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:07.833942 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 04:31:07.834097 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:07.834083 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 21 04:31:07.845173 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:07.845147 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2"] Apr 21 04:31:08.003968 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:08.003932 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxg5q\" (UniqueName: \"kubernetes.io/projected/fb165b48-6a27-452d-9a33-c95836b74e28-kube-api-access-kxg5q\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-prmx2\" (UID: \"fb165b48-6a27-452d-9a33-c95836b74e28\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" Apr 21 04:31:08.004190 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:08.004133 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb165b48-6a27-452d-9a33-c95836b74e28-webhook-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-prmx2\" (UID: \"fb165b48-6a27-452d-9a33-c95836b74e28\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" Apr 21 04:31:08.004190 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:08.004184 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb165b48-6a27-452d-9a33-c95836b74e28-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-prmx2\" (UID: \"fb165b48-6a27-452d-9a33-c95836b74e28\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" Apr 21 04:31:08.105308 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:08.105265 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb165b48-6a27-452d-9a33-c95836b74e28-webhook-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-prmx2\" (UID: \"fb165b48-6a27-452d-9a33-c95836b74e28\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" Apr 21 04:31:08.105491 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:08.105318 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb165b48-6a27-452d-9a33-c95836b74e28-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-prmx2\" (UID: \"fb165b48-6a27-452d-9a33-c95836b74e28\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" Apr 21 04:31:08.105491 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:08.105338 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxg5q\" (UniqueName: 
\"kubernetes.io/projected/fb165b48-6a27-452d-9a33-c95836b74e28-kube-api-access-kxg5q\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-prmx2\" (UID: \"fb165b48-6a27-452d-9a33-c95836b74e28\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" Apr 21 04:31:08.107881 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:08.107853 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb165b48-6a27-452d-9a33-c95836b74e28-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-prmx2\" (UID: \"fb165b48-6a27-452d-9a33-c95836b74e28\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" Apr 21 04:31:08.108016 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:08.107889 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb165b48-6a27-452d-9a33-c95836b74e28-webhook-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-prmx2\" (UID: \"fb165b48-6a27-452d-9a33-c95836b74e28\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" Apr 21 04:31:08.116124 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:08.116094 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxg5q\" (UniqueName: \"kubernetes.io/projected/fb165b48-6a27-452d-9a33-c95836b74e28-kube-api-access-kxg5q\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-prmx2\" (UID: \"fb165b48-6a27-452d-9a33-c95836b74e28\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" Apr 21 04:31:08.140174 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:08.140143 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" Apr 21 04:31:08.276707 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:08.276681 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2"] Apr 21 04:31:08.279994 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:31:08.279947 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb165b48_6a27_452d_9a33_c95836b74e28.slice/crio-0bccf6ec54dc380602476f8e5610cb9c37e2933616031fa2cadc85c9443ae371 WatchSource:0}: Error finding container 0bccf6ec54dc380602476f8e5610cb9c37e2933616031fa2cadc85c9443ae371: Status 404 returned error can't find the container with id 0bccf6ec54dc380602476f8e5610cb9c37e2933616031fa2cadc85c9443ae371 Apr 21 04:31:08.999745 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:08.999704 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" event={"ID":"fb165b48-6a27-452d-9a33-c95836b74e28","Type":"ContainerStarted","Data":"0bccf6ec54dc380602476f8e5610cb9c37e2933616031fa2cadc85c9443ae371"} Apr 21 04:31:11.009128 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:11.009094 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" event={"ID":"fb165b48-6a27-452d-9a33-c95836b74e28","Type":"ContainerStarted","Data":"47d4a599336145ed1899accafb2e67e46493a161030846a9c5af62cbfd572600"} Apr 21 04:31:11.009515 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:11.009241 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" Apr 21 04:31:11.035273 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:11.035154 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" podStartSLOduration=1.531217281 podStartE2EDuration="4.035137587s" podCreationTimestamp="2026-04-21 04:31:07 +0000 UTC" firstStartedPulling="2026-04-21 04:31:08.281789207 +0000 UTC m=+420.277186810" lastFinishedPulling="2026-04-21 04:31:10.785709499 +0000 UTC m=+422.781107116" observedRunningTime="2026-04-21 04:31:11.033011295 +0000 UTC m=+423.028408919" watchObservedRunningTime="2026-04-21 04:31:11.035137587 +0000 UTC m=+423.030535211" Apr 21 04:31:22.015829 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:22.015796 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-prmx2" Apr 21 04:31:25.403146 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.403107 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6"] Apr 21 04:31:25.406411 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.406386 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6" Apr 21 04:31:25.408838 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.408803 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 21 04:31:25.408838 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.408804 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 21 04:31:25.409859 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.409841 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-m778c\"" Apr 21 04:31:25.416883 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.416860 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6"] Apr 21 04:31:25.459161 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.459119 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4dd96334-68c0-4483-956b-259dfb8cb842-tmp\") pod \"kube-auth-proxy-5f98864f9-b9mm6\" (UID: \"4dd96334-68c0-4483-956b-259dfb8cb842\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6" Apr 21 04:31:25.459339 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.459173 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8rfr\" (UniqueName: \"kubernetes.io/projected/4dd96334-68c0-4483-956b-259dfb8cb842-kube-api-access-b8rfr\") pod \"kube-auth-proxy-5f98864f9-b9mm6\" (UID: \"4dd96334-68c0-4483-956b-259dfb8cb842\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6" Apr 21 04:31:25.459339 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.459246 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4dd96334-68c0-4483-956b-259dfb8cb842-tls-certs\") pod \"kube-auth-proxy-5f98864f9-b9mm6\" (UID: \"4dd96334-68c0-4483-956b-259dfb8cb842\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6" Apr 21 04:31:25.559891 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.559852 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4dd96334-68c0-4483-956b-259dfb8cb842-tmp\") pod \"kube-auth-proxy-5f98864f9-b9mm6\" (UID: \"4dd96334-68c0-4483-956b-259dfb8cb842\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6" Apr 21 04:31:25.560099 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.559903 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8rfr\" (UniqueName: \"kubernetes.io/projected/4dd96334-68c0-4483-956b-259dfb8cb842-kube-api-access-b8rfr\") pod \"kube-auth-proxy-5f98864f9-b9mm6\" (UID: \"4dd96334-68c0-4483-956b-259dfb8cb842\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6" Apr 21 04:31:25.560099 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.559956 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4dd96334-68c0-4483-956b-259dfb8cb842-tls-certs\") pod \"kube-auth-proxy-5f98864f9-b9mm6\" (UID: \"4dd96334-68c0-4483-956b-259dfb8cb842\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6" Apr 21 04:31:25.562261 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.562231 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4dd96334-68c0-4483-956b-259dfb8cb842-tmp\") pod \"kube-auth-proxy-5f98864f9-b9mm6\" (UID: \"4dd96334-68c0-4483-956b-259dfb8cb842\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6" Apr 21 04:31:25.562454 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.562433 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4dd96334-68c0-4483-956b-259dfb8cb842-tls-certs\") pod \"kube-auth-proxy-5f98864f9-b9mm6\" (UID: \"4dd96334-68c0-4483-956b-259dfb8cb842\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6" Apr 21 04:31:25.567315 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.567293 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8rfr\" (UniqueName: \"kubernetes.io/projected/4dd96334-68c0-4483-956b-259dfb8cb842-kube-api-access-b8rfr\") pod \"kube-auth-proxy-5f98864f9-b9mm6\" (UID: \"4dd96334-68c0-4483-956b-259dfb8cb842\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6" Apr 21 04:31:25.717492 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.717397 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6" Apr 21 04:31:25.843315 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:25.843290 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6"] Apr 21 04:31:25.845915 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:31:25.845875 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd96334_68c0_4483_956b_259dfb8cb842.slice/crio-2dec5fd34c846b8b089bbcea93df7997a3f4a8c6731d9c1dc13e7fe9def68dd9 WatchSource:0}: Error finding container 2dec5fd34c846b8b089bbcea93df7997a3f4a8c6731d9c1dc13e7fe9def68dd9: Status 404 returned error can't find the container with id 2dec5fd34c846b8b089bbcea93df7997a3f4a8c6731d9c1dc13e7fe9def68dd9 Apr 21 04:31:26.061319 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:26.061230 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6" event={"ID":"4dd96334-68c0-4483-956b-259dfb8cb842","Type":"ContainerStarted","Data":"2dec5fd34c846b8b089bbcea93df7997a3f4a8c6731d9c1dc13e7fe9def68dd9"} Apr 21 
04:31:28.811788 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:28.811751 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7m6dg"] Apr 21 04:31:28.816863 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:28.816835 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" Apr 21 04:31:28.820296 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:28.820273 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-vhskd\"" Apr 21 04:31:28.820520 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:28.820306 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 21 04:31:28.825177 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:28.825143 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7m6dg"] Apr 21 04:31:28.899529 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:28.899494 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43abd5d6-298d-42b7-8e95-1fe108abadf7-cert\") pod \"odh-model-controller-858dbf95b8-7m6dg\" (UID: \"43abd5d6-298d-42b7-8e95-1fe108abadf7\") " pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" Apr 21 04:31:28.899710 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:28.899552 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6hcd\" (UniqueName: \"kubernetes.io/projected/43abd5d6-298d-42b7-8e95-1fe108abadf7-kube-api-access-l6hcd\") pod \"odh-model-controller-858dbf95b8-7m6dg\" (UID: \"43abd5d6-298d-42b7-8e95-1fe108abadf7\") " pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" Apr 21 04:31:29.000754 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:29.000710 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43abd5d6-298d-42b7-8e95-1fe108abadf7-cert\") pod \"odh-model-controller-858dbf95b8-7m6dg\" (UID: \"43abd5d6-298d-42b7-8e95-1fe108abadf7\") " pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" Apr 21 04:31:29.000943 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:29.000800 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6hcd\" (UniqueName: \"kubernetes.io/projected/43abd5d6-298d-42b7-8e95-1fe108abadf7-kube-api-access-l6hcd\") pod \"odh-model-controller-858dbf95b8-7m6dg\" (UID: \"43abd5d6-298d-42b7-8e95-1fe108abadf7\") " pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" Apr 21 04:31:29.000943 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:31:29.000885 2579 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 21 04:31:29.001084 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:31:29.000999 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43abd5d6-298d-42b7-8e95-1fe108abadf7-cert podName:43abd5d6-298d-42b7-8e95-1fe108abadf7 nodeName:}" failed. No retries permitted until 2026-04-21 04:31:29.50095717 +0000 UTC m=+441.496354796 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43abd5d6-298d-42b7-8e95-1fe108abadf7-cert") pod "odh-model-controller-858dbf95b8-7m6dg" (UID: "43abd5d6-298d-42b7-8e95-1fe108abadf7") : secret "odh-model-controller-webhook-cert" not found Apr 21 04:31:29.010027 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:29.009966 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6hcd\" (UniqueName: \"kubernetes.io/projected/43abd5d6-298d-42b7-8e95-1fe108abadf7-kube-api-access-l6hcd\") pod \"odh-model-controller-858dbf95b8-7m6dg\" (UID: \"43abd5d6-298d-42b7-8e95-1fe108abadf7\") " pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" Apr 21 04:31:29.505726 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:29.505680 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43abd5d6-298d-42b7-8e95-1fe108abadf7-cert\") pod \"odh-model-controller-858dbf95b8-7m6dg\" (UID: \"43abd5d6-298d-42b7-8e95-1fe108abadf7\") " pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" Apr 21 04:31:29.505908 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:31:29.505832 2579 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 21 04:31:29.505908 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:31:29.505898 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43abd5d6-298d-42b7-8e95-1fe108abadf7-cert podName:43abd5d6-298d-42b7-8e95-1fe108abadf7 nodeName:}" failed. No retries permitted until 2026-04-21 04:31:30.505879666 +0000 UTC m=+442.501277270 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43abd5d6-298d-42b7-8e95-1fe108abadf7-cert") pod "odh-model-controller-858dbf95b8-7m6dg" (UID: "43abd5d6-298d-42b7-8e95-1fe108abadf7") : secret "odh-model-controller-webhook-cert" not found Apr 21 04:31:30.077161 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:30.077119 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6" event={"ID":"4dd96334-68c0-4483-956b-259dfb8cb842","Type":"ContainerStarted","Data":"c7fb3c4c5a1e55099f97d1e640e1732f7158807f701507f2d9e509f7bcd3bc8f"} Apr 21 04:31:30.093802 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:30.093747 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-5f98864f9-b9mm6" podStartSLOduration=1.858708859 podStartE2EDuration="5.093728071s" podCreationTimestamp="2026-04-21 04:31:25 +0000 UTC" firstStartedPulling="2026-04-21 04:31:25.847758527 +0000 UTC m=+437.843156134" lastFinishedPulling="2026-04-21 04:31:29.082777743 +0000 UTC m=+441.078175346" observedRunningTime="2026-04-21 04:31:30.091386774 +0000 UTC m=+442.086784400" watchObservedRunningTime="2026-04-21 04:31:30.093728071 +0000 UTC m=+442.089125698" Apr 21 04:31:30.514564 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:30.514531 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43abd5d6-298d-42b7-8e95-1fe108abadf7-cert\") pod \"odh-model-controller-858dbf95b8-7m6dg\" (UID: \"43abd5d6-298d-42b7-8e95-1fe108abadf7\") " pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" Apr 21 04:31:30.516920 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:30.516899 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43abd5d6-298d-42b7-8e95-1fe108abadf7-cert\") pod \"odh-model-controller-858dbf95b8-7m6dg\" (UID: 
\"43abd5d6-298d-42b7-8e95-1fe108abadf7\") " pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" Apr 21 04:31:30.631931 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:30.631892 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" Apr 21 04:31:30.758198 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:30.758170 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7m6dg"] Apr 21 04:31:30.760136 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:31:30.760109 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43abd5d6_298d_42b7_8e95_1fe108abadf7.slice/crio-503d0058103d6146e61fb0e56a472692ad54791d8fd766545e8f5a2c65e91c56 WatchSource:0}: Error finding container 503d0058103d6146e61fb0e56a472692ad54791d8fd766545e8f5a2c65e91c56: Status 404 returned error can't find the container with id 503d0058103d6146e61fb0e56a472692ad54791d8fd766545e8f5a2c65e91c56 Apr 21 04:31:31.082711 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:31.082674 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" event={"ID":"43abd5d6-298d-42b7-8e95-1fe108abadf7","Type":"ContainerStarted","Data":"503d0058103d6146e61fb0e56a472692ad54791d8fd766545e8f5a2c65e91c56"} Apr 21 04:31:34.095051 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:34.094994 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" event={"ID":"43abd5d6-298d-42b7-8e95-1fe108abadf7","Type":"ContainerStarted","Data":"01b8f06ba6990ab97ba3cbf27c863341df94d775d3fbf7d7adca41cec3a42bdf"} Apr 21 04:31:34.095051 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:34.095053 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" Apr 21 04:31:34.113354 
ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:34.113307 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" podStartSLOduration=2.929686699 podStartE2EDuration="6.113290967s" podCreationTimestamp="2026-04-21 04:31:28 +0000 UTC" firstStartedPulling="2026-04-21 04:31:30.761489268 +0000 UTC m=+442.756886871" lastFinishedPulling="2026-04-21 04:31:33.945093536 +0000 UTC m=+445.940491139" observedRunningTime="2026-04-21 04:31:34.111632924 +0000 UTC m=+446.107030550" watchObservedRunningTime="2026-04-21 04:31:34.113290967 +0000 UTC m=+446.108688592" Apr 21 04:31:34.621231 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:34.621195 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-h8xfm"] Apr 21 04:31:34.624693 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:34.624665 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-h8xfm" Apr 21 04:31:34.627241 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:34.627218 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 21 04:31:34.627363 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:34.627276 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-p6975\"" Apr 21 04:31:34.639096 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:34.639073 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-h8xfm"] Apr 21 04:31:34.758457 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:34.758420 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqhzg\" (UniqueName: \"kubernetes.io/projected/feeb4dd4-c152-4989-9569-964252ccbd76-kube-api-access-vqhzg\") pod 
\"kserve-controller-manager-856948b99f-h8xfm\" (UID: \"feeb4dd4-c152-4989-9569-964252ccbd76\") " pod="opendatahub/kserve-controller-manager-856948b99f-h8xfm" Apr 21 04:31:34.758668 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:34.758492 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/feeb4dd4-c152-4989-9569-964252ccbd76-cert\") pod \"kserve-controller-manager-856948b99f-h8xfm\" (UID: \"feeb4dd4-c152-4989-9569-964252ccbd76\") " pod="opendatahub/kserve-controller-manager-856948b99f-h8xfm" Apr 21 04:31:34.859611 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:34.859569 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqhzg\" (UniqueName: \"kubernetes.io/projected/feeb4dd4-c152-4989-9569-964252ccbd76-kube-api-access-vqhzg\") pod \"kserve-controller-manager-856948b99f-h8xfm\" (UID: \"feeb4dd4-c152-4989-9569-964252ccbd76\") " pod="opendatahub/kserve-controller-manager-856948b99f-h8xfm" Apr 21 04:31:34.859816 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:34.859642 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/feeb4dd4-c152-4989-9569-964252ccbd76-cert\") pod \"kserve-controller-manager-856948b99f-h8xfm\" (UID: \"feeb4dd4-c152-4989-9569-964252ccbd76\") " pod="opendatahub/kserve-controller-manager-856948b99f-h8xfm" Apr 21 04:31:34.859816 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:31:34.859752 2579 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 21 04:31:34.859923 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:31:34.859817 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/feeb4dd4-c152-4989-9569-964252ccbd76-cert podName:feeb4dd4-c152-4989-9569-964252ccbd76 nodeName:}" failed. 
No retries permitted until 2026-04-21 04:31:35.359798831 +0000 UTC m=+447.355196434 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/feeb4dd4-c152-4989-9569-964252ccbd76-cert") pod "kserve-controller-manager-856948b99f-h8xfm" (UID: "feeb4dd4-c152-4989-9569-964252ccbd76") : secret "kserve-webhook-server-cert" not found Apr 21 04:31:34.870403 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:34.870377 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqhzg\" (UniqueName: \"kubernetes.io/projected/feeb4dd4-c152-4989-9569-964252ccbd76-kube-api-access-vqhzg\") pod \"kserve-controller-manager-856948b99f-h8xfm\" (UID: \"feeb4dd4-c152-4989-9569-964252ccbd76\") " pod="opendatahub/kserve-controller-manager-856948b99f-h8xfm" Apr 21 04:31:35.100318 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:35.100283 2579 generic.go:358] "Generic (PLEG): container finished" podID="43abd5d6-298d-42b7-8e95-1fe108abadf7" containerID="01b8f06ba6990ab97ba3cbf27c863341df94d775d3fbf7d7adca41cec3a42bdf" exitCode=1 Apr 21 04:31:35.100787 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:35.100371 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" event={"ID":"43abd5d6-298d-42b7-8e95-1fe108abadf7","Type":"ContainerDied","Data":"01b8f06ba6990ab97ba3cbf27c863341df94d775d3fbf7d7adca41cec3a42bdf"} Apr 21 04:31:35.100787 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:35.100648 2579 scope.go:117] "RemoveContainer" containerID="01b8f06ba6990ab97ba3cbf27c863341df94d775d3fbf7d7adca41cec3a42bdf" Apr 21 04:31:35.364467 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:35.364375 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/feeb4dd4-c152-4989-9569-964252ccbd76-cert\") pod \"kserve-controller-manager-856948b99f-h8xfm\" (UID: \"feeb4dd4-c152-4989-9569-964252ccbd76\") " 
pod="opendatahub/kserve-controller-manager-856948b99f-h8xfm"
Apr 21 04:31:35.366926 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:35.366904 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/feeb4dd4-c152-4989-9569-964252ccbd76-cert\") pod \"kserve-controller-manager-856948b99f-h8xfm\" (UID: \"feeb4dd4-c152-4989-9569-964252ccbd76\") " pod="opendatahub/kserve-controller-manager-856948b99f-h8xfm"
Apr 21 04:31:35.537862 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:35.537827 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-h8xfm"
Apr 21 04:31:35.695318 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:35.695284 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-h8xfm"]
Apr 21 04:31:35.698375 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:31:35.698343 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfeeb4dd4_c152_4989_9569_964252ccbd76.slice/crio-92270f569785901f9dbee2634b96887a3d290d1df0397d011b9fa9db4d7886ff WatchSource:0}: Error finding container 92270f569785901f9dbee2634b96887a3d290d1df0397d011b9fa9db4d7886ff: Status 404 returned error can't find the container with id 92270f569785901f9dbee2634b96887a3d290d1df0397d011b9fa9db4d7886ff
Apr 21 04:31:36.105688 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:36.105651 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg" event={"ID":"43abd5d6-298d-42b7-8e95-1fe108abadf7","Type":"ContainerStarted","Data":"d78a139cf726c768d224c073274a22ea532af67821a1fb5092444d40552c2ae1"}
Apr 21 04:31:36.106197 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:36.105768 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg"
Apr 21 04:31:36.106759 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:36.106738 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-h8xfm" event={"ID":"feeb4dd4-c152-4989-9569-964252ccbd76","Type":"ContainerStarted","Data":"92270f569785901f9dbee2634b96887a3d290d1df0397d011b9fa9db4d7886ff"}
Apr 21 04:31:38.088479 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:38.088443 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-stzs5"]
Apr 21 04:31:38.093578 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:38.093532 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-stzs5"
Apr 21 04:31:38.096396 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:38.096370 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 21 04:31:38.096806 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:38.096407 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-z6mzf\""
Apr 21 04:31:38.096806 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:38.096464 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 21 04:31:38.104930 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:38.104716 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-stzs5"]
Apr 21 04:31:38.193713 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:38.193667 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/43fecfb5-d651-49aa-a1f3-09ef14e96cf8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-stzs5\" (UID: \"43fecfb5-d651-49aa-a1f3-09ef14e96cf8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-stzs5"
Apr 21 04:31:38.193904 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:38.193745 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvqsk\" (UniqueName: \"kubernetes.io/projected/43fecfb5-d651-49aa-a1f3-09ef14e96cf8-kube-api-access-mvqsk\") pod \"servicemesh-operator3-55f49c5f94-stzs5\" (UID: \"43fecfb5-d651-49aa-a1f3-09ef14e96cf8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-stzs5"
Apr 21 04:31:38.295071 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:38.295022 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/43fecfb5-d651-49aa-a1f3-09ef14e96cf8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-stzs5\" (UID: \"43fecfb5-d651-49aa-a1f3-09ef14e96cf8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-stzs5"
Apr 21 04:31:38.295244 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:38.295102 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvqsk\" (UniqueName: \"kubernetes.io/projected/43fecfb5-d651-49aa-a1f3-09ef14e96cf8-kube-api-access-mvqsk\") pod \"servicemesh-operator3-55f49c5f94-stzs5\" (UID: \"43fecfb5-d651-49aa-a1f3-09ef14e96cf8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-stzs5"
Apr 21 04:31:38.298257 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:38.298227 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/43fecfb5-d651-49aa-a1f3-09ef14e96cf8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-stzs5\" (UID: \"43fecfb5-d651-49aa-a1f3-09ef14e96cf8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-stzs5"
Apr 21 04:31:38.304542 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:38.304443 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvqsk\" (UniqueName: \"kubernetes.io/projected/43fecfb5-d651-49aa-a1f3-09ef14e96cf8-kube-api-access-mvqsk\") pod \"servicemesh-operator3-55f49c5f94-stzs5\" (UID: \"43fecfb5-d651-49aa-a1f3-09ef14e96cf8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-stzs5"
Apr 21 04:31:38.407773 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:38.407693 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-stzs5"
Apr 21 04:31:38.551228 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:38.551206 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-stzs5"]
Apr 21 04:31:38.553457 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:31:38.553431 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43fecfb5_d651_49aa_a1f3_09ef14e96cf8.slice/crio-c1f242c44916ef090fb9dc0d06670ce65d18596944698f73a663bd773990afa5 WatchSource:0}: Error finding container c1f242c44916ef090fb9dc0d06670ce65d18596944698f73a663bd773990afa5: Status 404 returned error can't find the container with id c1f242c44916ef090fb9dc0d06670ce65d18596944698f73a663bd773990afa5
Apr 21 04:31:39.120812 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:39.120772 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-stzs5" event={"ID":"43fecfb5-d651-49aa-a1f3-09ef14e96cf8","Type":"ContainerStarted","Data":"c1f242c44916ef090fb9dc0d06670ce65d18596944698f73a663bd773990afa5"}
Apr 21 04:31:39.122415 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:39.122388 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-h8xfm" event={"ID":"feeb4dd4-c152-4989-9569-964252ccbd76","Type":"ContainerStarted","Data":"35b8a16cc3a008ffbe1375206d60c905eb42d1bd368330ade8792f1a47f33730"}
Apr 21 04:31:39.122544 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:39.122508 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-h8xfm"
Apr 21 04:31:39.144644 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:39.144588 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-h8xfm" podStartSLOduration=2.423848143 podStartE2EDuration="5.14457148s" podCreationTimestamp="2026-04-21 04:31:34 +0000 UTC" firstStartedPulling="2026-04-21 04:31:35.699951048 +0000 UTC m=+447.695348652" lastFinishedPulling="2026-04-21 04:31:38.420674366 +0000 UTC m=+450.416071989" observedRunningTime="2026-04-21 04:31:39.143235271 +0000 UTC m=+451.138632896" watchObservedRunningTime="2026-04-21 04:31:39.14457148 +0000 UTC m=+451.139969104"
Apr 21 04:31:42.136384 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:42.136351 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-stzs5" event={"ID":"43fecfb5-d651-49aa-a1f3-09ef14e96cf8","Type":"ContainerStarted","Data":"4fa79165a06bed25773e1d947e365d04a2bb0f1d12ec01866955cc0dfd9a6357"}
Apr 21 04:31:42.136798 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:42.136418 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-stzs5"
Apr 21 04:31:42.159066 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:42.159006 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-stzs5" podStartSLOduration=1.548971484 podStartE2EDuration="4.158968823s" podCreationTimestamp="2026-04-21 04:31:38 +0000 UTC" firstStartedPulling="2026-04-21 04:31:38.555750823 +0000 UTC m=+450.551148427" lastFinishedPulling="2026-04-21 04:31:41.165748162 +0000 UTC m=+453.161145766" observedRunningTime="2026-04-21 04:31:42.157814025 +0000 UTC m=+454.153211651" watchObservedRunningTime="2026-04-21 04:31:42.158968823 +0000 UTC m=+454.154366447"
Apr 21 04:31:47.113712 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:47.113678 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-7m6dg"
Apr 21 04:31:53.141954 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.141922 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-stzs5"
Apr 21 04:31:53.302603 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.302561 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"]
Apr 21 04:31:53.312292 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.312263 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.315125 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.315066 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 21 04:31:53.315303 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.315129 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-jpdgj\""
Apr 21 04:31:53.315303 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.315070 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 21 04:31:53.315303 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.315073 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 21 04:31:53.316105 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.316082 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 21 04:31:53.316404 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.316378 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"]
Apr 21 04:31:53.443779 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.443687 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c95c039e-0245-4c9d-b4b1-2e3b4132f997-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.443936 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.443795 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mczcl\" (UniqueName: \"kubernetes.io/projected/c95c039e-0245-4c9d-b4b1-2e3b4132f997-kube-api-access-mczcl\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.443936 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.443825 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c95c039e-0245-4c9d-b4b1-2e3b4132f997-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.443936 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.443851 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c95c039e-0245-4c9d-b4b1-2e3b4132f997-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.443936 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.443893 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c95c039e-0245-4c9d-b4b1-2e3b4132f997-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.444127 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.443948 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c95c039e-0245-4c9d-b4b1-2e3b4132f997-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.444127 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.444005 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c95c039e-0245-4c9d-b4b1-2e3b4132f997-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.544761 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.544715 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mczcl\" (UniqueName: \"kubernetes.io/projected/c95c039e-0245-4c9d-b4b1-2e3b4132f997-kube-api-access-mczcl\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.544761 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.544766 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c95c039e-0245-4c9d-b4b1-2e3b4132f997-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.545054 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.544793 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c95c039e-0245-4c9d-b4b1-2e3b4132f997-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.545054 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.544837 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c95c039e-0245-4c9d-b4b1-2e3b4132f997-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.545054 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.544876 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c95c039e-0245-4c9d-b4b1-2e3b4132f997-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.545054 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.544921 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c95c039e-0245-4c9d-b4b1-2e3b4132f997-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.545054 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.544966 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c95c039e-0245-4c9d-b4b1-2e3b4132f997-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.545858 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.545827 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c95c039e-0245-4c9d-b4b1-2e3b4132f997-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.547463 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.547435 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c95c039e-0245-4c9d-b4b1-2e3b4132f997-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.547803 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.547778 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c95c039e-0245-4c9d-b4b1-2e3b4132f997-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.547855 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.547795 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c95c039e-0245-4c9d-b4b1-2e3b4132f997-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.547891 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.547876 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c95c039e-0245-4c9d-b4b1-2e3b4132f997-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.558843 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.558815 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c95c039e-0245-4c9d-b4b1-2e3b4132f997-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.561864 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.561830 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mczcl\" (UniqueName: \"kubernetes.io/projected/c95c039e-0245-4c9d-b4b1-2e3b4132f997-kube-api-access-mczcl\") pod \"istiod-openshift-gateway-55ff986f96-v8rjc\" (UID: \"c95c039e-0245-4c9d-b4b1-2e3b4132f997\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.624132 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.624093 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:53.776095 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:53.776038 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"]
Apr 21 04:31:53.780521 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:31:53.780478 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc95c039e_0245_4c9d_b4b1_2e3b4132f997.slice/crio-c1bbf6245d18d1310ab5890c31c52eb32836b830f14c217b5ff3c75a1f2e122e WatchSource:0}: Error finding container c1bbf6245d18d1310ab5890c31c52eb32836b830f14c217b5ff3c75a1f2e122e: Status 404 returned error can't find the container with id c1bbf6245d18d1310ab5890c31c52eb32836b830f14c217b5ff3c75a1f2e122e
Apr 21 04:31:54.184884 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:54.184839 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc" event={"ID":"c95c039e-0245-4c9d-b4b1-2e3b4132f997","Type":"ContainerStarted","Data":"c1bbf6245d18d1310ab5890c31c52eb32836b830f14c217b5ff3c75a1f2e122e"}
Apr 21 04:31:56.172684 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:56.172643 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 21 04:31:56.172963 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:56.172715 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 21 04:31:57.201000 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:57.200937 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc" event={"ID":"c95c039e-0245-4c9d-b4b1-2e3b4132f997","Type":"ContainerStarted","Data":"95dcbbc08f1df171407690c3858d790b2526d4afba236b144c0c1695c729213c"}
Apr 21 04:31:57.201561 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:57.201185 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:31:57.202903 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:57.202874 2579 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-v8rjc container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 21 04:31:57.203061 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:57.202940 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc" podUID="c95c039e-0245-4c9d-b4b1-2e3b4132f997" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:31:57.219930 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:57.219870 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc" podStartSLOduration=1.829696589 podStartE2EDuration="4.219853466s" podCreationTimestamp="2026-04-21 04:31:53 +0000 UTC" firstStartedPulling="2026-04-21 04:31:53.782218235 +0000 UTC m=+465.777615854" lastFinishedPulling="2026-04-21 04:31:56.172375118 +0000 UTC m=+468.167772731" observedRunningTime="2026-04-21 04:31:57.217608586 +0000 UTC m=+469.213006213" watchObservedRunningTime="2026-04-21 04:31:57.219853466 +0000 UTC m=+469.215251093"
Apr 21 04:31:58.205481 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:31:58.205453 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-v8rjc"
Apr 21 04:32:10.132354 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:32:10.132319 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-h8xfm"
Apr 21 04:32:58.596414 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:32:58.596332 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-7g84k"]
Apr 21 04:32:58.603432 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:32:58.603354 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-7g84k"
Apr 21 04:32:58.606743 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:32:58.606708 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-jgt9z\""
Apr 21 04:32:58.606927 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:32:58.606893 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 21 04:32:58.607067 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:32:58.607048 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 21 04:32:58.609444 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:32:58.609418 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-7g84k"]
Apr 21 04:32:58.659798 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:32:58.659756 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bhg4\" (UniqueName: \"kubernetes.io/projected/f19c1b62-0615-405a-932b-b479e46dd822-kube-api-access-5bhg4\") pod \"authorino-operator-657f44b778-7g84k\" (UID: \"f19c1b62-0615-405a-932b-b479e46dd822\") " pod="kuadrant-system/authorino-operator-657f44b778-7g84k"
Apr 21 04:32:58.761237 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:32:58.761197 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bhg4\" (UniqueName: \"kubernetes.io/projected/f19c1b62-0615-405a-932b-b479e46dd822-kube-api-access-5bhg4\") pod \"authorino-operator-657f44b778-7g84k\" (UID: \"f19c1b62-0615-405a-932b-b479e46dd822\") " pod="kuadrant-system/authorino-operator-657f44b778-7g84k"
Apr 21 04:32:58.769592 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:32:58.769558 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bhg4\" (UniqueName: \"kubernetes.io/projected/f19c1b62-0615-405a-932b-b479e46dd822-kube-api-access-5bhg4\") pod \"authorino-operator-657f44b778-7g84k\" (UID: \"f19c1b62-0615-405a-932b-b479e46dd822\") " pod="kuadrant-system/authorino-operator-657f44b778-7g84k"
Apr 21 04:32:58.915793 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:32:58.915705 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-7g84k"
Apr 21 04:32:59.251029 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:32:59.251002 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-7g84k"]
Apr 21 04:32:59.253600 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:32:59.253558 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf19c1b62_0615_405a_932b_b479e46dd822.slice/crio-6caff30ba95dbc0827e186a6eeb121d27186cb92b5acee2bf9d07d9e644c1722 WatchSource:0}: Error finding container 6caff30ba95dbc0827e186a6eeb121d27186cb92b5acee2bf9d07d9e644c1722: Status 404 returned error can't find the container with id 6caff30ba95dbc0827e186a6eeb121d27186cb92b5acee2bf9d07d9e644c1722
Apr 21 04:32:59.426099 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:32:59.426062 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-7g84k" event={"ID":"f19c1b62-0615-405a-932b-b479e46dd822","Type":"ContainerStarted","Data":"6caff30ba95dbc0827e186a6eeb121d27186cb92b5acee2bf9d07d9e644c1722"}
Apr 21 04:33:01.434265 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:01.434230 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-7g84k" event={"ID":"f19c1b62-0615-405a-932b-b479e46dd822","Type":"ContainerStarted","Data":"2d979bfebb6bb6ab6d03ec4eac9845bc191fd2e5f845267d3d2379b545e47854"}
Apr 21 04:33:01.434652 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:01.434283 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-7g84k"
Apr 21 04:33:01.454195 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:01.454144 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-7g84k" podStartSLOduration=1.671350185 podStartE2EDuration="3.454129436s" podCreationTimestamp="2026-04-21 04:32:58 +0000 UTC" firstStartedPulling="2026-04-21 04:32:59.255606387 +0000 UTC m=+531.251004003" lastFinishedPulling="2026-04-21 04:33:01.038385651 +0000 UTC m=+533.033783254" observedRunningTime="2026-04-21 04:33:01.451942615 +0000 UTC m=+533.447340267" watchObservedRunningTime="2026-04-21 04:33:01.454129436 +0000 UTC m=+533.449527107"
Apr 21 04:33:12.442274 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:12.442240 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-7g84k"
Apr 21 04:33:13.274139 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.274099 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-58bff7c76-kfcch"]
Apr 21 04:33:13.278070 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.278044 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.304966 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.304936 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58bff7c76-kfcch"]
Apr 21 04:33:13.407193 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.407155 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2122a54-56fa-482b-9f5b-d3d422d55947-console-oauth-config\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.407193 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.407201 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2122a54-56fa-482b-9f5b-d3d422d55947-service-ca\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.407403 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.407266 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2122a54-56fa-482b-9f5b-d3d422d55947-trusted-ca-bundle\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.407403 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.407363 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2122a54-56fa-482b-9f5b-d3d422d55947-console-serving-cert\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.407403 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.407390 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2122a54-56fa-482b-9f5b-d3d422d55947-console-config\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.407507 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.407417 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2122a54-56fa-482b-9f5b-d3d422d55947-oauth-serving-cert\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.407507 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.407484 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8fvg\" (UniqueName: \"kubernetes.io/projected/e2122a54-56fa-482b-9f5b-d3d422d55947-kube-api-access-r8fvg\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.508362 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.508329 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2122a54-56fa-482b-9f5b-d3d422d55947-console-oauth-config\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.508837 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.508372 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2122a54-56fa-482b-9f5b-d3d422d55947-service-ca\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.508837 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.508406 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2122a54-56fa-482b-9f5b-d3d422d55947-trusted-ca-bundle\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.508837 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.508445 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2122a54-56fa-482b-9f5b-d3d422d55947-console-serving-cert\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.508837 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.508469 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2122a54-56fa-482b-9f5b-d3d422d55947-console-config\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.508837 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.508498 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2122a54-56fa-482b-9f5b-d3d422d55947-oauth-serving-cert\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.509131 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.508615 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8fvg\" (UniqueName: \"kubernetes.io/projected/e2122a54-56fa-482b-9f5b-d3d422d55947-kube-api-access-r8fvg\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.509336 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.509312 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2122a54-56fa-482b-9f5b-d3d422d55947-oauth-serving-cert\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.509427 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.509312 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2122a54-56fa-482b-9f5b-d3d422d55947-service-ca\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.509487 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.509455 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2122a54-56fa-482b-9f5b-d3d422d55947-console-config\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.509554 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.509533 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2122a54-56fa-482b-9f5b-d3d422d55947-trusted-ca-bundle\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.511447 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.511425 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2122a54-56fa-482b-9f5b-d3d422d55947-console-serving-cert\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.511447 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.511442 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2122a54-56fa-482b-9f5b-d3d422d55947-console-oauth-config\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.522237 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.522198 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8fvg\" (UniqueName: \"kubernetes.io/projected/e2122a54-56fa-482b-9f5b-d3d422d55947-kube-api-access-r8fvg\") pod \"console-58bff7c76-kfcch\" (UID: \"e2122a54-56fa-482b-9f5b-d3d422d55947\") " pod="openshift-console/console-58bff7c76-kfcch"
Apr 21 04:33:13.587444 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.587405 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58bff7c76-kfcch" Apr 21 04:33:13.713390 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:13.713368 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58bff7c76-kfcch"] Apr 21 04:33:13.715610 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:33:13.715581 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2122a54_56fa_482b_9f5b_d3d422d55947.slice/crio-c92dcc3e654352cedd8c0f3c5e53542f4416ff0f658ceeb68dee6dd1e2e5401d WatchSource:0}: Error finding container c92dcc3e654352cedd8c0f3c5e53542f4416ff0f658ceeb68dee6dd1e2e5401d: Status 404 returned error can't find the container with id c92dcc3e654352cedd8c0f3c5e53542f4416ff0f658ceeb68dee6dd1e2e5401d Apr 21 04:33:14.487948 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:14.487910 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58bff7c76-kfcch" event={"ID":"e2122a54-56fa-482b-9f5b-d3d422d55947","Type":"ContainerStarted","Data":"36927058e577f44a0115dc1638b83e1e22a75221f720ce6d5d7f1ad66b17c4d7"} Apr 21 04:33:14.487948 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:14.487946 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58bff7c76-kfcch" event={"ID":"e2122a54-56fa-482b-9f5b-d3d422d55947","Type":"ContainerStarted","Data":"c92dcc3e654352cedd8c0f3c5e53542f4416ff0f658ceeb68dee6dd1e2e5401d"} Apr 21 04:33:14.507119 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:14.507072 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58bff7c76-kfcch" podStartSLOduration=1.507056033 podStartE2EDuration="1.507056033s" podCreationTimestamp="2026-04-21 04:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:33:14.505368209 +0000 UTC m=+546.500765845" 
watchObservedRunningTime="2026-04-21 04:33:14.507056033 +0000 UTC m=+546.502453658" Apr 21 04:33:23.587812 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:23.587757 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-58bff7c76-kfcch" Apr 21 04:33:23.587812 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:23.587812 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-58bff7c76-kfcch" Apr 21 04:33:23.592524 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:23.592499 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-58bff7c76-kfcch" Apr 21 04:33:24.529379 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:24.529347 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-58bff7c76-kfcch" Apr 21 04:33:24.590232 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:24.590202 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-569bf775d7-5kphr"] Apr 21 04:33:24.634723 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:24.634692 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v"] Apr 21 04:33:24.638743 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:24.638713 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v" Apr 21 04:33:24.641180 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:24.641158 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-vc2h7\"" Apr 21 04:33:24.649709 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:24.649684 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v"] Apr 21 04:33:24.715577 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:24.715541 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a7628a44-9dc1-4b0f-890b-6a979a5983fe-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-lq78v\" (UID: \"a7628a44-9dc1-4b0f-890b-6a979a5983fe\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v" Apr 21 04:33:24.715577 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:24.715581 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g44st\" (UniqueName: \"kubernetes.io/projected/a7628a44-9dc1-4b0f-890b-6a979a5983fe-kube-api-access-g44st\") pod \"kuadrant-operator-controller-manager-84b657d985-lq78v\" (UID: \"a7628a44-9dc1-4b0f-890b-6a979a5983fe\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v" Apr 21 04:33:24.817248 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:24.817157 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a7628a44-9dc1-4b0f-890b-6a979a5983fe-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-lq78v\" (UID: \"a7628a44-9dc1-4b0f-890b-6a979a5983fe\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v" Apr 21 04:33:24.817248 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:24.817202 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g44st\" (UniqueName: \"kubernetes.io/projected/a7628a44-9dc1-4b0f-890b-6a979a5983fe-kube-api-access-g44st\") pod \"kuadrant-operator-controller-manager-84b657d985-lq78v\" (UID: \"a7628a44-9dc1-4b0f-890b-6a979a5983fe\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v" Apr 21 04:33:24.817563 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:24.817542 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a7628a44-9dc1-4b0f-890b-6a979a5983fe-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-lq78v\" (UID: \"a7628a44-9dc1-4b0f-890b-6a979a5983fe\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v" Apr 21 04:33:24.826433 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:24.826403 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g44st\" (UniqueName: \"kubernetes.io/projected/a7628a44-9dc1-4b0f-890b-6a979a5983fe-kube-api-access-g44st\") pod \"kuadrant-operator-controller-manager-84b657d985-lq78v\" (UID: \"a7628a44-9dc1-4b0f-890b-6a979a5983fe\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v" Apr 21 04:33:24.952011 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:24.951964 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v" Apr 21 04:33:25.085458 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:25.085433 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v"] Apr 21 04:33:25.088298 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:33:25.088266 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7628a44_9dc1_4b0f_890b_6a979a5983fe.slice/crio-9dbfc41384adaa5fb70bb4c77ede09f053ae251d8810ab25f2b6d96df76c8bae WatchSource:0}: Error finding container 9dbfc41384adaa5fb70bb4c77ede09f053ae251d8810ab25f2b6d96df76c8bae: Status 404 returned error can't find the container with id 9dbfc41384adaa5fb70bb4c77ede09f053ae251d8810ab25f2b6d96df76c8bae Apr 21 04:33:25.138750 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:25.138716 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v"] Apr 21 04:33:25.150703 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:25.150673 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v"] Apr 21 04:33:25.164971 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:25.164939 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62"] Apr 21 04:33:25.169840 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:25.169817 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" Apr 21 04:33:25.195171 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:25.195143 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62"] Apr 21 04:33:25.223093 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:25.223062 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7fdac87b-2831-46c3-85ef-d6c26b42f7e2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-rxf62\" (UID: \"7fdac87b-2831-46c3-85ef-d6c26b42f7e2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" Apr 21 04:33:25.223282 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:25.223261 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdndq\" (UniqueName: \"kubernetes.io/projected/7fdac87b-2831-46c3-85ef-d6c26b42f7e2-kube-api-access-wdndq\") pod \"kuadrant-operator-controller-manager-84b657d985-rxf62\" (UID: \"7fdac87b-2831-46c3-85ef-d6c26b42f7e2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" Apr 21 04:33:25.324445 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:25.324402 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdndq\" (UniqueName: \"kubernetes.io/projected/7fdac87b-2831-46c3-85ef-d6c26b42f7e2-kube-api-access-wdndq\") pod \"kuadrant-operator-controller-manager-84b657d985-rxf62\" (UID: \"7fdac87b-2831-46c3-85ef-d6c26b42f7e2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" Apr 21 04:33:25.324445 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:25.324449 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/7fdac87b-2831-46c3-85ef-d6c26b42f7e2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-rxf62\" (UID: \"7fdac87b-2831-46c3-85ef-d6c26b42f7e2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" Apr 21 04:33:25.324859 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:25.324841 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7fdac87b-2831-46c3-85ef-d6c26b42f7e2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-rxf62\" (UID: \"7fdac87b-2831-46c3-85ef-d6c26b42f7e2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" Apr 21 04:33:25.335902 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:25.335881 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdndq\" (UniqueName: \"kubernetes.io/projected/7fdac87b-2831-46c3-85ef-d6c26b42f7e2-kube-api-access-wdndq\") pod \"kuadrant-operator-controller-manager-84b657d985-rxf62\" (UID: \"7fdac87b-2831-46c3-85ef-d6c26b42f7e2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" Apr 21 04:33:25.482222 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:25.482178 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" Apr 21 04:33:25.628017 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:25.625961 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62"] Apr 21 04:33:25.663537 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:33:25.663486 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fdac87b_2831_46c3_85ef_d6c26b42f7e2.slice/crio-99f1426c449d89151a37ea3ee12a6f17e324514b63164a488e1bf450e53f70bb WatchSource:0}: Error finding container 99f1426c449d89151a37ea3ee12a6f17e324514b63164a488e1bf450e53f70bb: Status 404 returned error can't find the container with id 99f1426c449d89151a37ea3ee12a6f17e324514b63164a488e1bf450e53f70bb Apr 21 04:33:26.536162 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:26.536108 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" event={"ID":"7fdac87b-2831-46c3-85ef-d6c26b42f7e2","Type":"ContainerStarted","Data":"99f1426c449d89151a37ea3ee12a6f17e324514b63164a488e1bf450e53f70bb"} Apr 21 04:33:29.549183 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:29.549146 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" event={"ID":"7fdac87b-2831-46c3-85ef-d6c26b42f7e2","Type":"ContainerStarted","Data":"122238f3e369728afd007e970c4af2d4d26dd2b1ce5b9dbf25c41f60f1387e4c"} Apr 21 04:33:29.549655 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:29.549246 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" Apr 21 04:33:29.550732 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:29.550706 2579 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v" podUID="a7628a44-9dc1-4b0f-890b-6a979a5983fe" containerName="manager" containerID="cri-o://0a354f0b74ea3c6b572623bc629c28c5abac7e04c84fd5e90ec08255e27b6f12" gracePeriod=2 Apr 21 04:33:29.575220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:29.573364 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" podStartSLOduration=1.200934998 podStartE2EDuration="4.573344403s" podCreationTimestamp="2026-04-21 04:33:25 +0000 UTC" firstStartedPulling="2026-04-21 04:33:25.666057181 +0000 UTC m=+557.661454785" lastFinishedPulling="2026-04-21 04:33:29.038466577 +0000 UTC m=+561.033864190" observedRunningTime="2026-04-21 04:33:29.571056079 +0000 UTC m=+561.566453707" watchObservedRunningTime="2026-04-21 04:33:29.573344403 +0000 UTC m=+561.568742028" Apr 21 04:33:29.575220 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:29.573425 2579 status_manager.go:895] "Failed to get status for pod" podUID="a7628a44-9dc1-4b0f-890b-6a979a5983fe" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v" err="pods \"kuadrant-operator-controller-manager-84b657d985-lq78v\" is forbidden: User \"system:node:ip-10-0-134-45.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-45.ec2.internal' and this object" Apr 21 04:33:29.797624 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:29.797600 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v" Apr 21 04:33:29.800157 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:29.800083 2579 status_manager.go:895] "Failed to get status for pod" podUID="a7628a44-9dc1-4b0f-890b-6a979a5983fe" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v" err="pods \"kuadrant-operator-controller-manager-84b657d985-lq78v\" is forbidden: User \"system:node:ip-10-0-134-45.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-45.ec2.internal' and this object" Apr 21 04:33:29.869071 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:29.869034 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a7628a44-9dc1-4b0f-890b-6a979a5983fe-extensions-socket-volume\") pod \"a7628a44-9dc1-4b0f-890b-6a979a5983fe\" (UID: \"a7628a44-9dc1-4b0f-890b-6a979a5983fe\") " Apr 21 04:33:29.869206 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:29.869099 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g44st\" (UniqueName: \"kubernetes.io/projected/a7628a44-9dc1-4b0f-890b-6a979a5983fe-kube-api-access-g44st\") pod \"a7628a44-9dc1-4b0f-890b-6a979a5983fe\" (UID: \"a7628a44-9dc1-4b0f-890b-6a979a5983fe\") " Apr 21 04:33:29.869318 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:29.869293 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7628a44-9dc1-4b0f-890b-6a979a5983fe-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "a7628a44-9dc1-4b0f-890b-6a979a5983fe" (UID: "a7628a44-9dc1-4b0f-890b-6a979a5983fe"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:33:29.871296 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:29.871271 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7628a44-9dc1-4b0f-890b-6a979a5983fe-kube-api-access-g44st" (OuterVolumeSpecName: "kube-api-access-g44st") pod "a7628a44-9dc1-4b0f-890b-6a979a5983fe" (UID: "a7628a44-9dc1-4b0f-890b-6a979a5983fe"). InnerVolumeSpecName "kube-api-access-g44st". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:33:29.969738 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:29.969704 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g44st\" (UniqueName: \"kubernetes.io/projected/a7628a44-9dc1-4b0f-890b-6a979a5983fe-kube-api-access-g44st\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:33:29.969738 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:29.969733 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a7628a44-9dc1-4b0f-890b-6a979a5983fe-extensions-socket-volume\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\"" Apr 21 04:33:30.513450 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:30.513412 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7628a44-9dc1-4b0f-890b-6a979a5983fe" path="/var/lib/kubelet/pods/a7628a44-9dc1-4b0f-890b-6a979a5983fe/volumes" Apr 21 04:33:30.555725 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:30.555689 2579 generic.go:358] "Generic (PLEG): container finished" podID="a7628a44-9dc1-4b0f-890b-6a979a5983fe" containerID="0a354f0b74ea3c6b572623bc629c28c5abac7e04c84fd5e90ec08255e27b6f12" exitCode=2 Apr 21 04:33:30.556203 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:30.555737 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v" Apr 21 04:33:30.556203 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:30.555782 2579 scope.go:117] "RemoveContainer" containerID="0a354f0b74ea3c6b572623bc629c28c5abac7e04c84fd5e90ec08255e27b6f12" Apr 21 04:33:30.563559 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:30.558594 2579 status_manager.go:895] "Failed to get status for pod" podUID="a7628a44-9dc1-4b0f-890b-6a979a5983fe" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v" err="pods \"kuadrant-operator-controller-manager-84b657d985-lq78v\" is forbidden: User \"system:node:ip-10-0-134-45.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-45.ec2.internal' and this object" Apr 21 04:33:30.563559 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:30.561581 2579 status_manager.go:895] "Failed to get status for pod" podUID="a7628a44-9dc1-4b0f-890b-6a979a5983fe" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-lq78v" err="pods \"kuadrant-operator-controller-manager-84b657d985-lq78v\" is forbidden: User \"system:node:ip-10-0-134-45.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-45.ec2.internal' and this object" Apr 21 04:33:30.572043 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:30.572019 2579 scope.go:117] "RemoveContainer" containerID="0a354f0b74ea3c6b572623bc629c28c5abac7e04c84fd5e90ec08255e27b6f12" Apr 21 04:33:30.572377 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:33:30.572354 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a354f0b74ea3c6b572623bc629c28c5abac7e04c84fd5e90ec08255e27b6f12\": container with ID starting with 0a354f0b74ea3c6b572623bc629c28c5abac7e04c84fd5e90ec08255e27b6f12 not found: 
ID does not exist" containerID="0a354f0b74ea3c6b572623bc629c28c5abac7e04c84fd5e90ec08255e27b6f12" Apr 21 04:33:30.572433 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:30.572386 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a354f0b74ea3c6b572623bc629c28c5abac7e04c84fd5e90ec08255e27b6f12"} err="failed to get container status \"0a354f0b74ea3c6b572623bc629c28c5abac7e04c84fd5e90ec08255e27b6f12\": rpc error: code = NotFound desc = could not find container \"0a354f0b74ea3c6b572623bc629c28c5abac7e04c84fd5e90ec08255e27b6f12\": container with ID starting with 0a354f0b74ea3c6b572623bc629c28c5abac7e04c84fd5e90ec08255e27b6f12 not found: ID does not exist" Apr 21 04:33:40.558357 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:40.558322 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" Apr 21 04:33:49.611238 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.611194 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-569bf775d7-5kphr" podUID="efcb6cee-ac1d-413a-a627-9420a7b72b5b" containerName="console" containerID="cri-o://529989a1c6fe215da95e258852092da3fd10b8aaad40e309de7900da6175a3ef" gracePeriod=15 Apr 21 04:33:49.698466 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.698430 2579 patch_prober.go:28] interesting pod/console-569bf775d7-5kphr container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.23:8443/health\": dial tcp 10.134.0.23:8443: connect: connection refused" start-of-body= Apr 21 04:33:49.698658 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.698499 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-569bf775d7-5kphr" podUID="efcb6cee-ac1d-413a-a627-9420a7b72b5b" containerName="console" probeResult="failure" output="Get \"https://10.134.0.23:8443/health\": dial tcp 
10.134.0.23:8443: connect: connection refused" Apr 21 04:33:49.867593 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.867523 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-569bf775d7-5kphr_efcb6cee-ac1d-413a-a627-9420a7b72b5b/console/0.log" Apr 21 04:33:49.867741 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.867605 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-569bf775d7-5kphr" Apr 21 04:33:49.956668 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.956622 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpgkx\" (UniqueName: \"kubernetes.io/projected/efcb6cee-ac1d-413a-a627-9420a7b72b5b-kube-api-access-qpgkx\") pod \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " Apr 21 04:33:49.956846 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.956686 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-config\") pod \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " Apr 21 04:33:49.956846 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.956730 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-service-ca\") pod \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " Apr 21 04:33:49.956846 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.956762 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-trusted-ca-bundle\") pod \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") " 
Apr 21 04:33:49.956846 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.956798 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-oauth-config\") pod \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") "
Apr 21 04:33:49.957105 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.956854 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-oauth-serving-cert\") pod \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") "
Apr 21 04:33:49.957105 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.956880 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-serving-cert\") pod \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\" (UID: \"efcb6cee-ac1d-413a-a627-9420a7b72b5b\") "
Apr 21 04:33:49.957245 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.957218 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-config" (OuterVolumeSpecName: "console-config") pod "efcb6cee-ac1d-413a-a627-9420a7b72b5b" (UID: "efcb6cee-ac1d-413a-a627-9420a7b72b5b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:33:49.957297 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.957235 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-service-ca" (OuterVolumeSpecName: "service-ca") pod "efcb6cee-ac1d-413a-a627-9420a7b72b5b" (UID: "efcb6cee-ac1d-413a-a627-9420a7b72b5b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:33:49.957297 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.957254 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "efcb6cee-ac1d-413a-a627-9420a7b72b5b" (UID: "efcb6cee-ac1d-413a-a627-9420a7b72b5b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:33:49.957367 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.957299 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "efcb6cee-ac1d-413a-a627-9420a7b72b5b" (UID: "efcb6cee-ac1d-413a-a627-9420a7b72b5b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:33:49.959176 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.959145 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efcb6cee-ac1d-413a-a627-9420a7b72b5b-kube-api-access-qpgkx" (OuterVolumeSpecName: "kube-api-access-qpgkx") pod "efcb6cee-ac1d-413a-a627-9420a7b72b5b" (UID: "efcb6cee-ac1d-413a-a627-9420a7b72b5b"). InnerVolumeSpecName "kube-api-access-qpgkx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:33:49.959176 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.959148 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "efcb6cee-ac1d-413a-a627-9420a7b72b5b" (UID: "efcb6cee-ac1d-413a-a627-9420a7b72b5b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:33:49.959347 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:49.959221 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "efcb6cee-ac1d-413a-a627-9420a7b72b5b" (UID: "efcb6cee-ac1d-413a-a627-9420a7b72b5b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:33:50.057773 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.057736 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-service-ca\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:33:50.057773 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.057774 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-trusted-ca-bundle\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:33:50.058268 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.057787 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-oauth-config\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:33:50.058268 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.057800 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-oauth-serving-cert\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:33:50.058268 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.057814 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-serving-cert\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:33:50.058268 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.057827 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qpgkx\" (UniqueName: \"kubernetes.io/projected/efcb6cee-ac1d-413a-a627-9420a7b72b5b-kube-api-access-qpgkx\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:33:50.058268 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.057843 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efcb6cee-ac1d-413a-a627-9420a7b72b5b-console-config\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:33:50.633620 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.633593 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-569bf775d7-5kphr_efcb6cee-ac1d-413a-a627-9420a7b72b5b/console/0.log"
Apr 21 04:33:50.634074 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.633633 2579 generic.go:358] "Generic (PLEG): container finished" podID="efcb6cee-ac1d-413a-a627-9420a7b72b5b" containerID="529989a1c6fe215da95e258852092da3fd10b8aaad40e309de7900da6175a3ef" exitCode=2
Apr 21 04:33:50.634074 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.633667 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-569bf775d7-5kphr" event={"ID":"efcb6cee-ac1d-413a-a627-9420a7b72b5b","Type":"ContainerDied","Data":"529989a1c6fe215da95e258852092da3fd10b8aaad40e309de7900da6175a3ef"}
Apr 21 04:33:50.634074 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.633699 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-569bf775d7-5kphr"
Apr 21 04:33:50.634074 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.633718 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-569bf775d7-5kphr" event={"ID":"efcb6cee-ac1d-413a-a627-9420a7b72b5b","Type":"ContainerDied","Data":"389554b53e078364cdd2011b35b55aef5089c2e32a9a69b33f56a37ede1e989c"}
Apr 21 04:33:50.634074 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.633742 2579 scope.go:117] "RemoveContainer" containerID="529989a1c6fe215da95e258852092da3fd10b8aaad40e309de7900da6175a3ef"
Apr 21 04:33:50.642438 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.642423 2579 scope.go:117] "RemoveContainer" containerID="529989a1c6fe215da95e258852092da3fd10b8aaad40e309de7900da6175a3ef"
Apr 21 04:33:50.642685 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:33:50.642666 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"529989a1c6fe215da95e258852092da3fd10b8aaad40e309de7900da6175a3ef\": container with ID starting with 529989a1c6fe215da95e258852092da3fd10b8aaad40e309de7900da6175a3ef not found: ID does not exist" containerID="529989a1c6fe215da95e258852092da3fd10b8aaad40e309de7900da6175a3ef"
Apr 21 04:33:50.642735 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.642695 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"529989a1c6fe215da95e258852092da3fd10b8aaad40e309de7900da6175a3ef"} err="failed to get container status \"529989a1c6fe215da95e258852092da3fd10b8aaad40e309de7900da6175a3ef\": rpc error: code = NotFound desc = could not find container \"529989a1c6fe215da95e258852092da3fd10b8aaad40e309de7900da6175a3ef\": container with ID starting with 529989a1c6fe215da95e258852092da3fd10b8aaad40e309de7900da6175a3ef not found: ID does not exist"
Apr 21 04:33:50.650791 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.650769 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-569bf775d7-5kphr"]
Apr 21 04:33:50.654432 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:50.654412 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-569bf775d7-5kphr"]
Apr 21 04:33:52.512898 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:52.512864 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efcb6cee-ac1d-413a-a627-9420a7b72b5b" path="/var/lib/kubelet/pods/efcb6cee-ac1d-413a-a627-9420a7b72b5b/volumes"
Apr 21 04:33:58.222209 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.222168 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62"]
Apr 21 04:33:58.222673 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.222462 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" podUID="7fdac87b-2831-46c3-85ef-d6c26b42f7e2" containerName="manager" containerID="cri-o://122238f3e369728afd007e970c4af2d4d26dd2b1ce5b9dbf25c41f60f1387e4c" gracePeriod=10
Apr 21 04:33:58.494117 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.494044 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62"
Apr 21 04:33:58.536061 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.536029 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdndq\" (UniqueName: \"kubernetes.io/projected/7fdac87b-2831-46c3-85ef-d6c26b42f7e2-kube-api-access-wdndq\") pod \"7fdac87b-2831-46c3-85ef-d6c26b42f7e2\" (UID: \"7fdac87b-2831-46c3-85ef-d6c26b42f7e2\") "
Apr 21 04:33:58.536061 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.536071 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7fdac87b-2831-46c3-85ef-d6c26b42f7e2-extensions-socket-volume\") pod \"7fdac87b-2831-46c3-85ef-d6c26b42f7e2\" (UID: \"7fdac87b-2831-46c3-85ef-d6c26b42f7e2\") "
Apr 21 04:33:58.536590 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.536565 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fdac87b-2831-46c3-85ef-d6c26b42f7e2-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "7fdac87b-2831-46c3-85ef-d6c26b42f7e2" (UID: "7fdac87b-2831-46c3-85ef-d6c26b42f7e2"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:33:58.538328 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.538298 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fdac87b-2831-46c3-85ef-d6c26b42f7e2-kube-api-access-wdndq" (OuterVolumeSpecName: "kube-api-access-wdndq") pod "7fdac87b-2831-46c3-85ef-d6c26b42f7e2" (UID: "7fdac87b-2831-46c3-85ef-d6c26b42f7e2"). InnerVolumeSpecName "kube-api-access-wdndq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:33:58.636873 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.636834 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wdndq\" (UniqueName: \"kubernetes.io/projected/7fdac87b-2831-46c3-85ef-d6c26b42f7e2-kube-api-access-wdndq\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:33:58.636873 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.636870 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7fdac87b-2831-46c3-85ef-d6c26b42f7e2-extensions-socket-volume\") on node \"ip-10-0-134-45.ec2.internal\" DevicePath \"\""
Apr 21 04:33:58.664235 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.664198 2579 generic.go:358] "Generic (PLEG): container finished" podID="7fdac87b-2831-46c3-85ef-d6c26b42f7e2" containerID="122238f3e369728afd007e970c4af2d4d26dd2b1ce5b9dbf25c41f60f1387e4c" exitCode=0
Apr 21 04:33:58.664376 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.664271 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62"
Apr 21 04:33:58.664376 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.664283 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" event={"ID":"7fdac87b-2831-46c3-85ef-d6c26b42f7e2","Type":"ContainerDied","Data":"122238f3e369728afd007e970c4af2d4d26dd2b1ce5b9dbf25c41f60f1387e4c"}
Apr 21 04:33:58.664376 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.664324 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62" event={"ID":"7fdac87b-2831-46c3-85ef-d6c26b42f7e2","Type":"ContainerDied","Data":"99f1426c449d89151a37ea3ee12a6f17e324514b63164a488e1bf450e53f70bb"}
Apr 21 04:33:58.664376 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.664342 2579 scope.go:117] "RemoveContainer" containerID="122238f3e369728afd007e970c4af2d4d26dd2b1ce5b9dbf25c41f60f1387e4c"
Apr 21 04:33:58.677650 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.677631 2579 scope.go:117] "RemoveContainer" containerID="122238f3e369728afd007e970c4af2d4d26dd2b1ce5b9dbf25c41f60f1387e4c"
Apr 21 04:33:58.678214 ip-10-0-134-45 kubenswrapper[2579]: E0421 04:33:58.678189 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"122238f3e369728afd007e970c4af2d4d26dd2b1ce5b9dbf25c41f60f1387e4c\": container with ID starting with 122238f3e369728afd007e970c4af2d4d26dd2b1ce5b9dbf25c41f60f1387e4c not found: ID does not exist" containerID="122238f3e369728afd007e970c4af2d4d26dd2b1ce5b9dbf25c41f60f1387e4c"
Apr 21 04:33:58.678314 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.678224 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"122238f3e369728afd007e970c4af2d4d26dd2b1ce5b9dbf25c41f60f1387e4c"} err="failed to get container status \"122238f3e369728afd007e970c4af2d4d26dd2b1ce5b9dbf25c41f60f1387e4c\": rpc error: code = NotFound desc = could not find container \"122238f3e369728afd007e970c4af2d4d26dd2b1ce5b9dbf25c41f60f1387e4c\": container with ID starting with 122238f3e369728afd007e970c4af2d4d26dd2b1ce5b9dbf25c41f60f1387e4c not found: ID does not exist"
Apr 21 04:33:58.689066 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.689030 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62"]
Apr 21 04:33:58.697641 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:33:58.697610 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rxf62"]
Apr 21 04:34:00.513933 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:00.513903 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fdac87b-2831-46c3-85ef-d6c26b42f7e2" path="/var/lib/kubelet/pods/7fdac87b-2831-46c3-85ef-d6c26b42f7e2/volumes"
Apr 21 04:34:08.436759 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:08.436731 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/2.log"
Apr 21 04:34:08.437317 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:08.436855 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/2.log"
Apr 21 04:34:08.440550 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:08.440529 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/ovn-acl-logging/0.log"
Apr 21 04:34:08.440786 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:08.440766 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/ovn-acl-logging/0.log"
Apr 21 04:34:16.474832 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.474787 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-4wv7p"]
Apr 21 04:34:16.475401 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.475380 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="efcb6cee-ac1d-413a-a627-9420a7b72b5b" containerName="console"
Apr 21 04:34:16.475482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.475404 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="efcb6cee-ac1d-413a-a627-9420a7b72b5b" containerName="console"
Apr 21 04:34:16.475482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.475427 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7628a44-9dc1-4b0f-890b-6a979a5983fe" containerName="manager"
Apr 21 04:34:16.475482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.475436 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7628a44-9dc1-4b0f-890b-6a979a5983fe" containerName="manager"
Apr 21 04:34:16.475482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.475446 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7fdac87b-2831-46c3-85ef-d6c26b42f7e2" containerName="manager"
Apr 21 04:34:16.475482 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.475456 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fdac87b-2831-46c3-85ef-d6c26b42f7e2" containerName="manager"
Apr 21 04:34:16.475714 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.475574 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="efcb6cee-ac1d-413a-a627-9420a7b72b5b" containerName="console"
Apr 21 04:34:16.475714 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.475590 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7628a44-9dc1-4b0f-890b-6a979a5983fe" containerName="manager"
Apr 21 04:34:16.475714 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.475601 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="7fdac87b-2831-46c3-85ef-d6c26b42f7e2" containerName="manager"
Apr 21 04:34:16.482419 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.482394 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-4wv7p"
Apr 21 04:34:16.485005 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.484957 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 21 04:34:16.485580 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.485558 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-cbpgz\""
Apr 21 04:34:16.487225 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.487197 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-4wv7p"]
Apr 21 04:34:16.513598 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.513563 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-4wv7p"]
Apr 21 04:34:16.596211 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.596159 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbfgl\" (UniqueName: \"kubernetes.io/projected/f33d42ee-30bb-4cd7-b436-cfbfa29e22cf-kube-api-access-sbfgl\") pod \"limitador-limitador-78c99df468-4wv7p\" (UID: \"f33d42ee-30bb-4cd7-b436-cfbfa29e22cf\") " pod="kuadrant-system/limitador-limitador-78c99df468-4wv7p"
Apr 21 04:34:16.596405 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.596268 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f33d42ee-30bb-4cd7-b436-cfbfa29e22cf-config-file\") pod \"limitador-limitador-78c99df468-4wv7p\" (UID: \"f33d42ee-30bb-4cd7-b436-cfbfa29e22cf\") " pod="kuadrant-system/limitador-limitador-78c99df468-4wv7p"
Apr 21 04:34:16.697103 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.697065 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f33d42ee-30bb-4cd7-b436-cfbfa29e22cf-config-file\") pod \"limitador-limitador-78c99df468-4wv7p\" (UID: \"f33d42ee-30bb-4cd7-b436-cfbfa29e22cf\") " pod="kuadrant-system/limitador-limitador-78c99df468-4wv7p"
Apr 21 04:34:16.697282 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.697206 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbfgl\" (UniqueName: \"kubernetes.io/projected/f33d42ee-30bb-4cd7-b436-cfbfa29e22cf-kube-api-access-sbfgl\") pod \"limitador-limitador-78c99df468-4wv7p\" (UID: \"f33d42ee-30bb-4cd7-b436-cfbfa29e22cf\") " pod="kuadrant-system/limitador-limitador-78c99df468-4wv7p"
Apr 21 04:34:16.697683 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.697663 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f33d42ee-30bb-4cd7-b436-cfbfa29e22cf-config-file\") pod \"limitador-limitador-78c99df468-4wv7p\" (UID: \"f33d42ee-30bb-4cd7-b436-cfbfa29e22cf\") " pod="kuadrant-system/limitador-limitador-78c99df468-4wv7p"
Apr 21 04:34:16.706286 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.706254 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbfgl\" (UniqueName: \"kubernetes.io/projected/f33d42ee-30bb-4cd7-b436-cfbfa29e22cf-kube-api-access-sbfgl\") pod \"limitador-limitador-78c99df468-4wv7p\" (UID: \"f33d42ee-30bb-4cd7-b436-cfbfa29e22cf\") " pod="kuadrant-system/limitador-limitador-78c99df468-4wv7p"
Apr 21 04:34:16.793900 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.793802 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-4wv7p"
Apr 21 04:34:16.923816 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:16.923625 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-4wv7p"]
Apr 21 04:34:16.926914 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:34:16.926885 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf33d42ee_30bb_4cd7_b436_cfbfa29e22cf.slice/crio-6677a411338522bba97890777a827f42dd9b5940fd99e903b299feaeb4d89c39 WatchSource:0}: Error finding container 6677a411338522bba97890777a827f42dd9b5940fd99e903b299feaeb4d89c39: Status 404 returned error can't find the container with id 6677a411338522bba97890777a827f42dd9b5940fd99e903b299feaeb4d89c39
Apr 21 04:34:17.744800 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:17.744758 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-4wv7p" event={"ID":"f33d42ee-30bb-4cd7-b436-cfbfa29e22cf","Type":"ContainerStarted","Data":"6677a411338522bba97890777a827f42dd9b5940fd99e903b299feaeb4d89c39"}
Apr 21 04:34:19.754331 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:19.754292 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-4wv7p" event={"ID":"f33d42ee-30bb-4cd7-b436-cfbfa29e22cf","Type":"ContainerStarted","Data":"e01c79e366f18c37ba82f5e4c936bd2a71e73f99f283f28f48c922c4840bc603"}
Apr 21 04:34:19.754774 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:19.754475 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-4wv7p"
Apr 21 04:34:19.774949 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:19.774892 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-4wv7p" podStartSLOduration=1.397881247 podStartE2EDuration="3.774878363s" podCreationTimestamp="2026-04-21 04:34:16 +0000 UTC" firstStartedPulling="2026-04-21 04:34:16.929116505 +0000 UTC m=+608.924514107" lastFinishedPulling="2026-04-21 04:34:19.306113605 +0000 UTC m=+611.301511223" observedRunningTime="2026-04-21 04:34:19.773091141 +0000 UTC m=+611.768488764" watchObservedRunningTime="2026-04-21 04:34:19.774878363 +0000 UTC m=+611.770275987"
Apr 21 04:34:30.758959 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:30.758926 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-4wv7p"
Apr 21 04:34:54.623791 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:54.623754 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-lq8tm"]
Apr 21 04:34:54.631500 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:54.631480 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-lq8tm"
Apr 21 04:34:54.635207 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:54.635182 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\""
Apr 21 04:34:54.635388 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:54.635371 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\""
Apr 21 04:34:54.636180 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:54.636165 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"keycloak-operator-dockercfg-cz8kg\""
Apr 21 04:34:54.640325 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:54.640305 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-lq8tm"]
Apr 21 04:34:54.744339 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:54.744298 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdqsq\" (UniqueName: \"kubernetes.io/projected/1d750638-4390-4047-a702-848c12df7db2-kube-api-access-kdqsq\") pod \"keycloak-operator-5c4df598dd-lq8tm\" (UID: \"1d750638-4390-4047-a702-848c12df7db2\") " pod="keycloak-system/keycloak-operator-5c4df598dd-lq8tm"
Apr 21 04:34:54.844832 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:54.844793 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdqsq\" (UniqueName: \"kubernetes.io/projected/1d750638-4390-4047-a702-848c12df7db2-kube-api-access-kdqsq\") pod \"keycloak-operator-5c4df598dd-lq8tm\" (UID: \"1d750638-4390-4047-a702-848c12df7db2\") " pod="keycloak-system/keycloak-operator-5c4df598dd-lq8tm"
Apr 21 04:34:54.853583 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:54.853546 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdqsq\" (UniqueName: \"kubernetes.io/projected/1d750638-4390-4047-a702-848c12df7db2-kube-api-access-kdqsq\") pod \"keycloak-operator-5c4df598dd-lq8tm\" (UID: \"1d750638-4390-4047-a702-848c12df7db2\") " pod="keycloak-system/keycloak-operator-5c4df598dd-lq8tm"
Apr 21 04:34:54.943131 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:54.943029 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-lq8tm"
Apr 21 04:34:55.067041 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:55.067013 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-lq8tm"]
Apr 21 04:34:55.068816 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:34:55.068787 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d750638_4390_4047_a702_848c12df7db2.slice/crio-dba9127e706fda9ea80dc42299b550dc5fa14d586243a135ba6230418aac8dfb WatchSource:0}: Error finding container dba9127e706fda9ea80dc42299b550dc5fa14d586243a135ba6230418aac8dfb: Status 404 returned error can't find the container with id dba9127e706fda9ea80dc42299b550dc5fa14d586243a135ba6230418aac8dfb
Apr 21 04:34:55.070250 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:55.070233 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 04:34:55.885289 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:34:55.885243 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-lq8tm" event={"ID":"1d750638-4390-4047-a702-848c12df7db2","Type":"ContainerStarted","Data":"dba9127e706fda9ea80dc42299b550dc5fa14d586243a135ba6230418aac8dfb"}
Apr 21 04:35:00.905967 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:35:00.905862 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-lq8tm" event={"ID":"1d750638-4390-4047-a702-848c12df7db2","Type":"ContainerStarted","Data":"80d5813fc47b55671c67924540108b5c8e895065d6c5b9a6819528c2f405f529"}
Apr 21 04:35:00.922119 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:35:00.922066 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/keycloak-operator-5c4df598dd-lq8tm" podStartSLOduration=1.378557396 podStartE2EDuration="6.922049024s" podCreationTimestamp="2026-04-21 04:34:54 +0000 UTC" firstStartedPulling="2026-04-21 04:34:55.070354409 +0000 UTC m=+647.065752012" lastFinishedPulling="2026-04-21 04:35:00.613846038 +0000 UTC m=+652.609243640" observedRunningTime="2026-04-21 04:35:00.920053456 +0000 UTC m=+652.915451081" watchObservedRunningTime="2026-04-21 04:35:00.922049024 +0000 UTC m=+652.917446648"
Apr 21 04:36:05.782345 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:36:05.782263 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-4wv7p"]
Apr 21 04:37:09.153383 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:37:09.153345 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-4wv7p"]
Apr 21 04:37:13.156105 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:37:13.156062 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-4wv7p"]
Apr 21 04:37:20.856510 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:37:20.856474 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-4wv7p"]
Apr 21 04:37:26.294579 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:37:26.294536 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-4wv7p"]
Apr 21 04:37:36.559124 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:37:36.559082 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-4wv7p"]
Apr 21 04:39:08.461651 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:08.461572 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/2.log"
Apr 21 04:39:08.464022 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:08.463974 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/2.log"
Apr 21 04:39:08.465548 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:08.465528 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/ovn-acl-logging/0.log"
Apr 21 04:39:08.467569 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:08.467550 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/ovn-acl-logging/0.log"
Apr 21 04:39:16.660272 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:16.660245 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-h8xfm_feeb4dd4-c152-4989-9569-964252ccbd76/manager/0.log"
Apr 21 04:39:17.000493 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:17.000410 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7m6dg_43abd5d6-298d-42b7-8e95-1fe108abadf7/manager/1.log"
Apr 21 04:39:17.219510 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:17.219479 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5b6f69cdb8-prmx2_fb165b48-6a27-452d-9a33-c95836b74e28/manager/0.log"
Apr 21 04:39:18.774524 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:18.774488 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-7g84k_f19c1b62-0615-405a-932b-b479e46dd822/manager/0.log"
Apr 21 04:39:19.323247 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:19.323210 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-4wv7p_f33d42ee-30bb-4cd7-b436-cfbfa29e22cf/limitador/0.log"
Apr 21 04:39:19.885031 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:19.884998 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-v8rjc_c95c039e-0245-4c9d-b4b1-2e3b4132f997/discovery/0.log"
Apr 21 04:39:19.992723 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:19.992696 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5f98864f9-b9mm6_4dd96334-68c0-4483-956b-259dfb8cb842/kube-auth-proxy/0.log"
Apr 21 04:39:20.314873 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:20.314795 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-f8d5bbc98-2ffvt_e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5/router/0.log"
Apr 21 04:39:24.664445 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:24.664413 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pv5rw/must-gather-9r8mg"]
Apr 21 04:39:24.668383 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:24.668362 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pv5rw/must-gather-9r8mg"
Apr 21 04:39:24.670992 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:24.670956 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pv5rw\"/\"openshift-service-ca.crt\""
Apr 21 04:39:24.671203 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:24.671185 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pv5rw\"/\"kube-root-ca.crt\""
Apr 21 04:39:24.671949 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:24.671930 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-pv5rw\"/\"default-dockercfg-zc9pn\""
Apr 21 04:39:24.674090 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:24.674065 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pv5rw/must-gather-9r8mg"]
Apr 21 04:39:24.812570 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:24.812539 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6jdt\" (UniqueName: \"kubernetes.io/projected/cc0d2b5c-77c9-45e6-ba54-c56b169ea0d8-kube-api-access-q6jdt\") pod \"must-gather-9r8mg\" (UID: \"cc0d2b5c-77c9-45e6-ba54-c56b169ea0d8\") " pod="openshift-must-gather-pv5rw/must-gather-9r8mg"
Apr 21 04:39:24.812734 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:24.812586 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cc0d2b5c-77c9-45e6-ba54-c56b169ea0d8-must-gather-output\") pod \"must-gather-9r8mg\" (UID: \"cc0d2b5c-77c9-45e6-ba54-c56b169ea0d8\") " pod="openshift-must-gather-pv5rw/must-gather-9r8mg"
Apr 21 04:39:24.913493 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:24.913459 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6jdt\" (UniqueName: \"kubernetes.io/projected/cc0d2b5c-77c9-45e6-ba54-c56b169ea0d8-kube-api-access-q6jdt\") pod \"must-gather-9r8mg\" (UID: \"cc0d2b5c-77c9-45e6-ba54-c56b169ea0d8\") " pod="openshift-must-gather-pv5rw/must-gather-9r8mg"
Apr 21 04:39:24.913659 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:24.913504 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cc0d2b5c-77c9-45e6-ba54-c56b169ea0d8-must-gather-output\") pod \"must-gather-9r8mg\" (UID: \"cc0d2b5c-77c9-45e6-ba54-c56b169ea0d8\") " pod="openshift-must-gather-pv5rw/must-gather-9r8mg"
Apr 21 04:39:24.913799 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:24.913785 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cc0d2b5c-77c9-45e6-ba54-c56b169ea0d8-must-gather-output\") pod \"must-gather-9r8mg\" (UID: \"cc0d2b5c-77c9-45e6-ba54-c56b169ea0d8\") " pod="openshift-must-gather-pv5rw/must-gather-9r8mg"
Apr 21 04:39:24.921609 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:24.921549 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6jdt\" (UniqueName: \"kubernetes.io/projected/cc0d2b5c-77c9-45e6-ba54-c56b169ea0d8-kube-api-access-q6jdt\") pod \"must-gather-9r8mg\" (UID: \"cc0d2b5c-77c9-45e6-ba54-c56b169ea0d8\") " pod="openshift-must-gather-pv5rw/must-gather-9r8mg"
Apr 21 04:39:24.978813 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:24.978779 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-pv5rw/must-gather-9r8mg" Apr 21 04:39:25.101710 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:25.101683 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pv5rw/must-gather-9r8mg"] Apr 21 04:39:25.103968 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:39:25.103941 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc0d2b5c_77c9_45e6_ba54_c56b169ea0d8.slice/crio-ca836258f51a3c6255e191920f9adf78f8f1324b3be2c8390d11c3d1768297f4 WatchSource:0}: Error finding container ca836258f51a3c6255e191920f9adf78f8f1324b3be2c8390d11c3d1768297f4: Status 404 returned error can't find the container with id ca836258f51a3c6255e191920f9adf78f8f1324b3be2c8390d11c3d1768297f4 Apr 21 04:39:25.894218 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:25.894180 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pv5rw/must-gather-9r8mg" event={"ID":"cc0d2b5c-77c9-45e6-ba54-c56b169ea0d8","Type":"ContainerStarted","Data":"ca836258f51a3c6255e191920f9adf78f8f1324b3be2c8390d11c3d1768297f4"} Apr 21 04:39:26.899448 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:26.899406 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pv5rw/must-gather-9r8mg" event={"ID":"cc0d2b5c-77c9-45e6-ba54-c56b169ea0d8","Type":"ContainerStarted","Data":"1c4c6e7520c2977b7232bc6c2c64ff41d94531b9ce659caeac35eb870d1a0d38"} Apr 21 04:39:26.899448 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:26.899454 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pv5rw/must-gather-9r8mg" event={"ID":"cc0d2b5c-77c9-45e6-ba54-c56b169ea0d8","Type":"ContainerStarted","Data":"62be03a285ee3aeb3147eb00d24d2f775d5d3e0332f4d8a165fa4f376023b75d"} Apr 21 04:39:26.916058 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:26.915975 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-pv5rw/must-gather-9r8mg" podStartSLOduration=2.090956994 podStartE2EDuration="2.915953913s" podCreationTimestamp="2026-04-21 04:39:24 +0000 UTC" firstStartedPulling="2026-04-21 04:39:25.105881634 +0000 UTC m=+917.101279238" lastFinishedPulling="2026-04-21 04:39:25.930878555 +0000 UTC m=+917.926276157" observedRunningTime="2026-04-21 04:39:26.914510779 +0000 UTC m=+918.909908437" watchObservedRunningTime="2026-04-21 04:39:26.915953913 +0000 UTC m=+918.911351535" Apr 21 04:39:27.485863 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:27.485834 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-brbpw_80c0f40e-55cf-4d32-b714-e175b15308f4/global-pull-secret-syncer/0.log" Apr 21 04:39:27.663886 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:27.663846 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xktdv_8641cd54-f81a-4c30-ba62-309b434777a4/konnectivity-agent/0.log" Apr 21 04:39:27.681338 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:27.681303 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-45.ec2.internal_ec711311d3275e119b3dff245c5b47c4/haproxy/0.log" Apr 21 04:39:32.024939 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:32.024905 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-7g84k_f19c1b62-0615-405a-932b-b479e46dd822/manager/0.log" Apr 21 04:39:32.180639 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:32.180612 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-4wv7p_f33d42ee-30bb-4cd7-b436-cfbfa29e22cf/limitador/0.log" Apr 21 04:39:33.591081 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:33.591047 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_883ea867-3996-4c55-b3de-5f56c61c1997/alertmanager/0.log" Apr 
21 04:39:33.622993 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:33.622944 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_883ea867-3996-4c55-b3de-5f56c61c1997/config-reloader/0.log" Apr 21 04:39:33.644550 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:33.644475 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_883ea867-3996-4c55-b3de-5f56c61c1997/kube-rbac-proxy-web/0.log" Apr 21 04:39:33.667846 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:33.667812 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_883ea867-3996-4c55-b3de-5f56c61c1997/kube-rbac-proxy/0.log" Apr 21 04:39:33.687836 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:33.687803 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_883ea867-3996-4c55-b3de-5f56c61c1997/kube-rbac-proxy-metric/0.log" Apr 21 04:39:33.712946 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:33.712918 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_883ea867-3996-4c55-b3de-5f56c61c1997/prom-label-proxy/0.log" Apr 21 04:39:33.732642 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:33.732613 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_883ea867-3996-4c55-b3de-5f56c61c1997/init-config-reloader/0.log" Apr 21 04:39:33.766106 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:33.766076 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-qbhw5_0bfc0f36-e3dc-4add-97d2-eef31c8c896d/cluster-monitoring-operator/0.log" Apr 21 04:39:33.999430 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:33.999358 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-mnh2b_93066b03-f6c2-4051-8789-c5995156fed3/node-exporter/0.log" Apr 21 04:39:34.020382 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:34.020352 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mnh2b_93066b03-f6c2-4051-8789-c5995156fed3/kube-rbac-proxy/0.log" Apr 21 04:39:34.042655 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:34.042629 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mnh2b_93066b03-f6c2-4051-8789-c5995156fed3/init-textfile/0.log" Apr 21 04:39:34.135169 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:34.135133 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5kxmv_344d432b-3faa-4a09-831b-42861da22f34/kube-rbac-proxy-main/0.log" Apr 21 04:39:34.157047 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:34.157012 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5kxmv_344d432b-3faa-4a09-831b-42861da22f34/kube-rbac-proxy-self/0.log" Apr 21 04:39:34.177072 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:34.177035 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5kxmv_344d432b-3faa-4a09-831b-42861da22f34/openshift-state-metrics/0.log" Apr 21 04:39:34.441676 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:34.441645 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7cbbf7d57f-m6rcg_9327c526-d54d-4f72-83db-46a507db3823/telemeter-client/0.log" Apr 21 04:39:34.461559 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:34.461525 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7cbbf7d57f-m6rcg_9327c526-d54d-4f72-83db-46a507db3823/reload/0.log" Apr 21 04:39:34.480443 ip-10-0-134-45 kubenswrapper[2579]: 
I0421 04:39:34.480410 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7cbbf7d57f-m6rcg_9327c526-d54d-4f72-83db-46a507db3823/kube-rbac-proxy/0.log" Apr 21 04:39:34.508145 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:34.508012 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d576574b4-xrm2c_183f0560-9cb9-4beb-97c2-c870a1526a12/thanos-query/0.log" Apr 21 04:39:34.526628 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:34.526597 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d576574b4-xrm2c_183f0560-9cb9-4beb-97c2-c870a1526a12/kube-rbac-proxy-web/0.log" Apr 21 04:39:34.547019 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:34.546976 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d576574b4-xrm2c_183f0560-9cb9-4beb-97c2-c870a1526a12/kube-rbac-proxy/0.log" Apr 21 04:39:34.565470 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:34.565441 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d576574b4-xrm2c_183f0560-9cb9-4beb-97c2-c870a1526a12/prom-label-proxy/0.log" Apr 21 04:39:34.584466 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:34.584438 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d576574b4-xrm2c_183f0560-9cb9-4beb-97c2-c870a1526a12/kube-rbac-proxy-rules/0.log" Apr 21 04:39:34.602600 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:34.602576 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d576574b4-xrm2c_183f0560-9cb9-4beb-97c2-c870a1526a12/kube-rbac-proxy-metrics/0.log" Apr 21 04:39:36.169436 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.169395 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n"] Apr 21 04:39:36.176943 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:39:36.176913 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.181403 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.181372 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n"] Apr 21 04:39:36.233552 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.233517 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9682c2a5-9802-46ce-a904-252bca7e97bd-proc\") pod \"perf-node-gather-daemonset-d2g9n\" (UID: \"9682c2a5-9802-46ce-a904-252bca7e97bd\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.233552 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.233559 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9682c2a5-9802-46ce-a904-252bca7e97bd-lib-modules\") pod \"perf-node-gather-daemonset-d2g9n\" (UID: \"9682c2a5-9802-46ce-a904-252bca7e97bd\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.233805 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.233661 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9682c2a5-9802-46ce-a904-252bca7e97bd-sys\") pod \"perf-node-gather-daemonset-d2g9n\" (UID: \"9682c2a5-9802-46ce-a904-252bca7e97bd\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.233805 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.233713 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fplrk\" (UniqueName: 
\"kubernetes.io/projected/9682c2a5-9802-46ce-a904-252bca7e97bd-kube-api-access-fplrk\") pod \"perf-node-gather-daemonset-d2g9n\" (UID: \"9682c2a5-9802-46ce-a904-252bca7e97bd\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.233805 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.233756 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9682c2a5-9802-46ce-a904-252bca7e97bd-podres\") pod \"perf-node-gather-daemonset-d2g9n\" (UID: \"9682c2a5-9802-46ce-a904-252bca7e97bd\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.243571 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.243544 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/2.log" Apr 21 04:39:36.249408 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.249381 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gl6cc_0f1153f0-d097-4947-a74c-4824967f6184/console-operator/3.log" Apr 21 04:39:36.335356 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.335309 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9682c2a5-9802-46ce-a904-252bca7e97bd-proc\") pod \"perf-node-gather-daemonset-d2g9n\" (UID: \"9682c2a5-9802-46ce-a904-252bca7e97bd\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.335551 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.335368 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9682c2a5-9802-46ce-a904-252bca7e97bd-lib-modules\") pod \"perf-node-gather-daemonset-d2g9n\" (UID: \"9682c2a5-9802-46ce-a904-252bca7e97bd\") " 
pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.335551 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.335434 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9682c2a5-9802-46ce-a904-252bca7e97bd-sys\") pod \"perf-node-gather-daemonset-d2g9n\" (UID: \"9682c2a5-9802-46ce-a904-252bca7e97bd\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.335551 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.335466 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fplrk\" (UniqueName: \"kubernetes.io/projected/9682c2a5-9802-46ce-a904-252bca7e97bd-kube-api-access-fplrk\") pod \"perf-node-gather-daemonset-d2g9n\" (UID: \"9682c2a5-9802-46ce-a904-252bca7e97bd\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.335551 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.335512 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9682c2a5-9802-46ce-a904-252bca7e97bd-podres\") pod \"perf-node-gather-daemonset-d2g9n\" (UID: \"9682c2a5-9802-46ce-a904-252bca7e97bd\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.335763 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.335715 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9682c2a5-9802-46ce-a904-252bca7e97bd-podres\") pod \"perf-node-gather-daemonset-d2g9n\" (UID: \"9682c2a5-9802-46ce-a904-252bca7e97bd\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.335817 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.335780 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9682c2a5-9802-46ce-a904-252bca7e97bd-proc\") 
pod \"perf-node-gather-daemonset-d2g9n\" (UID: \"9682c2a5-9802-46ce-a904-252bca7e97bd\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.335885 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.335863 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9682c2a5-9802-46ce-a904-252bca7e97bd-lib-modules\") pod \"perf-node-gather-daemonset-d2g9n\" (UID: \"9682c2a5-9802-46ce-a904-252bca7e97bd\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.335943 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.335919 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9682c2a5-9802-46ce-a904-252bca7e97bd-sys\") pod \"perf-node-gather-daemonset-d2g9n\" (UID: \"9682c2a5-9802-46ce-a904-252bca7e97bd\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.344713 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.344683 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fplrk\" (UniqueName: \"kubernetes.io/projected/9682c2a5-9802-46ce-a904-252bca7e97bd-kube-api-access-fplrk\") pod \"perf-node-gather-daemonset-d2g9n\" (UID: \"9682c2a5-9802-46ce-a904-252bca7e97bd\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.493002 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.492893 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:36.658387 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.658360 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n"] Apr 21 04:39:36.662753 ip-10-0-134-45 kubenswrapper[2579]: W0421 04:39:36.661973 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9682c2a5_9802_46ce_a904_252bca7e97bd.slice/crio-671a0f03f7287b6c85cb84f5228a517fc24976fe74258269c4e1b0cb07453cb8 WatchSource:0}: Error finding container 671a0f03f7287b6c85cb84f5228a517fc24976fe74258269c4e1b0cb07453cb8: Status 404 returned error can't find the container with id 671a0f03f7287b6c85cb84f5228a517fc24976fe74258269c4e1b0cb07453cb8 Apr 21 04:39:36.753392 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.753301 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58bff7c76-kfcch_e2122a54-56fa-482b-9f5b-d3d422d55947/console/0.log" Apr 21 04:39:36.785191 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.785006 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-l2wqz_3f33fd64-8497-4fb4-8b9b-6b34fd8564fa/download-server/0.log" Apr 21 04:39:36.951941 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.951905 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" event={"ID":"9682c2a5-9802-46ce-a904-252bca7e97bd","Type":"ContainerStarted","Data":"bbf3ae51c5c935e2effaf3ccb4cf24609ac58053d50a9a5600bfc9655906ea96"} Apr 21 04:39:36.951941 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.951942 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" 
event={"ID":"9682c2a5-9802-46ce-a904-252bca7e97bd","Type":"ContainerStarted","Data":"671a0f03f7287b6c85cb84f5228a517fc24976fe74258269c4e1b0cb07453cb8"} Apr 21 04:39:36.968655 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:36.968597 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" podStartSLOduration=0.968577713 podStartE2EDuration="968.577713ms" podCreationTimestamp="2026-04-21 04:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:39:36.968149995 +0000 UTC m=+928.963547626" watchObservedRunningTime="2026-04-21 04:39:36.968577713 +0000 UTC m=+928.963975339" Apr 21 04:39:37.956705 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:37.956663 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:38.106545 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:38.106517 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rnhvz_12bdc8fa-32f4-4a0d-85b9-2e5c19724e76/dns/0.log" Apr 21 04:39:38.124840 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:38.124815 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rnhvz_12bdc8fa-32f4-4a0d-85b9-2e5c19724e76/kube-rbac-proxy/0.log" Apr 21 04:39:38.204564 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:38.204536 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h5xvg_ee93154d-8192-4132-b610-52bffab2fc10/dns-node-resolver/0.log" Apr 21 04:39:38.810695 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:38.810667 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7fd975c7bf-tx9ck_8b40fb2d-b3c9-4470-9ef8-62415d18d8f3/registry/0.log" Apr 21 04:39:38.829201 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:39:38.829176 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cp8gk_f5fd5747-67eb-4782-b7d2-6b81f0d51528/node-ca/0.log" Apr 21 04:39:39.702202 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:39.702172 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-v8rjc_c95c039e-0245-4c9d-b4b1-2e3b4132f997/discovery/0.log" Apr 21 04:39:39.722472 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:39.722443 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5f98864f9-b9mm6_4dd96334-68c0-4483-956b-259dfb8cb842/kube-auth-proxy/0.log" Apr 21 04:39:39.790758 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:39.790726 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-f8d5bbc98-2ffvt_e407ea03-e7e2-4851-9eb4-f4d38ed0e3b5/router/0.log" Apr 21 04:39:40.253841 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:40.253808 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-p27tc_83f15b2a-86b0-4400-a5f1-ff037093ddcc/serve-healthcheck-canary/0.log" Apr 21 04:39:40.712857 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:40.712828 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-f96bf_99b02ee4-b901-4a4f-9b7a-2ddd30492de6/insights-operator/0.log" Apr 21 04:39:40.713370 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:40.712877 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-f96bf_99b02ee4-b901-4a4f-9b7a-2ddd30492de6/insights-operator/1.log" Apr 21 04:39:40.862239 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:40.862207 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mmwxg_b78187ed-6b4c-4aa8-8ae4-326e14f2342e/kube-rbac-proxy/0.log" Apr 
21 04:39:40.880043 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:40.880014 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mmwxg_b78187ed-6b4c-4aa8-8ae4-326e14f2342e/exporter/0.log" Apr 21 04:39:40.898903 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:40.898874 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mmwxg_b78187ed-6b4c-4aa8-8ae4-326e14f2342e/extractor/0.log" Apr 21 04:39:42.746899 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:42.746871 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-h8xfm_feeb4dd4-c152-4989-9569-964252ccbd76/manager/0.log" Apr 21 04:39:42.809578 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:42.809554 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7m6dg_43abd5d6-298d-42b7-8e95-1fe108abadf7/manager/0.log" Apr 21 04:39:42.822356 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:42.822323 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7m6dg_43abd5d6-298d-42b7-8e95-1fe108abadf7/manager/1.log" Apr 21 04:39:42.886873 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:42.886838 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5b6f69cdb8-prmx2_fb165b48-6a27-452d-9a33-c95836b74e28/manager/0.log" Apr 21 04:39:43.971562 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:43.971532 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-d2g9n" Apr 21 04:39:49.766449 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:49.766416 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-26djf_9f8dd91d-0597-4a80-87f6-46b919d9a0ef/kube-multus/0.log" Apr 21 04:39:49.901460 ip-10-0-134-45 
kubenswrapper[2579]: I0421 04:39:49.901428 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4dbmj_861cd341-7a89-4a04-83c3-db9adea07535/kube-multus-additional-cni-plugins/0.log" Apr 21 04:39:49.938595 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:49.938552 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4dbmj_861cd341-7a89-4a04-83c3-db9adea07535/egress-router-binary-copy/0.log" Apr 21 04:39:49.978526 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:49.978470 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4dbmj_861cd341-7a89-4a04-83c3-db9adea07535/cni-plugins/0.log" Apr 21 04:39:50.021392 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:50.021322 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4dbmj_861cd341-7a89-4a04-83c3-db9adea07535/bond-cni-plugin/0.log" Apr 21 04:39:50.061401 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:50.061375 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4dbmj_861cd341-7a89-4a04-83c3-db9adea07535/routeoverride-cni/0.log" Apr 21 04:39:50.098287 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:50.098255 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4dbmj_861cd341-7a89-4a04-83c3-db9adea07535/whereabouts-cni-bincopy/0.log" Apr 21 04:39:50.138364 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:50.138337 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4dbmj_861cd341-7a89-4a04-83c3-db9adea07535/whereabouts-cni/0.log" Apr 21 04:39:50.681732 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:50.681701 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-lkbgz_ebfdc6e1-781b-42a1-b442-ba40dfec626c/network-metrics-daemon/0.log" Apr 21 04:39:50.704965 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:50.704937 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lkbgz_ebfdc6e1-781b-42a1-b442-ba40dfec626c/kube-rbac-proxy/0.log" Apr 21 04:39:51.442241 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:51.442215 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/ovn-controller/0.log" Apr 21 04:39:51.457003 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:51.456955 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/ovn-acl-logging/0.log" Apr 21 04:39:51.461728 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:51.461708 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/ovn-acl-logging/1.log" Apr 21 04:39:51.477442 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:51.477416 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/kube-rbac-proxy-node/0.log" Apr 21 04:39:51.495711 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:51.495670 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 04:39:51.511621 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:51.511598 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/northd/0.log" Apr 21 04:39:51.534198 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:51.534165 2579 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/nbdb/0.log" Apr 21 04:39:51.552366 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:51.552343 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/sbdb/0.log" Apr 21 04:39:51.667281 ip-10-0-134-45 kubenswrapper[2579]: I0421 04:39:51.667248 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqlz8_b6a42934-9530-4770-ba35-f71911b5b3b3/ovnkube-controller/0.log"