Apr 23 16:32:40.631631 ip-10-0-134-187 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 16:32:40.631642 ip-10-0-134-187 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 16:32:40.631649 ip-10-0-134-187 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 16:32:40.631889 ip-10-0-134-187 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 16:32:50.881216 ip-10-0-134-187 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 16:32:50.881235 ip-10-0-134-187 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot d5f18954afcb41c3bcc9605b51a61539 --
Apr 23 16:35:26.142502 ip-10-0-134-187 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 16:35:26.619263 ip-10-0-134-187 kubenswrapper[2563]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:26.619263 ip-10-0-134-187 kubenswrapper[2563]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 16:35:26.619263 ip-10-0-134-187 kubenswrapper[2563]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:26.619263 ip-10-0-134-187 kubenswrapper[2563]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
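The first boot's failure chain reads bottom-up: systemd could not load an environment file, so `ExecStartPre` never ran and the unit failed with result 'resources'. A hedged sketch of how one might check which `EnvironmentFile=` paths a unit actually requires; the unit text below is a hypothetical stand-in written to a temp dir, not the real `kubelet.service` from this node:

```shell
# systemd fails a unit with result 'resources' when, among other things, a
# non-optional EnvironmentFile= it must load does not exist. An entry prefixed
# with '-' is tolerated when missing. This extracts EnvironmentFile= paths from
# a unit file and reports the hard-required ones that are absent.
tmp=$(mktemp -d)
unit="$tmp/kubelet.service"
cat > "$unit" <<'EOF'
[Service]
EnvironmentFile=/etc/kubernetes/kubelet-env
EnvironmentFile=-/etc/kubernetes/kubelet-env.optional
EOF
missing=""
for path in $(sed -n 's/^EnvironmentFile=//p' "$unit"); do
  case "$path" in
    -*) ;;                                             # '-' prefix: optional, ignored if absent
    *)  [ -e "$path" ] || missing="$missing $path" ;;  # hard requirement, fails the unit
  esac
done
echo "missing:$missing"
rm -rf "$tmp"
```

On a real node the same check would be run against `systemctl cat kubelet.service` output rather than a heredoc; the paths above are illustrative.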
Apr 23 16:35:26.619263 ip-10-0-134-187 kubenswrapper[2563]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:26.621336 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.621240 2563 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 16:35:26.623557 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623537 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:26.623557 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623553 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:26.623557 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623558 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:26.623557 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623563 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623570 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
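The deprecation warnings above all point at the same remedy: move these settings into the file passed via `--config` (here `/etc/kubernetes/kubelet.conf`, per the FLAG dump later in the log). A minimal sketch of the equivalent `KubeletConfiguration` keys; the values are illustrative, not taken from this node:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (path is an assumed example)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (amounts are assumed examples)
systemReserved:
  cpu: 500m
  memory: 1Gi
```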
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623575 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623579 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623583 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623588 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623592 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623596 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623599 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623604 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623607 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623611 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623615 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623626 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623631 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623635 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623639 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623643 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623647 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:26.623780 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623650 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623654 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623658 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623662 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623666 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623670 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623675 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623679 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623683 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623687 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623691 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623696 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623700 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623704 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623709 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623714 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623719 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623723 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623728 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:26.624562 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623732 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623736 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623743 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623749 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623755 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623760 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623764 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623768 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623773 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623777 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623782 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623785 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623789 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623794 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623798 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623802 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623806 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623810 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623814 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623819 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:26.625348 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623823 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623827 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623831 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623835 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623840 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623844 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623848 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623853 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623857 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623862 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623866 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623870 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623877 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623881 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623885 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623889 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623893 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623898 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623903 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623908 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:26.626081 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623912 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:26.626837 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623916 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:26.626837 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623921 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:26.626837 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623925 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:26.626837 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.623929 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:26.629958 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.629940 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:26.629958 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.629958 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.629964 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.629968 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.629973 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.629977 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.629982 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.629986 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.629991 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.629995 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630000 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630004 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630008 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630012 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630016 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630019 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630023 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630026 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630028 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630031 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:26.630125 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630034 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630036 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630039 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630042 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630045 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630048 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630050 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630053 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630056 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630059 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630062 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630064 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630067 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630071 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630076 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630080 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630082 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630086 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630089 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630091 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:26.630624 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630094 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630096 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630099 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630102 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630104 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630107 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630110 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630112 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630115 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630117 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630120 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630122 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630125 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630127 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630130 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630132 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630135 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630138 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630140 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630143 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:26.631135 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630146 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630148 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630151 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630153 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630156 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630158 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630161 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630165 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630168 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630171 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630173 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630176 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630179 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630181 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630184 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630187 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630190 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630193 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630197 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630199 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:26.631673 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630202 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630206 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
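The same set of unrecognized gates is logged twice above (the gate list is parsed more than once at startup), which makes the dump noisy. A sketch of collapsing the warnings into a unique, counted list; on a live node the input would come from something like `journalctl -u kubelet` (unit name assumed), while here a three-line excerpt copied from the entries above stands in:

```shell
# Reduce repeated "unrecognized feature gate" warnings to one counted line
# per gate name, most frequent first.
cat > sample.log <<'EOF'
W0423 16:35:26.623553 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
W0423 16:35:26.623563 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
W0423 16:35:26.630130 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
EOF
gates=$(grep -o 'unrecognized feature gate: [A-Za-z0-9]*' sample.log |
        sed 's/.*: //' | sort | uniq -c | sort -rn)
echo "$gates"
rm -f sample.log
```

Each output line is `<count> <gate>`, so a gate appearing once per parse pass shows up with count 2 here.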
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630210 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630213 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630216 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630219 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630314 2563 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630325 2563 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630335 2563 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630340 2563 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630345 2563 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630349 2563 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630354 2563 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630358 2563 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630361 2563 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630364 2563 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630368 2563 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630371 2563 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630375 2563 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630378 2563 flags.go:64] FLAG: --cgroup-root=""
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630381 2563 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630384 2563 flags.go:64] FLAG: --client-ca-file=""
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630387 2563 flags.go:64] FLAG: --cloud-config=""
Apr 23 16:35:26.632200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630390 2563 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630393 2563 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630397 2563 flags.go:64] FLAG: --cluster-domain=""
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630400 2563 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630403 2563 flags.go:64] FLAG: --config-dir=""
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630406 2563 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630409 2563 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630413 2563 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630418 2563 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630421 2563 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630424 2563 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630428 2563 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630431 2563 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630434 2563 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630437 2563 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630440 2563 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630444 2563 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630447 2563 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630450 2563 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630453 2563 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630456 2563 flags.go:64] FLAG: --enable-server="true"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630459 2563 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630464 2563 flags.go:64] FLAG: --event-burst="100"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630472 2563 flags.go:64] FLAG: --event-qps="50"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630475 2563 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 16:35:26.632779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630478 2563 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630481 2563 flags.go:64] FLAG: --eviction-hard=""
Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630485 2563 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630488 2563 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630491 2563 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630494 2563 flags.go:64] FLAG: --eviction-soft=""
Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630497 2563 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630500 2563 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630503 2563 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630506 2563 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630509 2563 flags.go:64] FLAG:
--fail-cgroupv1="false" Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630511 2563 flags.go:64] FLAG: --fail-swap-on="true" Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630514 2563 flags.go:64] FLAG: --feature-gates="" Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630518 2563 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630521 2563 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630524 2563 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630528 2563 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630531 2563 flags.go:64] FLAG: --healthz-port="10248" Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630534 2563 flags.go:64] FLAG: --help="false" Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630537 2563 flags.go:64] FLAG: --hostname-override="ip-10-0-134-187.ec2.internal" Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630540 2563 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630543 2563 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630546 2563 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630549 2563 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 16:35:26.633407 ip-10-0-134-187 kubenswrapper[2563]: I0423 
16:35:26.630553 2563 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630555 2563 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630558 2563 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630561 2563 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630564 2563 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630567 2563 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630571 2563 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630574 2563 flags.go:64] FLAG: --kube-reserved="" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630577 2563 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630580 2563 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630583 2563 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630586 2563 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630589 2563 flags.go:64] FLAG: --lock-file="" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630591 2563 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630594 2563 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 16:35:26.633977 
ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630597 2563 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630603 2563 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630615 2563 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630618 2563 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630621 2563 flags.go:64] FLAG: --logging-format="text" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630624 2563 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630628 2563 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630631 2563 flags.go:64] FLAG: --manifest-url="" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630634 2563 flags.go:64] FLAG: --manifest-url-header="" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630639 2563 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 16:35:26.633977 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630643 2563 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630649 2563 flags.go:64] FLAG: --max-pods="110" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630652 2563 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630655 2563 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630658 2563 
flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630661 2563 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630664 2563 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630667 2563 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630670 2563 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630678 2563 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630681 2563 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630684 2563 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630687 2563 flags.go:64] FLAG: --pod-cidr="" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630691 2563 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630696 2563 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630699 2563 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630702 2563 flags.go:64] FLAG: --pods-per-core="0" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630705 2563 flags.go:64] FLAG: --port="10250" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 
16:35:26.630708 2563 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630711 2563 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-082fc51fd694d92ec" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630714 2563 flags.go:64] FLAG: --qos-reserved="" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630717 2563 flags.go:64] FLAG: --read-only-port="10255" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630720 2563 flags.go:64] FLAG: --register-node="true" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630723 2563 flags.go:64] FLAG: --register-schedulable="true" Apr 23 16:35:26.634596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630726 2563 flags.go:64] FLAG: --register-with-taints="" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630729 2563 flags.go:64] FLAG: --registry-burst="10" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630732 2563 flags.go:64] FLAG: --registry-qps="5" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630735 2563 flags.go:64] FLAG: --reserved-cpus="" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630738 2563 flags.go:64] FLAG: --reserved-memory="" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630741 2563 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630744 2563 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630747 2563 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630750 2563 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 16:35:26.635177 ip-10-0-134-187 
kubenswrapper[2563]: I0423 16:35:26.630754 2563 flags.go:64] FLAG: --runonce="false" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630757 2563 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630760 2563 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630763 2563 flags.go:64] FLAG: --seccomp-default="false" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630765 2563 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630768 2563 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630771 2563 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630774 2563 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630777 2563 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630780 2563 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630782 2563 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630785 2563 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630788 2563 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630791 2563 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630794 2563 
flags.go:64] FLAG: --system-cgroups="" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630797 2563 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 16:35:26.635177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630822 2563 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630826 2563 flags.go:64] FLAG: --tls-cert-file="" Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630830 2563 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630834 2563 flags.go:64] FLAG: --tls-min-version="" Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630837 2563 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630840 2563 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630843 2563 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630846 2563 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630849 2563 flags.go:64] FLAG: --v="2" Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630853 2563 flags.go:64] FLAG: --version="false" Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630858 2563 flags.go:64] FLAG: --vmodule="" Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630862 2563 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.630865 2563 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 16:35:26.635794 ip-10-0-134-187 
kubenswrapper[2563]: W0423 16:35:26.630952 2563 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630956 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630960 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630963 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630967 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630970 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630973 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630976 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630979 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630982 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:35:26.635794 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630984 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630987 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630990 2563 feature_gate.go:328] unrecognized feature gate: 
VolumeGroupSnapshot Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630993 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630996 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.630998 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631001 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631004 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631007 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631009 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631012 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631015 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631017 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631020 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631022 2563 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631025 2563 feature_gate.go:328] unrecognized 
feature gate: OpenShiftPodSecurityAdmission Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631028 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631030 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631033 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631035 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 16:35:26.636357 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631038 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631040 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631043 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631045 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631048 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631051 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631053 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631057 2563 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: 
W0423 16:35:26.631059 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631063 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631066 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631069 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631071 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631074 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631076 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631079 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631081 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631084 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631086 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:35:26.636895 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631089 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631091 2563 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631094 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631097 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631099 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631102 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631105 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631107 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631110 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631112 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631115 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631117 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631119 2563 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631122 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:35:26.637380 ip-10-0-134-187 
kubenswrapper[2563]: W0423 16:35:26.631125 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631127 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631130 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631132 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631135 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631138 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:35:26.637380 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631140 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631144 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631146 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631149 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631152 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631155 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631157 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 
23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631161 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631164 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631166 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631169 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631173 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631177 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631180 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631184 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631187 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:35:26.637872 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.631190 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:35:26.638292 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.631950 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false 
ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:35:26.638329 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.638302 2563 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 16:35:26.638329 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.638318 2563 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 16:35:26.638385 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638364 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:26.638385 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638369 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:26.638385 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638373 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:26.638385 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638376 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:26.638385 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638379 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:26.638385 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638382 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:26.638385 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638385 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:26.638385 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638388 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638392 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638394 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638397 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638400 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638403 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638406 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638409 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638412 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638414 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638417 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638419 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638422 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638424 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638427 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638429 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638432 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638435 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638437 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638439 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:26.638579 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638442 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638445 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638447 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638450 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638452 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638455 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638457 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638460 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638462 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638465 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638467 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638470 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638472 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638475 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638478 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638481 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638484 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638486 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638489 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:26.639077 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638492 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638494 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638497 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638499 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638502 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638504 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638506 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638509 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638512 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638514 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638517 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638519 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638522 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638524 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638527 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638529 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638532 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638534 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638537 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638539 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:26.639554 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638542 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638544 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638548 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638552 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638555 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638559 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638563 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638567 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638570 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638572 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638575 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638577 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638580 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638583 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638586 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638589 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638591 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638594 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638597 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:26.640032 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638599 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.638611 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638705 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638709 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638711 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638714 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638717 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638719 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638722 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638724 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638727 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638729 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638732 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638735 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638737 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638739 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:26.640548 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638742 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638744 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638747 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638749 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638752 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638756 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638760 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638763 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638766 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638768 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638771 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638775 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638777 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638780 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638782 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638785 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638788 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638790 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638793 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638795 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:26.640932 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638797 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638800 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638803 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638805 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638808 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638810 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638813 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638815 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638818 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638820 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638823 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638825 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638828 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638830 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638833 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638835 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638838 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638840 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638843 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:26.641434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638845 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638848 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638850 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638853 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638855 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638858 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638861 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638863 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638865 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638868 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638870 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638873 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638876 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638880 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638882 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638885 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638887 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638890 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638892 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638895 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:26.641884 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638897 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:26.642393 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638900 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:26.642393 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638902 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:26.642393 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638905 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:26.642393 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638907 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:26.642393 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638910 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:26.642393 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638912 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:26.642393 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638915 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:26.642393 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638917 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:26.642393 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638920 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:26.642393 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638923 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:26.642393 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638925 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:26.642393 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:26.638928 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:26.642393 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.638933 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:35:26.642393 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.639745 2563 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 16:35:26.646798 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.646783 2563 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 16:35:26.648187 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.648173 2563 server.go:1019] "Starting client certificate rotation"
Apr 23 16:35:26.648273 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.648256 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 16:35:26.648309 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.648292 2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 16:35:26.676713 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.676696 2563 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 16:35:26.679191 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.679172 2563 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 16:35:26.697854 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.697831 2563 log.go:25] "Validated CRI v1 runtime API"
Apr 23 16:35:26.704599 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.704578 2563 log.go:25] "Validated CRI v1 image API"
Apr 23 16:35:26.705891 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.705866 2563 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 16:35:26.708550 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.708523 2563 fs.go:135] Filesystem UUIDs: map[6d3f1c37-5eb6-43be-9e53-44f3d67f0599:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 eab57106-da54-4026-9b03-dda6dcadc687:/dev/nvme0n1p3]
Apr 23 16:35:26.708614 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.708550 2563 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 16:35:26.709725 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.709705 2563 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 16:35:26.714734 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.714596 2563 manager.go:217] Machine: {Timestamp:2026-04-23 16:35:26.712381674 +0000 UTC m=+0.444136118 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3076881 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec208bd4c49d28f6ae7edb11a578943d SystemUUID:ec208bd4-c49d-28f6-ae7e-db11a578943d BootID:d5f18954-afcb-41c3-bcc9-605b51a61539 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:be:5a:b8:29:c1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:be:5a:b8:29:c1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:52:4e:3e:f8:f1:1b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 16:35:26.714734 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.714728 2563 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 16:35:26.714847 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.714801 2563 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 23 16:35:26.716016 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.715996 2563 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 23 16:35:26.716153 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.716018 2563 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-187.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 16:35:26.716198 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.716161 2563 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 16:35:26.716198 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.716171 2563 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 16:35:26.716198 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.716183 2563 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 16:35:26.717019 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.717008 2563 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 16:35:26.717859 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.717850 2563 state_mem.go:36] "Initialized new in-memory state store" Apr 23 16:35:26.717962 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.717954 2563 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 16:35:26.720467 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.720457 2563 kubelet.go:491] "Attempting to sync node with API server" Apr 23 16:35:26.720518 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.720475 2563 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 16:35:26.720518 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.720486 2563 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 16:35:26.720518 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.720495 2563 kubelet.go:397] "Adding apiserver pod source" Apr 23 16:35:26.720518 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.720504 2563 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 23 16:35:26.722428 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.722411 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 16:35:26.722520 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.722436 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 16:35:26.725394 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.725378 2563 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 16:35:26.726714 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.726699 2563 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 16:35:26.728543 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.728528 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 16:35:26.728605 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.728549 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 16:35:26.728605 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.728556 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 16:35:26.728605 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.728562 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 16:35:26.728605 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.728568 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 16:35:26.728605 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.728574 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 16:35:26.728605 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.728580 2563 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 23 16:35:26.728605 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.728585 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 16:35:26.728605 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.728592 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 16:35:26.728605 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.728598 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 16:35:26.728832 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.728611 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 16:35:26.728832 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.728620 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 16:35:26.730479 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.730469 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 16:35:26.730479 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.730479 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 16:35:26.733967 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.733953 2563 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 16:35:26.734046 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.733988 2563 server.go:1295] "Started kubelet" Apr 23 16:35:26.734092 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.734074 2563 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 16:35:26.734581 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.734523 2563 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 16:35:26.734687 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.734600 2563 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 16:35:26.734745 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.734704 
2563 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-187.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 16:35:26.734870 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:26.734837 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 16:35:26.734938 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:26.734923 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-187.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 16:35:26.735027 ip-10-0-134-187 systemd[1]: Started Kubernetes Kubelet. 
Apr 23 16:35:26.735571 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.735481 2563 server.go:317] "Adding debug handlers to kubelet server" Apr 23 16:35:26.735863 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.735846 2563 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 16:35:26.736549 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.736531 2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-g54cm" Apr 23 16:35:26.740734 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.740719 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 16:35:26.740825 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:26.739204 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-187.ec2.internal.18a909a3b74cf629 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-187.ec2.internal,UID:ip-10-0-134-187.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-187.ec2.internal,},FirstTimestamp:2026-04-23 16:35:26.733964841 +0000 UTC m=+0.465719265,LastTimestamp:2026-04-23 16:35:26.733964841 +0000 UTC m=+0.465719265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-187.ec2.internal,}" Apr 23 16:35:26.741257 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.741223 2563 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 16:35:26.743949 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.743929 2563 
desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 16:35:26.744218 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.744057 2563 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 16:35:26.744218 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.744220 2563 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 16:35:26.744375 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:26.744273 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-187.ec2.internal\" not found" Apr 23 16:35:26.744375 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.744317 2563 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 16:35:26.744467 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.744417 2563 factory.go:55] Registering systemd factory Apr 23 16:35:26.744467 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.744429 2563 factory.go:223] Registration of the systemd container factory successfully Apr 23 16:35:26.744973 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.744952 2563 reconstruct.go:97] "Volume reconstruction finished" Apr 23 16:35:26.744973 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.744969 2563 reconciler.go:26] "Reconciler: start to sync state" Apr 23 16:35:26.745320 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.745304 2563 factory.go:153] Registering CRI-O factory Apr 23 16:35:26.745450 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.745430 2563 factory.go:223] Registration of the crio container factory successfully Apr 23 16:35:26.745560 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.745543 2563 factory.go:103] Registering Raw factory Apr 23 16:35:26.745674 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.745647 2563 
manager.go:1196] Started watching for new ooms in manager Apr 23 16:35:26.745737 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.745465 2563 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-g54cm" Apr 23 16:35:26.746049 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:26.745939 2563 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 16:35:26.747475 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.747448 2563 manager.go:319] Starting recovery of all containers Apr 23 16:35:26.752732 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.752619 2563 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:26.755085 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:26.755064 2563 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-187.ec2.internal\" not found" node="ip-10-0-134-187.ec2.internal" Apr 23 16:35:26.756929 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.756918 2563 manager.go:324] Recovery completed Apr 23 16:35:26.760722 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.760709 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:26.762896 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.762883 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:26.762958 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.762906 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:26.762958 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.762917 2563 kubelet_node_status.go:736] "Recording event 
message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:26.763311 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.763296 2563 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 16:35:26.763311 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.763309 2563 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 16:35:26.763451 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.763327 2563 state_mem.go:36] "Initialized new in-memory state store" Apr 23 16:35:26.765825 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.765810 2563 policy_none.go:49] "None policy: Start" Apr 23 16:35:26.765904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.765829 2563 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 16:35:26.765904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.765842 2563 state_mem.go:35] "Initializing new in-memory state store" Apr 23 16:35:26.808195 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.808182 2563 manager.go:341] "Starting Device Plugin manager" Apr 23 16:35:26.817999 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:26.808213 2563 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 16:35:26.817999 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.808222 2563 server.go:85] "Starting device plugin registration server" Apr 23 16:35:26.817999 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.808429 2563 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 16:35:26.817999 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.808437 2563 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 16:35:26.817999 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.808527 2563 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 16:35:26.817999 ip-10-0-134-187 
kubenswrapper[2563]: I0423 16:35:26.808603 2563 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 16:35:26.817999 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.808611 2563 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 16:35:26.817999 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:26.809156 2563 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 16:35:26.817999 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:26.809190 2563 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-187.ec2.internal\" not found" Apr 23 16:35:26.873657 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.873614 2563 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 16:35:26.874723 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.874706 2563 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 16:35:26.874805 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.874733 2563 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 16:35:26.874805 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.874750 2563 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 23 16:35:26.874805 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.874756 2563 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 16:35:26.874805 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:26.874785 2563 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 16:35:26.880410 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.880391 2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:26.908504 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.908488 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:26.909547 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.909532 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:26.909616 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.909562 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:26.909616 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.909573 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:26.909616 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.909595 2563 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-187.ec2.internal" Apr 23 16:35:26.917310 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.917296 2563 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-187.ec2.internal" Apr 23 16:35:26.917383 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:26.917315 2563 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-187.ec2.internal\": node \"ip-10-0-134-187.ec2.internal\" not found" Apr 23 
16:35:26.941177 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:26.940015 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-187.ec2.internal\" not found" Apr 23 16:35:26.975542 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.975501 2563 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-187.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal"] Apr 23 16:35:26.975615 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.975576 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:26.976287 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.976273 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:26.976340 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.976297 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:26.976340 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.976308 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:26.977547 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.977535 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:26.977692 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.977679 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-187.ec2.internal" Apr 23 16:35:26.977732 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.977705 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:26.978155 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.978141 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:26.978205 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.978179 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:26.978205 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.978143 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:26.978294 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.978209 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:26.978294 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.978222 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:26.978294 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.978191 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:26.979465 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.979452 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal" Apr 23 16:35:26.979515 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.979475 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:26.980071 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.980057 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:26.980129 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.980083 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:26.980129 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:26.980095 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:27.003753 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:27.003729 2563 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-187.ec2.internal\" not found" node="ip-10-0-134-187.ec2.internal" Apr 23 16:35:27.008137 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:27.008123 2563 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-187.ec2.internal\" not found" node="ip-10-0-134-187.ec2.internal" Apr 23 16:35:27.040580 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:27.040563 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-187.ec2.internal\" not found" Apr 23 16:35:27.046673 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.046657 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9f2167486c68501ab6cd4222066784e7-config\") pod 
\"kube-apiserver-proxy-ip-10-0-134-187.ec2.internal\" (UID: \"9f2167486c68501ab6cd4222066784e7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-187.ec2.internal" Apr 23 16:35:27.046719 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.046679 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7b69db5b49b12174f2220cf83978dbb2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal\" (UID: \"7b69db5b49b12174f2220cf83978dbb2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal" Apr 23 16:35:27.046719 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.046697 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b69db5b49b12174f2220cf83978dbb2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal\" (UID: \"7b69db5b49b12174f2220cf83978dbb2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal" Apr 23 16:35:27.141587 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:27.141545 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-187.ec2.internal\" not found" Apr 23 16:35:27.146830 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.146816 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9f2167486c68501ab6cd4222066784e7-config\") pod \"kube-apiserver-proxy-ip-10-0-134-187.ec2.internal\" (UID: \"9f2167486c68501ab6cd4222066784e7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-187.ec2.internal" Apr 23 16:35:27.146892 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.146838 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/7b69db5b49b12174f2220cf83978dbb2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal\" (UID: \"7b69db5b49b12174f2220cf83978dbb2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal" Apr 23 16:35:27.146892 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.146856 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b69db5b49b12174f2220cf83978dbb2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal\" (UID: \"7b69db5b49b12174f2220cf83978dbb2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal" Apr 23 16:35:27.146989 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.146893 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b69db5b49b12174f2220cf83978dbb2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal\" (UID: \"7b69db5b49b12174f2220cf83978dbb2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal" Apr 23 16:35:27.146989 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.146918 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9f2167486c68501ab6cd4222066784e7-config\") pod \"kube-apiserver-proxy-ip-10-0-134-187.ec2.internal\" (UID: \"9f2167486c68501ab6cd4222066784e7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-187.ec2.internal" Apr 23 16:35:27.146989 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.146948 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7b69db5b49b12174f2220cf83978dbb2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal\" (UID: \"7b69db5b49b12174f2220cf83978dbb2\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal"
Apr 23 16:35:27.242449 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:27.242426    2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-187.ec2.internal\" not found"
Apr 23 16:35:27.305902 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.305886    2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-187.ec2.internal"
Apr 23 16:35:27.311349 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.311334    2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal"
Apr 23 16:35:27.343094 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:27.343072    2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-187.ec2.internal\" not found"
Apr 23 16:35:27.443597 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:27.443548    2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-187.ec2.internal\" not found"
Apr 23 16:35:27.544077 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:27.544057    2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-187.ec2.internal\" not found"
Apr 23 16:35:27.644610 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:27.644588    2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-187.ec2.internal\" not found"
Apr 23 16:35:27.648729 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.648715    2563 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 16:35:27.648848 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.648831    2563 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 16:35:27.648885 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.648871    2563 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 16:35:27.741711 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.741689    2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 16:35:27.744650 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:27.744628    2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-187.ec2.internal\" not found"
Apr 23 16:35:27.748356 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.748326    2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 16:30:26 +0000 UTC" deadline="2027-12-26 04:24:15.351237761 +0000 UTC"
Apr 23 16:35:27.748356 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.748354    2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14675h48m47.602887755s"
Apr 23 16:35:27.763472 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.763451    2563 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 16:35:27.797997 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:27.797960    2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f2167486c68501ab6cd4222066784e7.slice/crio-f166c9fe95200157d94e9998d16f4f578441ef2cada5f0c58c854519f0a349a6 WatchSource:0}: Error finding container f166c9fe95200157d94e9998d16f4f578441ef2cada5f0c58c854519f0a349a6: Status 404 returned error can't find the container with id f166c9fe95200157d94e9998d16f4f578441ef2cada5f0c58c854519f0a349a6
Apr 23 16:35:27.798378 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:27.798351    2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b69db5b49b12174f2220cf83978dbb2.slice/crio-052bf5011d8cc4684378af8f6bc60ace038ca8f8dbd41bdbec6c9cf32dcbedde WatchSource:0}: Error finding container 052bf5011d8cc4684378af8f6bc60ace038ca8f8dbd41bdbec6c9cf32dcbedde: Status 404 returned error can't find the container with id 052bf5011d8cc4684378af8f6bc60ace038ca8f8dbd41bdbec6c9cf32dcbedde
Apr 23 16:35:27.800347 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.800327    2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rjj2z"
Apr 23 16:35:27.802515 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.802500    2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 16:35:27.813212 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.813195    2563 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rjj2z"
Apr 23 16:35:27.845297 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:27.845277    2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-187.ec2.internal\" not found"
Apr 23 16:35:27.877119 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.877086    2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-187.ec2.internal" event={"ID":"9f2167486c68501ab6cd4222066784e7","Type":"ContainerStarted","Data":"f166c9fe95200157d94e9998d16f4f578441ef2cada5f0c58c854519f0a349a6"}
Apr 23 16:35:27.878039 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:27.878020    2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal" event={"ID":"7b69db5b49b12174f2220cf83978dbb2","Type":"ContainerStarted","Data":"052bf5011d8cc4684378af8f6bc60ace038ca8f8dbd41bdbec6c9cf32dcbedde"}
Apr 23 16:35:27.945469 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:27.945450    2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-187.ec2.internal\" not found"
Apr 23 16:35:28.028234 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.028215    2563 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:28.041767 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.041748    2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-187.ec2.internal"
Apr 23 16:35:28.053628 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.053609    2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 16:35:28.055678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.055667    2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal"
Apr 23 16:35:28.071742 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.071724    2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 16:35:28.336615 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.336214    2563 reflector.go:430]
"Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:28.484984 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.484954    2563 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:28.531172 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.531147    2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:28.722236 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.722161    2563 apiserver.go:52] "Watching apiserver"
Apr 23 16:35:28.730560 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.730537    2563 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 16:35:28.732792 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.732767    2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-w2bxv","openshift-image-registry/node-ca-9896x","openshift-multus/multus-additional-cni-plugins-4k96t","openshift-multus/multus-zghhl","openshift-network-diagnostics/network-check-target-pz92q","kube-system/konnectivity-agent-khsql","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal","openshift-multus/network-metrics-daemon-kpgxm","openshift-network-operator/iptables-alerter-qbf55","openshift-ovn-kubernetes/ovnkube-node-hc9pq","kube-system/kube-apiserver-proxy-ip-10-0-134-187.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k","openshift-cluster-node-tuning-operator/tuned-rwc2z"]
Apr 23 16:35:28.735285 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.735259    2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-khsql"
Apr 23 16:35:28.735427 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.735354    2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.737034 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.736584    2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.737968 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.737946    2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.739407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.739321    2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:28.739407 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:28.739387    2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505"
Apr 23 16:35:28.740913 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.740778    2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 16:35:28.741055 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.740935    2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qhg9q\""
Apr 23 16:35:28.741055 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.740944    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 16:35:28.741055 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.740986    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 16:35:28.742125 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.741835    2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 16:35:28.742125 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.742113    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 16:35:28.742931 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.742413    2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xvnpc\""
Apr 23 16:35:28.742931 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.742687    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 16:35:28.742931 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.742877    2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.742931 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.742920    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 16:35:28.743558 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.743540    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 16:35:28.744914 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.744821    2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:28.744914 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:28.744886    2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34"
Apr 23 16:35:28.745074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.745001    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 16:35:28.745218 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.745188    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 16:35:28.745305 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.745221    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 16:35:28.745401 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.745381    2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 16:35:28.745504 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.745487    2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vq8nn\""
Apr 23 16:35:28.745665 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.745650    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 16:35:28.745765 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.745745    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 16:35:28.745837 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.745819    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 16:35:28.745938 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.745923    2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-n6tl9\""
Apr 23 16:35:28.747740 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.747450    2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qbf55"
Apr 23 16:35:28.748909 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.748892    2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w2bxv"
Apr 23 16:35:28.749092 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.749048    2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-ccff8\""
Apr 23 16:35:28.749164 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.749054    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 16:35:28.750314 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.750296    2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9896x"
Apr 23 16:35:28.750751 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.750730    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 16:35:28.751067 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.751050    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 16:35:28.751328 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.751107    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 16:35:28.751457 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.751144    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:35:28.751610 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.751181    2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-r7btr\""
Apr 23 16:35:28.751673 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.751652    2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rwc2z"
Apr 23 16:35:28.752943 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.752925    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/765656a2-d2b1-490f-b1db-11ff6b259036-konnectivity-ca\") pod \"konnectivity-agent-khsql\" (UID: \"765656a2-d2b1-490f-b1db-11ff6b259036\") " pod="kube-system/konnectivity-agent-khsql"
Apr 23 16:35:28.753028 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.752958    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9db0f82d-4208-44c8-a818-ed7fcbd374fa-cnibin\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.753028 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.752983    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-system-cni-dir\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.753028 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753007    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-var-lib-cni-bin\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.753129 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753046    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName:
\"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-kubelet\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.753129 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753079    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-run-openvswitch\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.753129 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753106    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-var-lib-kubelet\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.753213 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753129    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-hostroot\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.753213 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753153    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wlml\" (UniqueName: \"kubernetes.io/projected/47257c1b-a9bb-4228-abc5-2ba95fa73db4-kube-api-access-7wlml\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.753213 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753180    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6znvg\" (UniqueName: \"kubernetes.io/projected/c1dab98e-8f79-4056-94f4-9185da61ca34-kube-api-access-6znvg\") pod \"network-metrics-daemon-kpgxm\" (UID: \"c1dab98e-8f79-4056-94f4-9185da61ca34\") " pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:28.753332 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753204    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-device-dir\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.753332 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753298    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9db0f82d-4208-44c8-a818-ed7fcbd374fa-system-cni-dir\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.753332 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753327    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-cnibin\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.753416 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753349    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-run-ovn-kubernetes\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.753416 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753374    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq84s\" (UniqueName: \"kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s\") pod \"network-check-target-pz92q\" (UID: \"92efbb3d-8bd0-413e-b306-331d80df0505\") " pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:28.753416 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753396    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/765656a2-d2b1-490f-b1db-11ff6b259036-agent-certs\") pod \"konnectivity-agent-khsql\" (UID: \"765656a2-d2b1-490f-b1db-11ff6b259036\") " pod="kube-system/konnectivity-agent-khsql"
Apr 23 16:35:28.753502 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753419    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-slash\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.753502 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753442    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/51af9790-dfdc-4e37-824c-072fa2141017-host-slash\") pod \"iptables-alerter-qbf55\" (UID: \"51af9790-dfdc-4e37-824c-072fa2141017\") " pod="openshift-network-operator/iptables-alerter-qbf55"
Apr 23 16:35:28.753502 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753469    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-multus-conf-dir\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.753502 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753492    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/47257c1b-a9bb-4228-abc5-2ba95fa73db4-multus-daemon-config\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.753620 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753515    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9db0f82d-4208-44c8-a818-ed7fcbd374fa-os-release\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.753620 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753540    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-var-lib-cni-multus\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.753620 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753563    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-run-multus-certs\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.753620 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753594    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.753732 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753619    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spd6l\" (UniqueName: \"kubernetes.io/projected/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-kube-api-access-spd6l\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.753732 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753646    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-run-k8s-cni-cncf-io\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.753732 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753680    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-sys-fs\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.753732 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753708    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-run-netns\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.753838 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753744    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-run-systemd\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.753838 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753767    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-log-socket\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.753838 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753794    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-cni-bin\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.753838 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753816    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-os-release\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.754003 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753844    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs\") pod \"network-metrics-daemon-kpgxm\" (UID: \"c1dab98e-8f79-4056-94f4-9185da61ca34\") " pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:28.754003 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753866    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-ovn-node-metrics-cert\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.754003 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753902    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9db0f82d-4208-44c8-a818-ed7fcbd374fa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.754003 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753929    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-multus-socket-dir-parent\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.754003 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753958    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-registration-dir\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.754003 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.753984    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv77l\" (UniqueName: \"kubernetes.io/projected/cd9ca4f8-3553-4177-ac35-7fc759b3a137-kube-api-access-qv77l\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.754303 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754009    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-etc-openvswitch\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.754303 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754047    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9db0f82d-4208-44c8-a818-ed7fcbd374fa-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.754303 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754074    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/51af9790-dfdc-4e37-824c-072fa2141017-iptables-alerter-script\") pod \"iptables-alerter-qbf55\" (UID: \"51af9790-dfdc-4e37-824c-072fa2141017\") " pod="openshift-network-operator/iptables-alerter-qbf55"
Apr 23 16:35:28.754303 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754101    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-run-netns\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.754303 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754136    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-systemd-units\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.754303 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754161    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-node-log\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.754303 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754186    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-cni-netd\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.754303 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754217    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-ovnkube-config\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.754303 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754258    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47257c1b-a9bb-4228-abc5-2ba95fa73db4-cni-binary-copy\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.754303 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754283    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-kubelet-dir\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.754769 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754305    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-env-overrides\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.754769 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754348    2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25xqk\" (UniqueName: \"kubernetes.io/projected/9db0f82d-4208-44c8-a818-ed7fcbd374fa-kube-api-access-25xqk\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.754769 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754355    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 16:35:28.754769 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754386    2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap"
reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 16:35:28.754769 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754374 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-var-lib-openvswitch\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.754769 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754458 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7946s\" (UniqueName: \"kubernetes.io/projected/51af9790-dfdc-4e37-824c-072fa2141017-kube-api-access-7946s\") pod \"iptables-alerter-qbf55\" (UID: \"51af9790-dfdc-4e37-824c-072fa2141017\") " pod="openshift-network-operator/iptables-alerter-qbf55" Apr 23 16:35:28.754769 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754542 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-multus-cni-dir\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl" Apr 23 16:35:28.754769 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754572 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 16:35:28.754769 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754605 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-ovnkube-script-lib\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" 
Apr 23 16:35:28.754769 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754653 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-socket-dir\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k" Apr 23 16:35:28.754769 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754690 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-etc-selinux\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k" Apr 23 16:35:28.754769 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754720 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-run-ovn\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.754769 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754745 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9db0f82d-4208-44c8-a818-ed7fcbd374fa-cni-binary-copy\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t" Apr 23 16:35:28.754769 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754770 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/9db0f82d-4208-44c8-a818-ed7fcbd374fa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t" Apr 23 16:35:28.755322 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.754792 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-etc-kubernetes\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl" Apr 23 16:35:28.757599 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.757575 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:35:28.757715 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.757697 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 16:35:28.757782 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.757737 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-285c2\"" Apr 23 16:35:28.758058 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.758039 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qcf7r\"" Apr 23 16:35:28.758140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.758129 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wl7qd\"" Apr 23 16:35:28.758753 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.758738 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 
16:35:28.759220 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.759201 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 16:35:28.813778 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.813746 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:27 +0000 UTC" deadline="2027-12-18 10:28:04.600590038 +0000 UTC" Apr 23 16:35:28.813778 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.813775 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14489h52m35.786818561s" Apr 23 16:35:28.845204 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.845181 2563 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 16:35:28.855906 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.855882 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-run-netns\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.856010 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.855972 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-run-netns\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.856010 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.855998 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-run-systemd\") pod \"ovnkube-node-hc9pq\" (UID: 
\"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.856120 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856032 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9kd6\" (UniqueName: \"kubernetes.io/projected/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-kube-api-access-c9kd6\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.856120 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856058 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/455d942e-c133-4e0e-9c9c-c8f16c4d5e30-host\") pod \"node-ca-9896x\" (UID: \"455d942e-c133-4e0e-9c9c-c8f16c4d5e30\") " pod="openshift-image-registry/node-ca-9896x" Apr 23 16:35:28.856120 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856083 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-log-socket\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.856120 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856097 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-run-systemd\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.856120 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856109 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-cni-bin\") pod 
\"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.856369 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856132 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-os-release\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl" Apr 23 16:35:28.856369 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856135 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-log-socket\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.856369 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856158 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-sysconfig\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.856369 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856171 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-cni-bin\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.856369 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856183 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-sysctl-d\") pod 
\"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.856369 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856207 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x8h4\" (UniqueName: \"kubernetes.io/projected/ffb20d28-4839-4bfe-aa6f-83380eb3d9be-kube-api-access-5x8h4\") pod \"node-resolver-w2bxv\" (UID: \"ffb20d28-4839-4bfe-aa6f-83380eb3d9be\") " pod="openshift-dns/node-resolver-w2bxv" Apr 23 16:35:28.856369 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856206 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-os-release\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl" Apr 23 16:35:28.856369 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856252 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs\") pod \"network-metrics-daemon-kpgxm\" (UID: \"c1dab98e-8f79-4056-94f4-9185da61ca34\") " pod="openshift-multus/network-metrics-daemon-kpgxm" Apr 23 16:35:28.856369 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856292 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-ovn-node-metrics-cert\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.856369 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856308 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/9db0f82d-4208-44c8-a818-ed7fcbd374fa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t" Apr 23 16:35:28.856369 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856322 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-multus-socket-dir-parent\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl" Apr 23 16:35:28.856369 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:28.856355 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856380 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-multus-socket-dir-parent\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl" Apr 23 16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:28.856438 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs podName:c1dab98e-8f79-4056-94f4-9185da61ca34 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:29.356418155 +0000 UTC m=+3.088172580 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs") pod "network-metrics-daemon-kpgxm" (UID: "c1dab98e-8f79-4056-94f4-9185da61ca34") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856460 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-var-lib-kubelet\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856486 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ffb20d28-4839-4bfe-aa6f-83380eb3d9be-hosts-file\") pod \"node-resolver-w2bxv\" (UID: \"ffb20d28-4839-4bfe-aa6f-83380eb3d9be\") " pod="openshift-dns/node-resolver-w2bxv" Apr 23 16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856487 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9db0f82d-4208-44c8-a818-ed7fcbd374fa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t" Apr 23 16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856521 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-registration-dir\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k" Apr 23 
16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856566 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qv77l\" (UniqueName: \"kubernetes.io/projected/cd9ca4f8-3553-4177-ac35-7fc759b3a137-kube-api-access-qv77l\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k" Apr 23 16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856612 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-etc-openvswitch\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856641 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9db0f82d-4208-44c8-a818-ed7fcbd374fa-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t" Apr 23 16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856647 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-registration-dir\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k" Apr 23 16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856670 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/51af9790-dfdc-4e37-824c-072fa2141017-iptables-alerter-script\") pod \"iptables-alerter-qbf55\" (UID: \"51af9790-dfdc-4e37-824c-072fa2141017\") " pod="openshift-network-operator/iptables-alerter-qbf55" Apr 23 16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856643 2563 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856708 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-run-netns\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl" Apr 23 16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856735 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-systemd-units\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856740 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-run-netns\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl" Apr 23 16:35:28.856904 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856709 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-etc-openvswitch\") pod \"ovnkube-node-hc9pq\" (UID: 
\"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856759 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-node-log\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856786 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-systemd-units\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856787 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-cni-netd\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856815 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-ovnkube-config\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856817 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-cni-netd\") pod \"ovnkube-node-hc9pq\" (UID: 
\"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856837 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-node-log\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856845 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47257c1b-a9bb-4228-abc5-2ba95fa73db4-cni-binary-copy\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl" Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856893 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-kubernetes\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856923 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-systemd\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856946 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856971 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-env-overrides\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.856999 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25xqk\" (UniqueName: \"kubernetes.io/projected/9db0f82d-4208-44c8-a818-ed7fcbd374fa-kube-api-access-25xqk\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857031 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-host\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z"
Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857055 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-kubelet-dir\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857102 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-var-lib-openvswitch\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857245 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/51af9790-dfdc-4e37-824c-072fa2141017-iptables-alerter-script\") pod \"iptables-alerter-qbf55\" (UID: \"51af9790-dfdc-4e37-824c-072fa2141017\") " pod="openshift-network-operator/iptables-alerter-qbf55"
Apr 23 16:35:28.857652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857259 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9db0f82d-4208-44c8-a818-ed7fcbd374fa-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857062 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-var-lib-openvswitch\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857315 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7946s\" (UniqueName: \"kubernetes.io/projected/51af9790-dfdc-4e37-824c-072fa2141017-kube-api-access-7946s\") pod \"iptables-alerter-qbf55\" (UID: \"51af9790-dfdc-4e37-824c-072fa2141017\") " pod="openshift-network-operator/iptables-alerter-qbf55"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857363 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-multus-cni-dir\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857442 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-ovnkube-config\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857442 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47257c1b-a9bb-4228-abc5-2ba95fa73db4-cni-binary-copy\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857458 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-env-overrides\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857490 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-lib-modules\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857515 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-multus-cni-dir\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857517 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/455d942e-c133-4e0e-9c9c-c8f16c4d5e30-serviceca\") pod \"node-ca-9896x\" (UID: \"455d942e-c133-4e0e-9c9c-c8f16c4d5e30\") " pod="openshift-image-registry/node-ca-9896x"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857567 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-ovnkube-script-lib\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857599 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-socket-dir\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857624 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-etc-selinux\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857741 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-etc-selinux\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857746 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-socket-dir\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857782 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-run-ovn\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857817 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9db0f82d-4208-44c8-a818-ed7fcbd374fa-cni-binary-copy\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.858456 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857844 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9db0f82d-4208-44c8-a818-ed7fcbd374fa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4k96t\" (UID:
\"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857871 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-run-ovn\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857875 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-etc-kubernetes\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857908 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-sysctl-conf\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857909 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-etc-kubernetes\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857924 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-tuned\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857942 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/765656a2-d2b1-490f-b1db-11ff6b259036-konnectivity-ca\") pod \"konnectivity-agent-khsql\" (UID: \"765656a2-d2b1-490f-b1db-11ff6b259036\") " pod="kube-system/konnectivity-agent-khsql"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857957 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9db0f82d-4208-44c8-a818-ed7fcbd374fa-cnibin\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857974 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-system-cni-dir\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.857988 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-var-lib-cni-bin\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858026 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-kubelet\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858049 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-run-openvswitch\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858073 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-var-lib-kubelet\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858086 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-ovnkube-script-lib\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858097 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-hostroot\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858118 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wlml\" (UniqueName: \"kubernetes.io/projected/47257c1b-a9bb-4228-abc5-2ba95fa73db4-kube-api-access-7wlml\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858142 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-tmp\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858168 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6znvg\" (UniqueName: \"kubernetes.io/projected/c1dab98e-8f79-4056-94f4-9185da61ca34-kube-api-access-6znvg\") pod \"network-metrics-daemon-kpgxm\" (UID: \"c1dab98e-8f79-4056-94f4-9185da61ca34\") " pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:28.859074 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858173 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9db0f82d-4208-44c8-a818-ed7fcbd374fa-cnibin\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858192 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-device-dir\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858213 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9db0f82d-4208-44c8-a818-ed7fcbd374fa-system-cni-dir\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858219 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-system-cni-dir\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858247 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-cnibin\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858278 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-run\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858296 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-sys\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858314 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName:
\"kubernetes.io/empty-dir/ffb20d28-4839-4bfe-aa6f-83380eb3d9be-tmp-dir\") pod \"node-resolver-w2bxv\" (UID: \"ffb20d28-4839-4bfe-aa6f-83380eb3d9be\") " pod="openshift-dns/node-resolver-w2bxv"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858329 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-run-ovn-kubernetes\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858347 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq84s\" (UniqueName: \"kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s\") pod \"network-check-target-pz92q\" (UID: \"92efbb3d-8bd0-413e-b306-331d80df0505\") " pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858371 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/765656a2-d2b1-490f-b1db-11ff6b259036-agent-certs\") pod \"konnectivity-agent-khsql\" (UID: \"765656a2-d2b1-490f-b1db-11ff6b259036\") " pod="kube-system/konnectivity-agent-khsql"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858401 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-slash\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858427 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/51af9790-dfdc-4e37-824c-072fa2141017-host-slash\") pod \"iptables-alerter-qbf55\" (UID: \"51af9790-dfdc-4e37-824c-072fa2141017\") " pod="openshift-network-operator/iptables-alerter-qbf55"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858453 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-multus-conf-dir\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858477 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/47257c1b-a9bb-4228-abc5-2ba95fa73db4-multus-daemon-config\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858496 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9db0f82d-4208-44c8-a818-ed7fcbd374fa-system-cni-dir\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858505 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfsw9\" (UniqueName: \"kubernetes.io/projected/455d942e-c133-4e0e-9c9c-c8f16c4d5e30-kube-api-access-gfsw9\") pod \"node-ca-9896x\" (UID: \"455d942e-c133-4e0e-9c9c-c8f16c4d5e30\") " pod="openshift-image-registry/node-ca-9896x"
Apr 23 16:35:28.859878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858534 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-run-openvswitch\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858534 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9db0f82d-4208-44c8-a818-ed7fcbd374fa-os-release\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858570 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-var-lib-cni-multus\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858604 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-run-multus-certs\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858610 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9db0f82d-4208-44c8-a818-ed7fcbd374fa-os-release\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858626 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858658 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858663 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spd6l\" (UniqueName: \"kubernetes.io/projected/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-kube-api-access-spd6l\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858691 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-run-k8s-cni-cncf-io\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858720 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-modprobe-d\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858753 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-sys-fs\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858844 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-sys-fs\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858901 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-cnibin\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858277 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-var-lib-cni-bin\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858982 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-run-ovn-kubernetes\") pod \"ovnkube-node-hc9pq\" (UID:
\"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.859026 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9db0f82d-4208-44c8-a818-ed7fcbd374fa-cni-binary-copy\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.858143 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-kubelet\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.860678 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.859111 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-var-lib-kubelet\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.861493 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.859149 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-hostroot\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.861493 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.859145 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9db0f82d-4208-44c8-a818-ed7fcbd374fa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.861493 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.859310 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cd9ca4f8-3553-4177-ac35-7fc759b3a137-device-dir\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.861493 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.859359 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-var-lib-cni-multus\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.861493 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.859370 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-multus-conf-dir\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.861493 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.859399 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-host-slash\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.861493 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.859417 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/51af9790-dfdc-4e37-824c-072fa2141017-host-slash\") pod \"iptables-alerter-qbf55\" (UID: \"51af9790-dfdc-4e37-824c-072fa2141017\") " pod="openshift-network-operator/iptables-alerter-qbf55"
Apr 23 16:35:28.861493 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.859555 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/47257c1b-a9bb-4228-abc5-2ba95fa73db4-multus-daemon-config\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.861493 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.859567 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-run-k8s-cni-cncf-io\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.861493 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.859599 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/47257c1b-a9bb-4228-abc5-2ba95fa73db4-host-run-multus-certs\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.861493 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.860203 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-ovn-node-metrics-cert\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.862482 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.862466 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/765656a2-d2b1-490f-b1db-11ff6b259036-agent-certs\") pod \"konnectivity-agent-khsql\" (UID: \"765656a2-d2b1-490f-b1db-11ff6b259036\") " pod="kube-system/konnectivity-agent-khsql"
Apr 23 16:35:28.866217 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.866192 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/765656a2-d2b1-490f-b1db-11ff6b259036-konnectivity-ca\") pod \"konnectivity-agent-khsql\" (UID: \"765656a2-d2b1-490f-b1db-11ff6b259036\") " pod="kube-system/konnectivity-agent-khsql"
Apr 23 16:35:28.874026 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:28.873943 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:28.874026 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:28.873970 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:28.874026 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:28.873986 2563 projected.go:194] Error preparing data for projected volume kube-api-access-jq84s for pod openshift-network-diagnostics/network-check-target-pz92q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:28.874238 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:28.874046 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s podName:92efbb3d-8bd0-413e-b306-331d80df0505 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:29.374028833 +0000 UTC m=+3.105783248 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jq84s" (UniqueName: "kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s") pod "network-check-target-pz92q" (UID: "92efbb3d-8bd0-413e-b306-331d80df0505") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:28.876979 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.876648 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7946s\" (UniqueName: \"kubernetes.io/projected/51af9790-dfdc-4e37-824c-072fa2141017-kube-api-access-7946s\") pod \"iptables-alerter-qbf55\" (UID: \"51af9790-dfdc-4e37-824c-072fa2141017\") " pod="openshift-network-operator/iptables-alerter-qbf55"
Apr 23 16:35:28.876979 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.876890 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wlml\" (UniqueName: \"kubernetes.io/projected/47257c1b-a9bb-4228-abc5-2ba95fa73db4-kube-api-access-7wlml\") pod \"multus-zghhl\" (UID: \"47257c1b-a9bb-4228-abc5-2ba95fa73db4\") " pod="openshift-multus/multus-zghhl"
Apr 23 16:35:28.877934 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.877909 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25xqk\" (UniqueName: \"kubernetes.io/projected/9db0f82d-4208-44c8-a818-ed7fcbd374fa-kube-api-access-25xqk\") pod \"multus-additional-cni-plugins-4k96t\" (UID: \"9db0f82d-4208-44c8-a818-ed7fcbd374fa\") " pod="openshift-multus/multus-additional-cni-plugins-4k96t"
Apr 23 16:35:28.878256 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.878221 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spd6l\" (UniqueName: \"kubernetes.io/projected/2f90e3aa-3501-4d70-8aed-0b0959ac4c07-kube-api-access-spd6l\") pod \"ovnkube-node-hc9pq\" (UID: \"2f90e3aa-3501-4d70-8aed-0b0959ac4c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:35:28.878768 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.878728 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv77l\" (UniqueName: \"kubernetes.io/projected/cd9ca4f8-3553-4177-ac35-7fc759b3a137-kube-api-access-qv77l\") pod \"aws-ebs-csi-driver-node-w7f7k\" (UID: \"cd9ca4f8-3553-4177-ac35-7fc759b3a137\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k"
Apr 23 16:35:28.879986 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.879967 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6znvg\" (UniqueName: \"kubernetes.io/projected/c1dab98e-8f79-4056-94f4-9185da61ca34-kube-api-access-6znvg\") pod \"network-metrics-daemon-kpgxm\" (UID: \"c1dab98e-8f79-4056-94f4-9185da61ca34\") " pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:28.959812 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.959703 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-var-lib-kubelet\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z"
Apr 23 16:35:28.959812 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.959739 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ffb20d28-4839-4bfe-aa6f-83380eb3d9be-hosts-file\") pod \"node-resolver-w2bxv\" (UID: \"ffb20d28-4839-4bfe-aa6f-83380eb3d9be\") " pod="openshift-dns/node-resolver-w2bxv"
Apr 23 16:35:28.959812 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.959768 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName:
\"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-var-lib-kubelet\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.959812 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.959781 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-kubernetes\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.959812 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.959810 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-systemd\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.959834 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-host\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.959857 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-lib-modules\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.959879 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/455d942e-c133-4e0e-9c9c-c8f16c4d5e30-serviceca\") pod \"node-ca-9896x\" (UID: \"455d942e-c133-4e0e-9c9c-c8f16c4d5e30\") " pod="openshift-image-registry/node-ca-9896x" Apr 23 16:35:28.960140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.959908 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-sysctl-conf\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.959931 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-tuned\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.959936 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-systemd\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.959958 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-tmp\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.959983 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-run\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960006 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-sys\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960029 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ffb20d28-4839-4bfe-aa6f-83380eb3d9be-tmp-dir\") pod \"node-resolver-w2bxv\" (UID: \"ffb20d28-4839-4bfe-aa6f-83380eb3d9be\") " pod="openshift-dns/node-resolver-w2bxv" Apr 23 16:35:28.960140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960058 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ffb20d28-4839-4bfe-aa6f-83380eb3d9be-hosts-file\") pod \"node-resolver-w2bxv\" (UID: \"ffb20d28-4839-4bfe-aa6f-83380eb3d9be\") " pod="openshift-dns/node-resolver-w2bxv" Apr 23 16:35:28.960140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960069 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfsw9\" (UniqueName: \"kubernetes.io/projected/455d942e-c133-4e0e-9c9c-c8f16c4d5e30-kube-api-access-gfsw9\") pod \"node-ca-9896x\" (UID: \"455d942e-c133-4e0e-9c9c-c8f16c4d5e30\") " pod="openshift-image-registry/node-ca-9896x" Apr 23 16:35:28.960140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960102 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-kubernetes\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960100 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-modprobe-d\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960140 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9kd6\" (UniqueName: \"kubernetes.io/projected/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-kube-api-access-c9kd6\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960804 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960162 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/455d942e-c133-4e0e-9c9c-c8f16c4d5e30-host\") pod \"node-ca-9896x\" (UID: \"455d942e-c133-4e0e-9c9c-c8f16c4d5e30\") " pod="openshift-image-registry/node-ca-9896x" Apr 23 16:35:28.960804 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960189 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-sysconfig\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960804 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960244 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-sysctl-d\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960804 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960270 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5x8h4\" (UniqueName: \"kubernetes.io/projected/ffb20d28-4839-4bfe-aa6f-83380eb3d9be-kube-api-access-5x8h4\") pod \"node-resolver-w2bxv\" (UID: \"ffb20d28-4839-4bfe-aa6f-83380eb3d9be\") " pod="openshift-dns/node-resolver-w2bxv" Apr 23 16:35:28.960804 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960610 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/455d942e-c133-4e0e-9c9c-c8f16c4d5e30-host\") pod \"node-ca-9896x\" (UID: \"455d942e-c133-4e0e-9c9c-c8f16c4d5e30\") " pod="openshift-image-registry/node-ca-9896x" Apr 23 16:35:28.960804 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960642 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-modprobe-d\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960804 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960653 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-lib-modules\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.960804 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960702 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-host\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.961131 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960862 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-sysconfig\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.961131 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960910 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-run\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.961131 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960919 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-sys\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.961131 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.960969 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ffb20d28-4839-4bfe-aa6f-83380eb3d9be-tmp-dir\") pod \"node-resolver-w2bxv\" (UID: \"ffb20d28-4839-4bfe-aa6f-83380eb3d9be\") " pod="openshift-dns/node-resolver-w2bxv" Apr 23 16:35:28.961131 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.961030 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-sysctl-d\") pod \"tuned-rwc2z\" (UID: 
\"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.961131 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.961032 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-sysctl-conf\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.961408 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.961197 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/455d942e-c133-4e0e-9c9c-c8f16c4d5e30-serviceca\") pod \"node-ca-9896x\" (UID: \"455d942e-c133-4e0e-9c9c-c8f16c4d5e30\") " pod="openshift-image-registry/node-ca-9896x" Apr 23 16:35:28.962418 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.962401 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-etc-tuned\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.962848 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.962828 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-tmp\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:28.975800 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.975087 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x8h4\" (UniqueName: \"kubernetes.io/projected/ffb20d28-4839-4bfe-aa6f-83380eb3d9be-kube-api-access-5x8h4\") pod \"node-resolver-w2bxv\" (UID: \"ffb20d28-4839-4bfe-aa6f-83380eb3d9be\") " 
pod="openshift-dns/node-resolver-w2bxv" Apr 23 16:35:28.976179 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.976157 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfsw9\" (UniqueName: \"kubernetes.io/projected/455d942e-c133-4e0e-9c9c-c8f16c4d5e30-kube-api-access-gfsw9\") pod \"node-ca-9896x\" (UID: \"455d942e-c133-4e0e-9c9c-c8f16c4d5e30\") " pod="openshift-image-registry/node-ca-9896x" Apr 23 16:35:28.978436 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:28.978417 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9kd6\" (UniqueName: \"kubernetes.io/projected/b2efb7ba-37a4-4a47-8ad7-95d2a587efd4-kube-api-access-c9kd6\") pod \"tuned-rwc2z\" (UID: \"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4\") " pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:29.047341 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.047318 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-khsql" Apr 23 16:35:29.054996 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.054966 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k" Apr 23 16:35:29.063380 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.063361 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:29.069876 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.069859 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zghhl" Apr 23 16:35:29.075372 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.075358 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4k96t" Apr 23 16:35:29.081869 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.081852 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qbf55" Apr 23 16:35:29.088354 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.088337 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w2bxv" Apr 23 16:35:29.095813 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.095791 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9896x" Apr 23 16:35:29.100281 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.100265 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" Apr 23 16:35:29.361865 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.361820 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs\") pod \"network-metrics-daemon-kpgxm\" (UID: \"c1dab98e-8f79-4056-94f4-9185da61ca34\") " pod="openshift-multus/network-metrics-daemon-kpgxm" Apr 23 16:35:29.362048 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:29.361950 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:29.362048 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:29.362021 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs podName:c1dab98e-8f79-4056-94f4-9185da61ca34 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:30.362000894 +0000 UTC m=+4.093755312 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs") pod "network-metrics-daemon-kpgxm" (UID: "c1dab98e-8f79-4056-94f4-9185da61ca34") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:29.413657 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:29.413621 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f90e3aa_3501_4d70_8aed_0b0959ac4c07.slice/crio-5460d0884f88c9d688560c8c965a9bd13e4e6b2c4f42ba9af3af6a6f673deb0e WatchSource:0}: Error finding container 5460d0884f88c9d688560c8c965a9bd13e4e6b2c4f42ba9af3af6a6f673deb0e: Status 404 returned error can't find the container with id 5460d0884f88c9d688560c8c965a9bd13e4e6b2c4f42ba9af3af6a6f673deb0e Apr 23 16:35:29.417027 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:29.417001 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod455d942e_c133_4e0e_9c9c_c8f16c4d5e30.slice/crio-b1687890ad09ed79360aac025006ef7a98aecbd7f3ff4f874d2bea1846a6735c WatchSource:0}: Error finding container b1687890ad09ed79360aac025006ef7a98aecbd7f3ff4f874d2bea1846a6735c: Status 404 returned error can't find the container with id b1687890ad09ed79360aac025006ef7a98aecbd7f3ff4f874d2bea1846a6735c Apr 23 16:35:29.417770 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:29.417688 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47257c1b_a9bb_4228_abc5_2ba95fa73db4.slice/crio-2aab536e10724bf0aeff1e151009a2d52dc85e402570a4ef4a1de7e3a3b13894 WatchSource:0}: Error finding container 2aab536e10724bf0aeff1e151009a2d52dc85e402570a4ef4a1de7e3a3b13894: Status 404 returned error can't find the container with id 2aab536e10724bf0aeff1e151009a2d52dc85e402570a4ef4a1de7e3a3b13894 Apr 23 16:35:29.420721 
ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:29.420607 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51af9790_dfdc_4e37_824c_072fa2141017.slice/crio-e7bcf66b2f0ee6412c5ecf4f78857f58ae4f3ed1e026bbfbcec82996f19504f4 WatchSource:0}: Error finding container e7bcf66b2f0ee6412c5ecf4f78857f58ae4f3ed1e026bbfbcec82996f19504f4: Status 404 returned error can't find the container with id e7bcf66b2f0ee6412c5ecf4f78857f58ae4f3ed1e026bbfbcec82996f19504f4 Apr 23 16:35:29.421366 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:29.421335 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod765656a2_d2b1_490f_b1db_11ff6b259036.slice/crio-0eb71a96a71302ce625aefed21147356d0d7ac3d463f58f5d387e1cc377b07ec WatchSource:0}: Error finding container 0eb71a96a71302ce625aefed21147356d0d7ac3d463f58f5d387e1cc377b07ec: Status 404 returned error can't find the container with id 0eb71a96a71302ce625aefed21147356d0d7ac3d463f58f5d387e1cc377b07ec Apr 23 16:35:29.422434 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:29.422414 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffb20d28_4839_4bfe_aa6f_83380eb3d9be.slice/crio-a094395a03fa996dd20a10930939a16616bfecc6a61f911db5158d70f302b575 WatchSource:0}: Error finding container a094395a03fa996dd20a10930939a16616bfecc6a61f911db5158d70f302b575: Status 404 returned error can't find the container with id a094395a03fa996dd20a10930939a16616bfecc6a61f911db5158d70f302b575 Apr 23 16:35:29.423704 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:29.423337 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2efb7ba_37a4_4a47_8ad7_95d2a587efd4.slice/crio-04ed9c88a73e4498476e4129df3122ecb01552880ab3bc0c46aaed443aeb756c WatchSource:0}: 
Error finding container 04ed9c88a73e4498476e4129df3122ecb01552880ab3bc0c46aaed443aeb756c: Status 404 returned error can't find the container with id 04ed9c88a73e4498476e4129df3122ecb01552880ab3bc0c46aaed443aeb756c Apr 23 16:35:29.424537 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:29.424443 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9db0f82d_4208_44c8_a818_ed7fcbd374fa.slice/crio-4b7cb8405f9329e6a8b37c7a41887b2835ab8a7296f8cdfe09c36608c90d1e1d WatchSource:0}: Error finding container 4b7cb8405f9329e6a8b37c7a41887b2835ab8a7296f8cdfe09c36608c90d1e1d: Status 404 returned error can't find the container with id 4b7cb8405f9329e6a8b37c7a41887b2835ab8a7296f8cdfe09c36608c90d1e1d Apr 23 16:35:29.426272 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:35:29.425715 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd9ca4f8_3553_4177_ac35_7fc759b3a137.slice/crio-e9df90d775de39e8bfc77efb35e8187ecf4ec88960d202e9ef516ed69b137ded WatchSource:0}: Error finding container e9df90d775de39e8bfc77efb35e8187ecf4ec88960d202e9ef516ed69b137ded: Status 404 returned error can't find the container with id e9df90d775de39e8bfc77efb35e8187ecf4ec88960d202e9ef516ed69b137ded Apr 23 16:35:29.462190 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.462069 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq84s\" (UniqueName: \"kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s\") pod \"network-check-target-pz92q\" (UID: \"92efbb3d-8bd0-413e-b306-331d80df0505\") " pod="openshift-network-diagnostics/network-check-target-pz92q" Apr 23 16:35:29.462297 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:29.462208 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:29.462297 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:29.462246 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:29.462297 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:29.462258 2563 projected.go:194] Error preparing data for projected volume kube-api-access-jq84s for pod openshift-network-diagnostics/network-check-target-pz92q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:29.462403 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:29.462302 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s podName:92efbb3d-8bd0-413e-b306-331d80df0505 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:30.462287172 +0000 UTC m=+4.194041583 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jq84s" (UniqueName: "kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s") pod "network-check-target-pz92q" (UID: "92efbb3d-8bd0-413e-b306-331d80df0505") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:29.815740 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.814520 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:27 +0000 UTC" deadline="2028-01-29 09:58:45.46011174 +0000 UTC" Apr 23 16:35:29.815740 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.814559 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15497h23m15.645556793s" Apr 23 16:35:29.884773 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.884708 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k" event={"ID":"cd9ca4f8-3553-4177-ac35-7fc759b3a137","Type":"ContainerStarted","Data":"e9df90d775de39e8bfc77efb35e8187ecf4ec88960d202e9ef516ed69b137ded"} Apr 23 16:35:29.888923 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.888865 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4k96t" event={"ID":"9db0f82d-4208-44c8-a818-ed7fcbd374fa","Type":"ContainerStarted","Data":"4b7cb8405f9329e6a8b37c7a41887b2835ab8a7296f8cdfe09c36608c90d1e1d"} Apr 23 16:35:29.891855 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.891800 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qbf55" event={"ID":"51af9790-dfdc-4e37-824c-072fa2141017","Type":"ContainerStarted","Data":"e7bcf66b2f0ee6412c5ecf4f78857f58ae4f3ed1e026bbfbcec82996f19504f4"} Apr 23 16:35:29.895254 
ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.895059 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zghhl" event={"ID":"47257c1b-a9bb-4228-abc5-2ba95fa73db4","Type":"ContainerStarted","Data":"2aab536e10724bf0aeff1e151009a2d52dc85e402570a4ef4a1de7e3a3b13894"}
Apr 23 16:35:29.896876 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.896855 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9896x" event={"ID":"455d942e-c133-4e0e-9c9c-c8f16c4d5e30","Type":"ContainerStarted","Data":"b1687890ad09ed79360aac025006ef7a98aecbd7f3ff4f874d2bea1846a6735c"}
Apr 23 16:35:29.900601 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.900558 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-187.ec2.internal" event={"ID":"9f2167486c68501ab6cd4222066784e7","Type":"ContainerStarted","Data":"2b4fc8542c0f74daa25089d16feb36d82959257ba7ae8216a03be7c14ab663cb"}
Apr 23 16:35:29.906025 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.905969 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" event={"ID":"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4","Type":"ContainerStarted","Data":"04ed9c88a73e4498476e4129df3122ecb01552880ab3bc0c46aaed443aeb756c"}
Apr 23 16:35:29.908035 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.907994 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w2bxv" event={"ID":"ffb20d28-4839-4bfe-aa6f-83380eb3d9be","Type":"ContainerStarted","Data":"a094395a03fa996dd20a10930939a16616bfecc6a61f911db5158d70f302b575"}
Apr 23 16:35:29.925591 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.925559 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-khsql" event={"ID":"765656a2-d2b1-490f-b1db-11ff6b259036","Type":"ContainerStarted","Data":"0eb71a96a71302ce625aefed21147356d0d7ac3d463f58f5d387e1cc377b07ec"}
Apr 23 16:35:29.929295 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:29.929265 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" event={"ID":"2f90e3aa-3501-4d70-8aed-0b0959ac4c07","Type":"ContainerStarted","Data":"5460d0884f88c9d688560c8c965a9bd13e4e6b2c4f42ba9af3af6a6f673deb0e"}
Apr 23 16:35:30.369483 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:30.369455 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs\") pod \"network-metrics-daemon-kpgxm\" (UID: \"c1dab98e-8f79-4056-94f4-9185da61ca34\") " pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:30.369606 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:30.369587 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:30.369664 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:30.369650 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs podName:c1dab98e-8f79-4056-94f4-9185da61ca34 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:32.369630164 +0000 UTC m=+6.101384579 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs") pod "network-metrics-daemon-kpgxm" (UID: "c1dab98e-8f79-4056-94f4-9185da61ca34") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:30.470247 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:30.470200 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq84s\" (UniqueName: \"kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s\") pod \"network-check-target-pz92q\" (UID: \"92efbb3d-8bd0-413e-b306-331d80df0505\") " pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:30.470405 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:30.470367 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:30.470405 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:30.470392 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:30.470405 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:30.470405 2563 projected.go:194] Error preparing data for projected volume kube-api-access-jq84s for pod openshift-network-diagnostics/network-check-target-pz92q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:30.470564 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:30.470460 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s podName:92efbb3d-8bd0-413e-b306-331d80df0505 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:32.470440599 +0000 UTC m=+6.202195027 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jq84s" (UniqueName: "kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s") pod "network-check-target-pz92q" (UID: "92efbb3d-8bd0-413e-b306-331d80df0505") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:30.877930 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:30.877900 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:30.878384 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:30.878086 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505"
Apr 23 16:35:30.878545 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:30.878523 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:30.878664 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:30.878643 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34"
Apr 23 16:35:30.934576 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:30.934529 2563 generic.go:358] "Generic (PLEG): container finished" podID="7b69db5b49b12174f2220cf83978dbb2" containerID="1b11e8e1eb3fc1e6adedf00c00f137f86e48b6f1e16e7ee4abdf95e3f7cadad9" exitCode=0
Apr 23 16:35:30.935389 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:30.935092 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal" event={"ID":"7b69db5b49b12174f2220cf83978dbb2","Type":"ContainerDied","Data":"1b11e8e1eb3fc1e6adedf00c00f137f86e48b6f1e16e7ee4abdf95e3f7cadad9"}
Apr 23 16:35:30.952369 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:30.952135 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-187.ec2.internal" podStartSLOduration=2.952120234 podStartE2EDuration="2.952120234s" podCreationTimestamp="2026-04-23 16:35:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:29.925460215 +0000 UTC m=+3.657214650" watchObservedRunningTime="2026-04-23 16:35:30.952120234 +0000 UTC m=+4.683874680"
Apr 23 16:35:31.940804 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:31.940767 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal" event={"ID":"7b69db5b49b12174f2220cf83978dbb2","Type":"ContainerStarted","Data":"d2a1bb023fdfcb2bd89db38bc57f82be8abc01bb87c953b77b28010449a01831"}
Apr 23 16:35:32.384511 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:32.384459 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs\") pod \"network-metrics-daemon-kpgxm\" (UID: \"c1dab98e-8f79-4056-94f4-9185da61ca34\") " pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:32.384687 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:32.384607 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:32.384687 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:32.384670 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs podName:c1dab98e-8f79-4056-94f4-9185da61ca34 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:36.384651659 +0000 UTC m=+10.116406077 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs") pod "network-metrics-daemon-kpgxm" (UID: "c1dab98e-8f79-4056-94f4-9185da61ca34") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:32.485818 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:32.485170 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq84s\" (UniqueName: \"kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s\") pod \"network-check-target-pz92q\" (UID: \"92efbb3d-8bd0-413e-b306-331d80df0505\") " pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:32.485818 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:32.485380 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:32.485818 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:32.485405 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:32.485818 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:32.485420 2563 projected.go:194] Error preparing data for projected volume kube-api-access-jq84s for pod openshift-network-diagnostics/network-check-target-pz92q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:32.485818 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:32.485482 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s podName:92efbb3d-8bd0-413e-b306-331d80df0505 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:36.485460869 +0000 UTC m=+10.217215287 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jq84s" (UniqueName: "kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s") pod "network-check-target-pz92q" (UID: "92efbb3d-8bd0-413e-b306-331d80df0505") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:32.877031 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:32.876382 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:32.877031 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:32.876407 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:32.877031 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:32.876532 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34"
Apr 23 16:35:32.877031 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:32.876978 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505"
Apr 23 16:35:34.875969 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:34.875932 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:34.876424 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:34.876053 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505"
Apr 23 16:35:34.876553 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:34.876527 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:34.876680 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:34.876624 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34"
Apr 23 16:35:36.416141 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:36.416103 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs\") pod \"network-metrics-daemon-kpgxm\" (UID: \"c1dab98e-8f79-4056-94f4-9185da61ca34\") " pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:36.416658 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:36.416279 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:36.416658 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:36.416344 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs podName:c1dab98e-8f79-4056-94f4-9185da61ca34 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:44.416325027 +0000 UTC m=+18.148079451 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs") pod "network-metrics-daemon-kpgxm" (UID: "c1dab98e-8f79-4056-94f4-9185da61ca34") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:36.516807 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:36.516776 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq84s\" (UniqueName: \"kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s\") pod \"network-check-target-pz92q\" (UID: \"92efbb3d-8bd0-413e-b306-331d80df0505\") " pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:36.516968 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:36.516946 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:36.516968 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:36.516965 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:36.517080 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:36.516978 2563 projected.go:194] Error preparing data for projected volume kube-api-access-jq84s for pod openshift-network-diagnostics/network-check-target-pz92q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:36.517080 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:36.517033 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s podName:92efbb3d-8bd0-413e-b306-331d80df0505 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:44.517015478 +0000 UTC m=+18.248769894 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jq84s" (UniqueName: "kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s") pod "network-check-target-pz92q" (UID: "92efbb3d-8bd0-413e-b306-331d80df0505") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:36.876866 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:36.876261 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:36.876866 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:36.876371 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505"
Apr 23 16:35:36.876866 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:36.876721 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:36.876866 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:36.876824 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34"
Apr 23 16:35:38.875522 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:38.875484 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:38.875970 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:38.875490 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:38.875970 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:38.875629 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34"
Apr 23 16:35:38.875970 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:38.875685 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505"
Apr 23 16:35:40.875140 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:40.875051 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:40.875554 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:40.875182 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505"
Apr 23 16:35:40.875554 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:40.875297 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:40.875554 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:40.875413 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34"
Apr 23 16:35:42.875507 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:42.875474 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:42.875507 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:42.875493 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:42.875996 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:42.875588 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505"
Apr 23 16:35:42.875996 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:42.875742 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34"
Apr 23 16:35:44.477316 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:44.477275 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs\") pod \"network-metrics-daemon-kpgxm\" (UID: \"c1dab98e-8f79-4056-94f4-9185da61ca34\") " pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:44.477765 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:44.477410 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:44.477765 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:44.477474 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs podName:c1dab98e-8f79-4056-94f4-9185da61ca34 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:00.477455561 +0000 UTC m=+34.209209987 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs") pod "network-metrics-daemon-kpgxm" (UID: "c1dab98e-8f79-4056-94f4-9185da61ca34") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:44.578540 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:44.578512 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq84s\" (UniqueName: \"kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s\") pod \"network-check-target-pz92q\" (UID: \"92efbb3d-8bd0-413e-b306-331d80df0505\") " pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:44.578702 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:44.578637 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:44.578702 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:44.578654 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:44.578702 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:44.578664 2563 projected.go:194] Error preparing data for projected volume kube-api-access-jq84s for pod openshift-network-diagnostics/network-check-target-pz92q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:44.578828 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:44.578713 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s podName:92efbb3d-8bd0-413e-b306-331d80df0505 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:00.57869585 +0000 UTC m=+34.310450275 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jq84s" (UniqueName: "kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s") pod "network-check-target-pz92q" (UID: "92efbb3d-8bd0-413e-b306-331d80df0505") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:44.875631 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:44.875548 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:44.875773 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:44.875678 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505"
Apr 23 16:35:44.875773 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:44.875738 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:44.875906 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:44.875871 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34"
Apr 23 16:35:46.876667 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.876473 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:35:46.877088 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.876558 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:35:46.877088 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:46.876748 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505"
Apr 23 16:35:46.877088 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:46.876822 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34"
Apr 23 16:35:46.965466 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.965437 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" event={"ID":"b2efb7ba-37a4-4a47-8ad7-95d2a587efd4","Type":"ContainerStarted","Data":"0e961a3e026161698a344fccf5d18535c65e03e840f591a61942c7cf93b2cecc"}
Apr 23 16:35:46.966895 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.966867 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w2bxv" event={"ID":"ffb20d28-4839-4bfe-aa6f-83380eb3d9be","Type":"ContainerStarted","Data":"c0c19aedef1fbc486badd9c56af1067c17d0cdffe39d2fa4ba87be4a05c9299c"}
Apr 23 16:35:46.968348 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.968320 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-khsql" event={"ID":"765656a2-d2b1-490f-b1db-11ff6b259036","Type":"ContainerStarted","Data":"c3610203ab7b6d08743712dd24d2be3ba801866a495ea49976b817bf905debdd"}
Apr 23 16:35:46.973563 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.973531 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log"
Apr 23 16:35:46.973853 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.973826 2563 generic.go:358] "Generic (PLEG): container finished" podID="2f90e3aa-3501-4d70-8aed-0b0959ac4c07" containerID="b8b6a0932975abc4d15b7573cef8b875577ea9256a04162015aae64e969cebc0" exitCode=1
Apr 23 16:35:46.973932 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.973894 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" event={"ID":"2f90e3aa-3501-4d70-8aed-0b0959ac4c07","Type":"ContainerStarted","Data":"2a7ba3945f6c18962c34b9d112084c27b247ddf020be444a185e27822e98ca35"}
Apr 23 16:35:46.973932 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.973924 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" event={"ID":"2f90e3aa-3501-4d70-8aed-0b0959ac4c07","Type":"ContainerDied","Data":"b8b6a0932975abc4d15b7573cef8b875577ea9256a04162015aae64e969cebc0"}
Apr 23 16:35:46.974071 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.973941 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" event={"ID":"2f90e3aa-3501-4d70-8aed-0b0959ac4c07","Type":"ContainerStarted","Data":"9a8550308f8d29e2c0d15db4fd0358fcc3cc0c8adc43cb79c6385a1c30d6d722"}
Apr 23 16:35:46.975370 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.975350 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k" event={"ID":"cd9ca4f8-3553-4177-ac35-7fc759b3a137","Type":"ContainerStarted","Data":"da073c0ef8ff6c82631602dcafd71e093b788b78e1c62715ff6f9f8af1afebed"}
Apr 23 16:35:46.976727 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.976704 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4k96t" event={"ID":"9db0f82d-4208-44c8-a818-ed7fcbd374fa","Type":"ContainerStarted","Data":"7ade0e8e35c475c0db33ab7458c1ea7c31158916fe9c8a822c73250577a06af2"}
Apr 23 16:35:46.978286 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.978258 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zghhl" event={"ID":"47257c1b-a9bb-4228-abc5-2ba95fa73db4","Type":"ContainerStarted","Data":"6c07ced8fe14630faaea40d08e87003842e4850da56da29351662a7776f14298"}
Apr 23 16:35:46.979763 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.979743 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9896x" event={"ID":"455d942e-c133-4e0e-9c9c-c8f16c4d5e30","Type":"ContainerStarted","Data":"7949634ca42c0751ebd3552d9aa4587aa3237dafd92edd3fea88e43b04943bc0"}
Apr 23 16:35:46.989728 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.989691 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-187.ec2.internal" podStartSLOduration=18.989666366 podStartE2EDuration="18.989666366s" podCreationTimestamp="2026-04-23 16:35:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:31.973183551 +0000 UTC m=+5.704937986" watchObservedRunningTime="2026-04-23 16:35:46.989666366 +0000 UTC m=+20.721420790"
Apr 23 16:35:46.990066 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:46.990041 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rwc2z" podStartSLOduration=2.943512774 podStartE2EDuration="19.990035412s" podCreationTimestamp="2026-04-23 16:35:27 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.425534814 +0000 UTC m=+3.157289224" lastFinishedPulling="2026-04-23 16:35:46.472057436 +0000 UTC m=+20.203811862" observedRunningTime="2026-04-23 16:35:46.988611156 +0000 UTC m=+20.720365589" watchObservedRunningTime="2026-04-23 16:35:46.990035412 +0000 UTC m=+20.721789884"
Apr 23 16:35:47.042950 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:47.042911 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zghhl" podStartSLOduration=2.966281871 podStartE2EDuration="20.04289741s" podCreationTimestamp="2026-04-23 16:35:27 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.419625843 +0000 UTC m=+3.151380257" lastFinishedPulling="2026-04-23 16:35:46.496241374 +0000 UTC m=+20.227995796" observedRunningTime="2026-04-23 16:35:47.042678432 +0000 UTC m=+20.774432859" watchObservedRunningTime="2026-04-23 16:35:47.04289741 +0000 UTC m=+20.774651843"
Apr 23 16:35:47.043048 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:47.043031 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-khsql" podStartSLOduration=4.014337932 podStartE2EDuration="21.043026978s" podCreationTimestamp="2026-04-23 16:35:26 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.423251367 +0000 UTC m=+3.155005777" lastFinishedPulling="2026-04-23 16:35:46.451940405 +0000 UTC m=+20.183694823" observedRunningTime="2026-04-23 16:35:47.008292567 +0000 UTC m=+20.740047001" watchObservedRunningTime="2026-04-23 16:35:47.043026978 +0000 UTC m=+20.774781411"
Apr 23 16:35:47.067735 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:47.067682 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w2bxv" podStartSLOduration=3.040359329 podStartE2EDuration="20.067662907s" podCreationTimestamp="2026-04-23 16:35:27 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.424210624 +0000 UTC m=+3.155965049" lastFinishedPulling="2026-04-23 16:35:46.451514199 +0000 UTC m=+20.183268627" observedRunningTime="2026-04-23 16:35:47.067342122 +0000 UTC m=+20.799096554" watchObservedRunningTime="2026-04-23 16:35:47.067662907 +0000 UTC m=+20.799417342"
Apr 23 16:35:47.091076 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:47.090878 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9896x" podStartSLOduration=11.178181338 podStartE2EDuration="20.090858781s" podCreationTimestamp="2026-04-23 16:35:27 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.419150628 +0000 UTC m=+3.150905042" lastFinishedPulling="2026-04-23 16:35:38.331828074 +0000 UTC m=+12.063582485" observedRunningTime="2026-04-23 16:35:47.090523905 +0000 UTC m=+20.822278339" watchObservedRunningTime="2026-04-23 16:35:47.090858781 +0000 UTC m=+20.822613243"
Apr 23
16:35:47.982726 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:47.982569 2563 generic.go:358] "Generic (PLEG): container finished" podID="9db0f82d-4208-44c8-a818-ed7fcbd374fa" containerID="7ade0e8e35c475c0db33ab7458c1ea7c31158916fe9c8a822c73250577a06af2" exitCode=0 Apr 23 16:35:47.982726 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:47.982660 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4k96t" event={"ID":"9db0f82d-4208-44c8-a818-ed7fcbd374fa","Type":"ContainerDied","Data":"7ade0e8e35c475c0db33ab7458c1ea7c31158916fe9c8a822c73250577a06af2"} Apr 23 16:35:47.984908 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:47.984879 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qbf55" event={"ID":"51af9790-dfdc-4e37-824c-072fa2141017","Type":"ContainerStarted","Data":"9780b7afe16f178e20e2fc42e9094506892d8956163adb505f10a3b8b0304aa0"} Apr 23 16:35:47.987911 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:47.987889 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 16:35:47.988342 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:47.988263 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" event={"ID":"2f90e3aa-3501-4d70-8aed-0b0959ac4c07","Type":"ContainerStarted","Data":"d5e4cb514ed60e31199efdcc6a4dff7ca317b1a179dc8dff3084929e21e8870f"} Apr 23 16:35:47.988342 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:47.988294 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" event={"ID":"2f90e3aa-3501-4d70-8aed-0b0959ac4c07","Type":"ContainerStarted","Data":"5409f7bfc1aac264a7f4379612ab2dd4d16db4b7150556c27a282f357597ac21"} Apr 23 16:35:47.988342 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:47.988307 2563 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" event={"ID":"2f90e3aa-3501-4d70-8aed-0b0959ac4c07","Type":"ContainerStarted","Data":"3a1742d9145e28d527eaaf6bfeb2658584b3775046a929a4165aef2d5b9e413f"} Apr 23 16:35:48.031882 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:48.031837 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qbf55" podStartSLOduration=3.983992831 podStartE2EDuration="21.03182359s" podCreationTimestamp="2026-04-23 16:35:27 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.422379703 +0000 UTC m=+3.154134116" lastFinishedPulling="2026-04-23 16:35:46.470210461 +0000 UTC m=+20.201964875" observedRunningTime="2026-04-23 16:35:48.031339015 +0000 UTC m=+21.763093447" watchObservedRunningTime="2026-04-23 16:35:48.03182359 +0000 UTC m=+21.763578023" Apr 23 16:35:48.087349 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:48.087324 2563 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 16:35:48.820837 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:48.820749 2563 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T16:35:48.087343306Z","UUID":"42ca3374-c848-4611-9e94-a22681c9996f","Handler":null,"Name":"","Endpoint":""} Apr 23 16:35:48.824691 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:48.824668 2563 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 16:35:48.824810 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:48.824701 2563 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 16:35:48.875260 
ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:48.875217 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q" Apr 23 16:35:48.875399 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:48.875341 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505" Apr 23 16:35:48.875610 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:48.875588 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm" Apr 23 16:35:48.875731 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:48.875709 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34" Apr 23 16:35:48.991606 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:48.991578 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k" event={"ID":"cd9ca4f8-3553-4177-ac35-7fc759b3a137","Type":"ContainerStarted","Data":"c17309e4bb797855a34ff829cc0172a69820b8c89ee189497190cce354326dad"} Apr 23 16:35:49.996432 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:49.996221 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 16:35:49.996848 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:49.996822 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" event={"ID":"2f90e3aa-3501-4d70-8aed-0b0959ac4c07","Type":"ContainerStarted","Data":"3b82936cf1f46af68376d7c9f80ed98621fa7729450b474898611bd200e4bad0"} Apr 23 16:35:49.998759 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:49.998733 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k" event={"ID":"cd9ca4f8-3553-4177-ac35-7fc759b3a137","Type":"ContainerStarted","Data":"95d06b49bdceff105db11025e7b74fb2e7029e9ada58cbe1e463849fb3a80c72"} Apr 23 16:35:50.020551 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:50.020513 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w7f7k" podStartSLOduration=4.15538938 podStartE2EDuration="24.020483884s" podCreationTimestamp="2026-04-23 16:35:26 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.427518356 +0000 UTC m=+3.159272767" lastFinishedPulling="2026-04-23 16:35:49.292612853 +0000 UTC m=+23.024367271" observedRunningTime="2026-04-23 16:35:50.020418852 +0000 UTC m=+23.752173285" 
watchObservedRunningTime="2026-04-23 16:35:50.020483884 +0000 UTC m=+23.752238322" Apr 23 16:35:50.881553 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:50.881526 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm" Apr 23 16:35:50.881767 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:50.881526 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q" Apr 23 16:35:50.881767 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:50.881636 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34" Apr 23 16:35:50.881767 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:50.881694 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505" Apr 23 16:35:51.570981 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:51.570945 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-khsql" Apr 23 16:35:51.571623 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:51.571597 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-khsql" Apr 23 16:35:52.005453 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:52.005330 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 16:35:52.005855 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:52.005827 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" event={"ID":"2f90e3aa-3501-4d70-8aed-0b0959ac4c07","Type":"ContainerStarted","Data":"64ff838f240d73d87d8ea47003a894f7b4db01446ff0a5ecfb122fece90535ca"} Apr 23 16:35:52.006278 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:52.006256 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:52.006370 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:52.006284 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:52.006449 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:52.006434 2563 scope.go:117] "RemoveContainer" containerID="b8b6a0932975abc4d15b7573cef8b875577ea9256a04162015aae64e969cebc0" Apr 23 16:35:52.022279 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:52.022264 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:52.432184 ip-10-0-134-187 kubenswrapper[2563]: I0423 
16:35:52.432119 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-khsql" Apr 23 16:35:52.432703 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:52.432684 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-khsql" Apr 23 16:35:52.874963 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:52.874935 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm" Apr 23 16:35:52.875737 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:52.874935 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q" Apr 23 16:35:52.875737 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:52.875079 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34" Apr 23 16:35:52.875737 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:52.875113 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505" Apr 23 16:35:53.009394 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:53.009360 2563 generic.go:358] "Generic (PLEG): container finished" podID="9db0f82d-4208-44c8-a818-ed7fcbd374fa" containerID="5e2a6bb4a182c2c5a5664648f426d4a703a9c7a657f215fde1e1562bd0d4ce52" exitCode=0 Apr 23 16:35:53.009500 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:53.009451 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4k96t" event={"ID":"9db0f82d-4208-44c8-a818-ed7fcbd374fa","Type":"ContainerDied","Data":"5e2a6bb4a182c2c5a5664648f426d4a703a9c7a657f215fde1e1562bd0d4ce52"} Apr 23 16:35:53.012570 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:53.012555 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 16:35:53.012871 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:53.012854 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" event={"ID":"2f90e3aa-3501-4d70-8aed-0b0959ac4c07","Type":"ContainerStarted","Data":"e293b2a9638b089048b924f54c2f49615753b4853d9de0464e7bb34e050e16eb"} Apr 23 16:35:53.013128 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:53.013111 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:53.027568 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:53.027551 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" Apr 23 16:35:54.875591 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:54.875563 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q" Apr 23 16:35:54.875928 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:54.875598 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm" Apr 23 16:35:54.875928 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:54.875657 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505" Apr 23 16:35:54.875928 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:54.875766 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34" Apr 23 16:35:55.017441 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:55.017414 2563 generic.go:358] "Generic (PLEG): container finished" podID="9db0f82d-4208-44c8-a818-ed7fcbd374fa" containerID="af0c3cf7c3b028b5ce097f5ce91f635170466fb9c43f244c24414825bec2d23d" exitCode=0 Apr 23 16:35:55.017580 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:55.017501 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4k96t" event={"ID":"9db0f82d-4208-44c8-a818-ed7fcbd374fa","Type":"ContainerDied","Data":"af0c3cf7c3b028b5ce097f5ce91f635170466fb9c43f244c24414825bec2d23d"} Apr 23 16:35:55.045729 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:55.043671 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq" podStartSLOduration=10.917056307 podStartE2EDuration="28.043655315s" podCreationTimestamp="2026-04-23 16:35:27 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.415117583 +0000 UTC m=+3.146871994" lastFinishedPulling="2026-04-23 16:35:46.541716576 +0000 UTC m=+20.273471002" observedRunningTime="2026-04-23 16:35:53.080728112 +0000 UTC m=+26.812482545" watchObservedRunningTime="2026-04-23 16:35:55.043655315 +0000 UTC m=+28.775409748" Apr 23 16:35:56.875999 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:56.875973 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q" Apr 23 16:35:56.876464 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:56.876074 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505" Apr 23 16:35:56.876464 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:56.876147 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm" Apr 23 16:35:56.876464 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:56.876256 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34" Apr 23 16:35:57.022608 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:57.022572 2563 generic.go:358] "Generic (PLEG): container finished" podID="9db0f82d-4208-44c8-a818-ed7fcbd374fa" containerID="e2c6aa714749d7bc5c554fecaaa4c710a5fcead2707f8ef33561f4ece789379b" exitCode=0 Apr 23 16:35:57.022745 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:57.022640 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4k96t" event={"ID":"9db0f82d-4208-44c8-a818-ed7fcbd374fa","Type":"ContainerDied","Data":"e2c6aa714749d7bc5c554fecaaa4c710a5fcead2707f8ef33561f4ece789379b"} Apr 23 16:35:58.875170 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:58.875135 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q" Apr 23 16:35:58.875637 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:35:58.875172 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm" Apr 23 16:35:58.875637 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:58.875276 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505" Apr 23 16:35:58.875637 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:35:58.875390 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34" Apr 23 16:36:00.502161 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:00.501965 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs\") pod \"network-metrics-daemon-kpgxm\" (UID: \"c1dab98e-8f79-4056-94f4-9185da61ca34\") " pod="openshift-multus/network-metrics-daemon-kpgxm" Apr 23 16:36:00.502749 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:00.502084 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:36:00.502749 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:00.502257 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs podName:c1dab98e-8f79-4056-94f4-9185da61ca34 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:36:32.502241261 +0000 UTC m=+66.233995685 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs") pod "network-metrics-daemon-kpgxm" (UID: "c1dab98e-8f79-4056-94f4-9185da61ca34") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:36:00.602826 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:00.602798 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq84s\" (UniqueName: \"kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s\") pod \"network-check-target-pz92q\" (UID: \"92efbb3d-8bd0-413e-b306-331d80df0505\") " pod="openshift-network-diagnostics/network-check-target-pz92q" Apr 23 16:36:00.602992 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:00.602933 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:36:00.602992 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:00.602954 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:36:00.602992 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:00.602966 2563 projected.go:194] Error preparing data for projected volume kube-api-access-jq84s for pod openshift-network-diagnostics/network-check-target-pz92q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:36:00.603142 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:00.603019 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s 
podName:92efbb3d-8bd0-413e-b306-331d80df0505 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:32.603005544 +0000 UTC m=+66.334759963 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-jq84s" (UniqueName: "kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s") pod "network-check-target-pz92q" (UID: "92efbb3d-8bd0-413e-b306-331d80df0505") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:36:00.875794 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:00.875706 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q" Apr 23 16:36:00.875794 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:00.875747 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm" Apr 23 16:36:00.876024 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:00.875838 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505" Apr 23 16:36:00.876024 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:00.875947 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34"
Apr 23 16:36:02.875564 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:02.875541 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:36:02.875564 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:02.875541 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:36:02.875956 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:02.875683 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34"
Apr 23 16:36:02.875956 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:02.875786 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505"
Apr 23 16:36:03.037102 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:03.037071 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4k96t" event={"ID":"9db0f82d-4208-44c8-a818-ed7fcbd374fa","Type":"ContainerStarted","Data":"eb06e2146f4f7b04c891c6cfc9aec121694eca4baddab8735d3cb01773896875"}
Apr 23 16:36:04.040860 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:04.040825 2563 generic.go:358] "Generic (PLEG): container finished" podID="9db0f82d-4208-44c8-a818-ed7fcbd374fa" containerID="eb06e2146f4f7b04c891c6cfc9aec121694eca4baddab8735d3cb01773896875" exitCode=0
Apr 23 16:36:04.040860 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:04.040864 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4k96t" event={"ID":"9db0f82d-4208-44c8-a818-ed7fcbd374fa","Type":"ContainerDied","Data":"eb06e2146f4f7b04c891c6cfc9aec121694eca4baddab8735d3cb01773896875"}
Apr 23 16:36:04.864146 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:04.864095 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pz92q"]
Apr 23 16:36:04.864623 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:04.864305 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:36:04.864623 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:04.864440 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505"
Apr 23 16:36:04.864802 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:04.864784 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kpgxm"]
Apr 23 16:36:04.864931 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:04.864915 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:36:04.865023 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:04.865009 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34"
Apr 23 16:36:05.045436 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:05.045410 2563 generic.go:358] "Generic (PLEG): container finished" podID="9db0f82d-4208-44c8-a818-ed7fcbd374fa" containerID="0471bce1a137c8bb5f9b6952cdac17b9a9ea173c8a2f9f8970016609db24f530" exitCode=0
Apr 23 16:36:05.046026 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:05.045450 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4k96t" event={"ID":"9db0f82d-4208-44c8-a818-ed7fcbd374fa","Type":"ContainerDied","Data":"0471bce1a137c8bb5f9b6952cdac17b9a9ea173c8a2f9f8970016609db24f530"}
Apr 23 16:36:06.050359 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:06.050143 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4k96t" event={"ID":"9db0f82d-4208-44c8-a818-ed7fcbd374fa","Type":"ContainerStarted","Data":"06495b0c96dcd00cb64c767619506e483acf72712fde64eceb325856c3c0c690"}
Apr 23 16:36:06.076313 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:06.076270 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4k96t" podStartSLOduration=5.698330795 podStartE2EDuration="39.076255678s" podCreationTimestamp="2026-04-23 16:35:27 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.425866505 +0000 UTC m=+3.157620917" lastFinishedPulling="2026-04-23 16:36:02.803791367 +0000 UTC m=+36.535545800" observedRunningTime="2026-04-23 16:36:06.075737001 +0000 UTC m=+39.807491433" watchObservedRunningTime="2026-04-23 16:36:06.076255678 +0000 UTC m=+39.808010111"
Apr 23 16:36:06.875811 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:06.875785 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:36:06.875966 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:06.875884 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pz92q" podUID="92efbb3d-8bd0-413e-b306-331d80df0505"
Apr 23 16:36:06.875966 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:06.875945 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:36:06.876058 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:06.876039 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpgxm" podUID="c1dab98e-8f79-4056-94f4-9185da61ca34"
Apr 23 16:36:08.617322 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.617248 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-187.ec2.internal" event="NodeReady"
Apr 23 16:36:08.617740 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.617358 2563 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 16:36:08.652296 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.652266 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66b6dbc54f-dlrcz"]
Apr 23 16:36:08.655766 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.655751 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.660138 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.660115 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 16:36:08.660138 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.660118 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 16:36:08.660311 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.660204 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-d4d8w\""
Apr 23 16:36:08.661698 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.661678 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 16:36:08.668114 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.668098 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 16:36:08.676039 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.676017 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-f2c8s"]
Apr 23 16:36:08.679859 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.679844 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66b6dbc54f-dlrcz"]
Apr 23 16:36:08.679960 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.679863 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-shfp9"]
Apr 23 16:36:08.680022 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.680006 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.682790 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.682768 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 16:36:08.682892 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.682772 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 16:36:08.683057 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.683042 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 16:36:08.683127 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.683055 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 16:36:08.683127 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.683116 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-6plmq\""
Apr 23 16:36:08.683638 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.683624 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-shfp9"
Apr 23 16:36:08.687100 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.687075 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 16:36:08.687100 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.687090 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-szgdw\""
Apr 23 16:36:08.687260 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.687119 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 16:36:08.691116 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.691096 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-f2c8s"]
Apr 23 16:36:08.696949 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.696889 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-shfp9"]
Apr 23 16:36:08.763241 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.763205 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-ca-trust-extracted\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.763327 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.763246 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a2d74b77-3933-4c89-8d3d-f2c3486cfb85-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f2c8s\" (UID: \"a2d74b77-3933-4c89-8d3d-f2c3486cfb85\") " pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.763327 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.763267 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-registry-tls\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.763411 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.763331 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7458\" (UniqueName: \"kubernetes.io/projected/a2d74b77-3933-4c89-8d3d-f2c3486cfb85-kube-api-access-z7458\") pod \"insights-runtime-extractor-f2c8s\" (UID: \"a2d74b77-3933-4c89-8d3d-f2c3486cfb85\") " pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.763411 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.763377 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-registry-certificates\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.763411 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.763398 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a2d74b77-3933-4c89-8d3d-f2c3486cfb85-data-volume\") pod \"insights-runtime-extractor-f2c8s\" (UID: \"a2d74b77-3933-4c89-8d3d-f2c3486cfb85\") " pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.763510 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.763418 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-trusted-ca\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.763510 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.763438 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a2d74b77-3933-4c89-8d3d-f2c3486cfb85-crio-socket\") pod \"insights-runtime-extractor-f2c8s\" (UID: \"a2d74b77-3933-4c89-8d3d-f2c3486cfb85\") " pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.763510 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.763463 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a2d74b77-3933-4c89-8d3d-f2c3486cfb85-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f2c8s\" (UID: \"a2d74b77-3933-4c89-8d3d-f2c3486cfb85\") " pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.763510 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.763487 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-bound-sa-token\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.763644 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.763509 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9gnv\" (UniqueName: \"kubernetes.io/projected/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-kube-api-access-d9gnv\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.763644 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.763530 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-image-registry-private-configuration\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.763644 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.763564 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-installation-pull-secrets\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.769820 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.769800 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ckfp9"]
Apr 23 16:36:08.772801 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.772789 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ckfp9"
Apr 23 16:36:08.778618 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.778596 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 16:36:08.778704 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.778672 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 16:36:08.778704 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.778694 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w6ksl\""
Apr 23 16:36:08.778793 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.778692 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 16:36:08.785717 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.785700 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ckfp9"]
Apr 23 16:36:08.864488 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.864470 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-bound-sa-token\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.864582 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.864496 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9gnv\" (UniqueName: \"kubernetes.io/projected/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-kube-api-access-d9gnv\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.864582 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.864522 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-image-registry-private-configuration\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.864582 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.864544 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-installation-pull-secrets\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.864746 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.864591 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c16a06-c713-441e-ba98-548b432943dd-config-volume\") pod \"dns-default-shfp9\" (UID: \"75c16a06-c713-441e-ba98-548b432943dd\") " pod="openshift-dns/dns-default-shfp9"
Apr 23 16:36:08.864746 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.864632 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5whq\" (UniqueName: \"kubernetes.io/projected/56ca6981-847e-4fce-bb14-e0fa6f8fb697-kube-api-access-t5whq\") pod \"ingress-canary-ckfp9\" (UID: \"56ca6981-847e-4fce-bb14-e0fa6f8fb697\") " pod="openshift-ingress-canary/ingress-canary-ckfp9"
Apr 23 16:36:08.864746 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.864662 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms8sl\" (UniqueName: \"kubernetes.io/projected/75c16a06-c713-441e-ba98-548b432943dd-kube-api-access-ms8sl\") pod \"dns-default-shfp9\" (UID: \"75c16a06-c713-441e-ba98-548b432943dd\") " pod="openshift-dns/dns-default-shfp9"
Apr 23 16:36:08.864841 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.864804 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-ca-trust-extracted\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.864875 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.864836 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/75c16a06-c713-441e-ba98-548b432943dd-tmp-dir\") pod \"dns-default-shfp9\" (UID: \"75c16a06-c713-441e-ba98-548b432943dd\") " pod="openshift-dns/dns-default-shfp9"
Apr 23 16:36:08.864875 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.864867 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75c16a06-c713-441e-ba98-548b432943dd-metrics-tls\") pod \"dns-default-shfp9\" (UID: \"75c16a06-c713-441e-ba98-548b432943dd\") " pod="openshift-dns/dns-default-shfp9"
Apr 23 16:36:08.864954 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.864910 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a2d74b77-3933-4c89-8d3d-f2c3486cfb85-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f2c8s\" (UID: \"a2d74b77-3933-4c89-8d3d-f2c3486cfb85\") " pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.865005 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.864981 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-registry-tls\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.865059 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.865025 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7458\" (UniqueName: \"kubernetes.io/projected/a2d74b77-3933-4c89-8d3d-f2c3486cfb85-kube-api-access-z7458\") pod \"insights-runtime-extractor-f2c8s\" (UID: \"a2d74b77-3933-4c89-8d3d-f2c3486cfb85\") " pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.865107 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.865061 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-registry-certificates\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.865107 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.865087 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a2d74b77-3933-4c89-8d3d-f2c3486cfb85-data-volume\") pod \"insights-runtime-extractor-f2c8s\" (UID: \"a2d74b77-3933-4c89-8d3d-f2c3486cfb85\") " pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.865201 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.865119 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56ca6981-847e-4fce-bb14-e0fa6f8fb697-cert\") pod \"ingress-canary-ckfp9\" (UID: \"56ca6981-847e-4fce-bb14-e0fa6f8fb697\") " pod="openshift-ingress-canary/ingress-canary-ckfp9"
Apr 23 16:36:08.865201 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.865151 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-trusted-ca\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.865201 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.865157 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-ca-trust-extracted\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.865201 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.865181 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a2d74b77-3933-4c89-8d3d-f2c3486cfb85-crio-socket\") pod \"insights-runtime-extractor-f2c8s\" (UID: \"a2d74b77-3933-4c89-8d3d-f2c3486cfb85\") " pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.865460 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.865212 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a2d74b77-3933-4c89-8d3d-f2c3486cfb85-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f2c8s\" (UID: \"a2d74b77-3933-4c89-8d3d-f2c3486cfb85\") " pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.865460 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.865450 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a2d74b77-3933-4c89-8d3d-f2c3486cfb85-data-volume\") pod \"insights-runtime-extractor-f2c8s\" (UID: \"a2d74b77-3933-4c89-8d3d-f2c3486cfb85\") " pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.865563 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.865537 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a2d74b77-3933-4c89-8d3d-f2c3486cfb85-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f2c8s\" (UID: \"a2d74b77-3933-4c89-8d3d-f2c3486cfb85\") " pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.865981 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.865945 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a2d74b77-3933-4c89-8d3d-f2c3486cfb85-crio-socket\") pod \"insights-runtime-extractor-f2c8s\" (UID: \"a2d74b77-3933-4c89-8d3d-f2c3486cfb85\") " pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.869828 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.869770 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-registry-tls\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.869924 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.869855 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-image-registry-private-configuration\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.870279 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.870257 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-registry-certificates\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.870736 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.870696 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-trusted-ca\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.871089 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.871069 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-installation-pull-secrets\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.871798 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.871779 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a2d74b77-3933-4c89-8d3d-f2c3486cfb85-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f2c8s\" (UID: \"a2d74b77-3933-4c89-8d3d-f2c3486cfb85\") " pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.875056 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.875039 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7458\" (UniqueName: \"kubernetes.io/projected/a2d74b77-3933-4c89-8d3d-f2c3486cfb85-kube-api-access-z7458\") pod \"insights-runtime-extractor-f2c8s\" (UID: \"a2d74b77-3933-4c89-8d3d-f2c3486cfb85\") " pod="openshift-insights/insights-runtime-extractor-f2c8s"
Apr 23 16:36:08.875201 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.875182 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm"
Apr 23 16:36:08.875358 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.875340 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:36:08.876023 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.875991 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-bound-sa-token\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.878086 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.878065 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9gnv\" (UniqueName: \"kubernetes.io/projected/f2764ea6-6412-43c9-9d81-1d51c9d17fe1-kube-api-access-d9gnv\") pod \"image-registry-66b6dbc54f-dlrcz\" (UID: \"f2764ea6-6412-43c9-9d81-1d51c9d17fe1\") " pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.879120 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.879102 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 16:36:08.879327 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.879262 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 16:36:08.879327 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.879320 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 16:36:08.879472 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.879325 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5zkmp\""
Apr 23 16:36:08.879472 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.879392 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hhjgv\""
Apr 23 16:36:08.964805 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.964785 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:08.966376 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.966359 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c16a06-c713-441e-ba98-548b432943dd-config-volume\") pod \"dns-default-shfp9\" (UID: \"75c16a06-c713-441e-ba98-548b432943dd\") " pod="openshift-dns/dns-default-shfp9"
Apr 23 16:36:08.966479 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.966392 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5whq\" (UniqueName: \"kubernetes.io/projected/56ca6981-847e-4fce-bb14-e0fa6f8fb697-kube-api-access-t5whq\") pod \"ingress-canary-ckfp9\" (UID: \"56ca6981-847e-4fce-bb14-e0fa6f8fb697\") " pod="openshift-ingress-canary/ingress-canary-ckfp9"
Apr 23 16:36:08.966479 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.966416 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ms8sl\" (UniqueName: \"kubernetes.io/projected/75c16a06-c713-441e-ba98-548b432943dd-kube-api-access-ms8sl\") pod \"dns-default-shfp9\" (UID: \"75c16a06-c713-441e-ba98-548b432943dd\") " pod="openshift-dns/dns-default-shfp9"
Apr 23 16:36:08.966590 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.966551 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/75c16a06-c713-441e-ba98-548b432943dd-tmp-dir\") pod \"dns-default-shfp9\" (UID: \"75c16a06-c713-441e-ba98-548b432943dd\") " pod="openshift-dns/dns-default-shfp9"
Apr 23 16:36:08.966590 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.966585 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75c16a06-c713-441e-ba98-548b432943dd-metrics-tls\") pod \"dns-default-shfp9\" (UID: \"75c16a06-c713-441e-ba98-548b432943dd\") " pod="openshift-dns/dns-default-shfp9"
Apr 23 16:36:08.966675 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.966632 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56ca6981-847e-4fce-bb14-e0fa6f8fb697-cert\") pod \"ingress-canary-ckfp9\" (UID: \"56ca6981-847e-4fce-bb14-e0fa6f8fb697\") " pod="openshift-ingress-canary/ingress-canary-ckfp9"
Apr 23 16:36:08.966801 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.966786 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/75c16a06-c713-441e-ba98-548b432943dd-tmp-dir\") pod \"dns-default-shfp9\" (UID: \"75c16a06-c713-441e-ba98-548b432943dd\") " pod="openshift-dns/dns-default-shfp9"
Apr 23 16:36:08.967447 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.967429 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c16a06-c713-441e-ba98-548b432943dd-config-volume\") pod \"dns-default-shfp9\" (UID: \"75c16a06-c713-441e-ba98-548b432943dd\") " pod="openshift-dns/dns-default-shfp9"
Apr 23 16:36:08.968679 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.968661 2563 operation_generator.go:615] "MountVolume.SetUp
succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75c16a06-c713-441e-ba98-548b432943dd-metrics-tls\") pod \"dns-default-shfp9\" (UID: \"75c16a06-c713-441e-ba98-548b432943dd\") " pod="openshift-dns/dns-default-shfp9" Apr 23 16:36:08.968755 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.968693 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56ca6981-847e-4fce-bb14-e0fa6f8fb697-cert\") pod \"ingress-canary-ckfp9\" (UID: \"56ca6981-847e-4fce-bb14-e0fa6f8fb697\") " pod="openshift-ingress-canary/ingress-canary-ckfp9" Apr 23 16:36:08.979954 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.979933 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms8sl\" (UniqueName: \"kubernetes.io/projected/75c16a06-c713-441e-ba98-548b432943dd-kube-api-access-ms8sl\") pod \"dns-default-shfp9\" (UID: \"75c16a06-c713-441e-ba98-548b432943dd\") " pod="openshift-dns/dns-default-shfp9" Apr 23 16:36:08.981117 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.981102 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5whq\" (UniqueName: \"kubernetes.io/projected/56ca6981-847e-4fce-bb14-e0fa6f8fb697-kube-api-access-t5whq\") pod \"ingress-canary-ckfp9\" (UID: \"56ca6981-847e-4fce-bb14-e0fa6f8fb697\") " pod="openshift-ingress-canary/ingress-canary-ckfp9" Apr 23 16:36:08.988955 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.988934 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-f2c8s" Apr 23 16:36:08.993520 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:08.993504 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-shfp9" Apr 23 16:36:09.081589 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:09.081553 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ckfp9" Apr 23 16:36:09.117680 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:09.117650 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66b6dbc54f-dlrcz"] Apr 23 16:36:09.121359 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:36:09.121220 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2764ea6_6412_43c9_9d81_1d51c9d17fe1.slice/crio-4c4d825fc8abcd3de535bebafe9a7423d85570afb375decf28d571744fd5b676 WatchSource:0}: Error finding container 4c4d825fc8abcd3de535bebafe9a7423d85570afb375decf28d571744fd5b676: Status 404 returned error can't find the container with id 4c4d825fc8abcd3de535bebafe9a7423d85570afb375decf28d571744fd5b676 Apr 23 16:36:09.131294 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:09.131273 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-shfp9"] Apr 23 16:36:09.134729 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:09.134551 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-f2c8s"] Apr 23 16:36:09.230136 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:09.230117 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ckfp9"] Apr 23 16:36:09.232528 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:36:09.232504 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56ca6981_847e_4fce_bb14_e0fa6f8fb697.slice/crio-3305f5fc0df610fe6840b76240ead5be9e440dbf353190e4ff813271e3659668 WatchSource:0}: Error finding container 3305f5fc0df610fe6840b76240ead5be9e440dbf353190e4ff813271e3659668: Status 404 returned error can't find the container with id 3305f5fc0df610fe6840b76240ead5be9e440dbf353190e4ff813271e3659668 Apr 23 16:36:10.060163 ip-10-0-134-187 
kubenswrapper[2563]: I0423 16:36:10.060129 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f2c8s" event={"ID":"a2d74b77-3933-4c89-8d3d-f2c3486cfb85","Type":"ContainerStarted","Data":"cf2c877997f60ec193533e5ab3c396e26bda76037d760c26ea75f5b128571d61"} Apr 23 16:36:10.060888 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:10.060169 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f2c8s" event={"ID":"a2d74b77-3933-4c89-8d3d-f2c3486cfb85","Type":"ContainerStarted","Data":"97d5b7028fb4e8f727465f81efa714e5def8e1a587c7b51e0551c93d08bb8fef"} Apr 23 16:36:10.060888 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:10.060180 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f2c8s" event={"ID":"a2d74b77-3933-4c89-8d3d-f2c3486cfb85","Type":"ContainerStarted","Data":"d79ef9158bd583b7b4ce1dc1b9ed16b344c91a413ed61aacf718279a418b3bc2"} Apr 23 16:36:10.061212 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:10.061190 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ckfp9" event={"ID":"56ca6981-847e-4fce-bb14-e0fa6f8fb697","Type":"ContainerStarted","Data":"3305f5fc0df610fe6840b76240ead5be9e440dbf353190e4ff813271e3659668"} Apr 23 16:36:10.062242 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:10.062205 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-shfp9" event={"ID":"75c16a06-c713-441e-ba98-548b432943dd","Type":"ContainerStarted","Data":"df119df062d51b1a2b955ec02d17924a0f11735ae1902019c261dfb4a8836087"} Apr 23 16:36:10.063618 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:10.063590 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz" 
event={"ID":"f2764ea6-6412-43c9-9d81-1d51c9d17fe1","Type":"ContainerStarted","Data":"8fee959461f7dd0c395bc414e1f2dabb1fbf86e8f43577e3ab4f75b162c3219a"} Apr 23 16:36:10.063723 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:10.063626 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz" event={"ID":"f2764ea6-6412-43c9-9d81-1d51c9d17fe1","Type":"ContainerStarted","Data":"4c4d825fc8abcd3de535bebafe9a7423d85570afb375decf28d571744fd5b676"} Apr 23 16:36:10.063779 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:10.063753 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz" Apr 23 16:36:10.090296 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:10.090211 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz" podStartSLOduration=7.090194626 podStartE2EDuration="7.090194626s" podCreationTimestamp="2026-04-23 16:36:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:36:10.088549593 +0000 UTC m=+43.820304030" watchObservedRunningTime="2026-04-23 16:36:10.090194626 +0000 UTC m=+43.821949055" Apr 23 16:36:11.121844 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.121809 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5857d7c9bd-8qtc9"] Apr 23 16:36:11.124758 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.124733 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.128920 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.128898 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 16:36:11.129048 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.128901 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 16:36:11.129048 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.128940 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-g68ff\"" Apr 23 16:36:11.129048 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.128972 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 16:36:11.129048 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.129003 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 16:36:11.129271 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.129246 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 16:36:11.129417 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.129396 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 16:36:11.131475 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.131337 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 16:36:11.140975 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.140955 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5857d7c9bd-8qtc9"] Apr 23 16:36:11.284437 ip-10-0-134-187 kubenswrapper[2563]: I0423 
16:36:11.284405 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-serving-cert\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.284437 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.284443 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-oauth-config\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.284697 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.284482 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-oauth-serving-cert\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.284697 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.284548 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-config\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.284697 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.284599 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27sxz\" (UniqueName: \"kubernetes.io/projected/2d21efa6-4336-43fd-a3c1-3c469e5983e1-kube-api-access-27sxz\") pod 
\"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.284697 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.284681 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-service-ca\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.386103 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.386031 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-serving-cert\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.386103 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.386067 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-oauth-config\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.386103 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.386091 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-oauth-serving-cert\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.386365 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.386118 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-config\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.386365 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.386143 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27sxz\" (UniqueName: \"kubernetes.io/projected/2d21efa6-4336-43fd-a3c1-3c469e5983e1-kube-api-access-27sxz\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.386365 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.386206 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-service-ca\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.387497 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.387472 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-oauth-serving-cert\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.388196 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.387763 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-config\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.388196 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.387782 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-service-ca\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.389532 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.389512 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-serving-cert\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.389532 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.389518 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-oauth-config\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.398988 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.398962 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27sxz\" (UniqueName: \"kubernetes.io/projected/2d21efa6-4336-43fd-a3c1-3c469e5983e1-kube-api-access-27sxz\") pod \"console-5857d7c9bd-8qtc9\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") " pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.434857 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.434833 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5857d7c9bd-8qtc9" Apr 23 16:36:11.995463 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:11.995213 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5857d7c9bd-8qtc9"] Apr 23 16:36:11.999379 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:36:11.999355 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d21efa6_4336_43fd_a3c1_3c469e5983e1.slice/crio-32b3dfd9b140707b72f4cab1955d238c3d2025f2fb5ad168d807b082929b4c5d WatchSource:0}: Error finding container 32b3dfd9b140707b72f4cab1955d238c3d2025f2fb5ad168d807b082929b4c5d: Status 404 returned error can't find the container with id 32b3dfd9b140707b72f4cab1955d238c3d2025f2fb5ad168d807b082929b4c5d Apr 23 16:36:12.072055 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:12.071939 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-shfp9" event={"ID":"75c16a06-c713-441e-ba98-548b432943dd","Type":"ContainerStarted","Data":"8a7c41146a3bf977409433a1d1a6e572bf741aaba8eb7a7a85702c4ae0777aed"} Apr 23 16:36:12.073911 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:12.073884 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f2c8s" event={"ID":"a2d74b77-3933-4c89-8d3d-f2c3486cfb85","Type":"ContainerStarted","Data":"9ffa35c819c78c71f9ce245ac0f36bf9f03dec181da64c745a08f06a2d68848a"} Apr 23 16:36:12.075360 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:12.075333 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ckfp9" event={"ID":"56ca6981-847e-4fce-bb14-e0fa6f8fb697","Type":"ContainerStarted","Data":"0d8c102c89494e559c1f133588a3e0290cb4c2ce457b7d4af7a7d1adb184ca76"} Apr 23 16:36:12.076515 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:12.076477 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-5857d7c9bd-8qtc9" event={"ID":"2d21efa6-4336-43fd-a3c1-3c469e5983e1","Type":"ContainerStarted","Data":"32b3dfd9b140707b72f4cab1955d238c3d2025f2fb5ad168d807b082929b4c5d"} Apr 23 16:36:12.095466 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:12.095429 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-f2c8s" podStartSLOduration=1.479501561 podStartE2EDuration="4.095414637s" podCreationTimestamp="2026-04-23 16:36:08 +0000 UTC" firstStartedPulling="2026-04-23 16:36:09.228470602 +0000 UTC m=+42.960225016" lastFinishedPulling="2026-04-23 16:36:11.844383681 +0000 UTC m=+45.576138092" observedRunningTime="2026-04-23 16:36:12.094217181 +0000 UTC m=+45.825971613" watchObservedRunningTime="2026-04-23 16:36:12.095414637 +0000 UTC m=+45.827169070" Apr 23 16:36:12.115530 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:12.115492 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ckfp9" podStartSLOduration=1.507983442 podStartE2EDuration="4.115478904s" podCreationTimestamp="2026-04-23 16:36:08 +0000 UTC" firstStartedPulling="2026-04-23 16:36:09.234442517 +0000 UTC m=+42.966196943" lastFinishedPulling="2026-04-23 16:36:11.841937993 +0000 UTC m=+45.573692405" observedRunningTime="2026-04-23 16:36:12.11498848 +0000 UTC m=+45.846742913" watchObservedRunningTime="2026-04-23 16:36:12.115478904 +0000 UTC m=+45.847233327" Apr 23 16:36:13.080643 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:13.080589 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-shfp9" event={"ID":"75c16a06-c713-441e-ba98-548b432943dd","Type":"ContainerStarted","Data":"9d63b7a230c88bae8d9e7a77386851a0e73c69327e826efc434956224817b5de"} Apr 23 16:36:13.106731 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:13.106679 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/dns-default-shfp9" podStartSLOduration=2.41211844 podStartE2EDuration="5.106659679s" podCreationTimestamp="2026-04-23 16:36:08 +0000 UTC" firstStartedPulling="2026-04-23 16:36:09.147638847 +0000 UTC m=+42.879393264" lastFinishedPulling="2026-04-23 16:36:11.842180084 +0000 UTC m=+45.573934503" observedRunningTime="2026-04-23 16:36:13.105873949 +0000 UTC m=+46.837628405" watchObservedRunningTime="2026-04-23 16:36:13.106659679 +0000 UTC m=+46.838414115" Apr 23 16:36:14.084060 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:14.083849 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-shfp9" Apr 23 16:36:16.090737 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:16.090698 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5857d7c9bd-8qtc9" event={"ID":"2d21efa6-4336-43fd-a3c1-3c469e5983e1","Type":"ContainerStarted","Data":"edea0ea5d43ddecf555ad60e320665db1e0644233842b89faf686b0263de16fc"} Apr 23 16:36:16.110713 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:16.110669 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5857d7c9bd-8qtc9" podStartSLOduration=2.050101601 podStartE2EDuration="5.110656151s" podCreationTimestamp="2026-04-23 16:36:11 +0000 UTC" firstStartedPulling="2026-04-23 16:36:12.00218271 +0000 UTC m=+45.733937128" lastFinishedPulling="2026-04-23 16:36:15.062737263 +0000 UTC m=+48.794491678" observedRunningTime="2026-04-23 16:36:16.109426604 +0000 UTC m=+49.841181038" watchObservedRunningTime="2026-04-23 16:36:16.110656151 +0000 UTC m=+49.842410583" Apr 23 16:36:18.347102 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.347071 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g"] Apr 23 16:36:18.373420 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.373391 2563 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/kube-state-metrics-69db897b98-m2ztp"] Apr 23 16:36:18.373554 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.373538 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" Apr 23 16:36:18.377966 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.377949 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 23 16:36:18.378457 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.378439 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 16:36:18.379300 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.379281 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 23 16:36:18.379398 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.379314 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 16:36:18.379398 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.379377 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 16:36:18.382319 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.382302 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8g6mm\"" Apr 23 16:36:18.388758 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.388741 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g"] Apr 23 16:36:18.388758 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.388762 2563 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-xtq25"] Apr 23 16:36:18.388911 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.388881 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.391785 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.391767 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 23 16:36:18.391874 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.391811 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-qfn7k\"" Apr 23 16:36:18.392058 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.392039 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 23 16:36:18.392170 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.392092 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 23 16:36:18.400995 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.400976 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-m2ztp"] Apr 23 16:36:18.401099 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.401087 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.405005 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.404986 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 16:36:18.405081 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.405011 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 16:36:18.405296 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.405281 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-k5wxx\"" Apr 23 16:36:18.405435 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.405387 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 16:36:18.536120 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536097 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2czz5\" (UniqueName: \"kubernetes.io/projected/607bdd08-a8f9-4d7e-8f45-913d893a9763-kube-api-access-2czz5\") pod \"openshift-state-metrics-9d44df66c-v8g6g\" (UID: \"607bdd08-a8f9-4d7e-8f45-913d893a9763\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" Apr 23 16:36:18.536255 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536138 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/568bdcdb-d09b-4f63-8775-e55efec84c8e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.536255 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536157 2563 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ffcd3636-7eaf-487e-b56b-788842ea51bb-node-exporter-wtmp\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.536255 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536217 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ffcd3636-7eaf-487e-b56b-788842ea51bb-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.536359 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536285 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/568bdcdb-d09b-4f63-8775-e55efec84c8e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.536359 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536315 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ffcd3636-7eaf-487e-b56b-788842ea51bb-node-exporter-textfile\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.536359 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536346 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/568bdcdb-d09b-4f63-8775-e55efec84c8e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.536452 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536365 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/607bdd08-a8f9-4d7e-8f45-913d893a9763-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-v8g6g\" (UID: \"607bdd08-a8f9-4d7e-8f45-913d893a9763\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" Apr 23 16:36:18.536452 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536393 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ffcd3636-7eaf-487e-b56b-788842ea51bb-metrics-client-ca\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.536452 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536428 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74xbg\" (UniqueName: \"kubernetes.io/projected/ffcd3636-7eaf-487e-b56b-788842ea51bb-kube-api-access-74xbg\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.536452 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536447 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/607bdd08-a8f9-4d7e-8f45-913d893a9763-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-9d44df66c-v8g6g\" (UID: \"607bdd08-a8f9-4d7e-8f45-913d893a9763\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" Apr 23 16:36:18.536570 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536464 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ffcd3636-7eaf-487e-b56b-788842ea51bb-root\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.536570 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536490 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/568bdcdb-d09b-4f63-8775-e55efec84c8e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.536570 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536513 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/607bdd08-a8f9-4d7e-8f45-913d893a9763-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-v8g6g\" (UID: \"607bdd08-a8f9-4d7e-8f45-913d893a9763\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" Apr 23 16:36:18.536570 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536540 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ffcd3636-7eaf-487e-b56b-788842ea51bb-sys\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.536570 ip-10-0-134-187 
kubenswrapper[2563]: I0423 16:36:18.536563 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ffcd3636-7eaf-487e-b56b-788842ea51bb-node-exporter-tls\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.536713 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536581 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ffcd3636-7eaf-487e-b56b-788842ea51bb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.536713 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536598 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7h9m\" (UniqueName: \"kubernetes.io/projected/568bdcdb-d09b-4f63-8775-e55efec84c8e-kube-api-access-t7h9m\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.536713 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.536654 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/568bdcdb-d09b-4f63-8775-e55efec84c8e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.637318 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637258 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/568bdcdb-d09b-4f63-8775-e55efec84c8e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.637318 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637296 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2czz5\" (UniqueName: \"kubernetes.io/projected/607bdd08-a8f9-4d7e-8f45-913d893a9763-kube-api-access-2czz5\") pod \"openshift-state-metrics-9d44df66c-v8g6g\" (UID: \"607bdd08-a8f9-4d7e-8f45-913d893a9763\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" Apr 23 16:36:18.637483 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637327 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/568bdcdb-d09b-4f63-8775-e55efec84c8e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.637483 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637344 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ffcd3636-7eaf-487e-b56b-788842ea51bb-node-exporter-wtmp\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.637483 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637363 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ffcd3636-7eaf-487e-b56b-788842ea51bb-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " 
pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.637483 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637380 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/568bdcdb-d09b-4f63-8775-e55efec84c8e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.637483 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637401 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ffcd3636-7eaf-487e-b56b-788842ea51bb-node-exporter-textfile\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.637483 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637436 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/568bdcdb-d09b-4f63-8775-e55efec84c8e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.637483 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637465 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/607bdd08-a8f9-4d7e-8f45-913d893a9763-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-v8g6g\" (UID: \"607bdd08-a8f9-4d7e-8f45-913d893a9763\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" Apr 23 16:36:18.637824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637503 2563 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ffcd3636-7eaf-487e-b56b-788842ea51bb-metrics-client-ca\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.637824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637527 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74xbg\" (UniqueName: \"kubernetes.io/projected/ffcd3636-7eaf-487e-b56b-788842ea51bb-kube-api-access-74xbg\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.637824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637552 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/607bdd08-a8f9-4d7e-8f45-913d893a9763-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-v8g6g\" (UID: \"607bdd08-a8f9-4d7e-8f45-913d893a9763\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" Apr 23 16:36:18.637824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637554 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ffcd3636-7eaf-487e-b56b-788842ea51bb-node-exporter-wtmp\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.637824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637592 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ffcd3636-7eaf-487e-b56b-788842ea51bb-root\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 
16:36:18.637824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637611 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/568bdcdb-d09b-4f63-8775-e55efec84c8e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.637824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637630 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/607bdd08-a8f9-4d7e-8f45-913d893a9763-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-v8g6g\" (UID: \"607bdd08-a8f9-4d7e-8f45-913d893a9763\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" Apr 23 16:36:18.637824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637681 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ffcd3636-7eaf-487e-b56b-788842ea51bb-root\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.637824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637690 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ffcd3636-7eaf-487e-b56b-788842ea51bb-node-exporter-textfile\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.637824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637733 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ffcd3636-7eaf-487e-b56b-788842ea51bb-sys\") pod 
\"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.637824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637763 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ffcd3636-7eaf-487e-b56b-788842ea51bb-node-exporter-tls\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.637824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637792 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ffcd3636-7eaf-487e-b56b-788842ea51bb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.637824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.637820 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7h9m\" (UniqueName: \"kubernetes.io/projected/568bdcdb-d09b-4f63-8775-e55efec84c8e-kube-api-access-t7h9m\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.638447 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.638063 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ffcd3636-7eaf-487e-b56b-788842ea51bb-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.638447 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.638096 2563 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ffcd3636-7eaf-487e-b56b-788842ea51bb-sys\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.638447 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.638198 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/568bdcdb-d09b-4f63-8775-e55efec84c8e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.639066 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.638804 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/607bdd08-a8f9-4d7e-8f45-913d893a9763-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-v8g6g\" (UID: \"607bdd08-a8f9-4d7e-8f45-913d893a9763\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" Apr 23 16:36:18.639066 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.639038 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/568bdcdb-d09b-4f63-8775-e55efec84c8e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.639379 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.639182 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ffcd3636-7eaf-487e-b56b-788842ea51bb-metrics-client-ca\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " 
pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.639379 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.639254 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/568bdcdb-d09b-4f63-8775-e55efec84c8e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.641677 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.641629 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ffcd3636-7eaf-487e-b56b-788842ea51bb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.641805 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.641712 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/568bdcdb-d09b-4f63-8775-e55efec84c8e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.641805 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.641757 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ffcd3636-7eaf-487e-b56b-788842ea51bb-node-exporter-tls\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.641924 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.641834 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/607bdd08-a8f9-4d7e-8f45-913d893a9763-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-v8g6g\" (UID: \"607bdd08-a8f9-4d7e-8f45-913d893a9763\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" Apr 23 16:36:18.641924 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.641868 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/568bdcdb-d09b-4f63-8775-e55efec84c8e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.641924 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.641880 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/607bdd08-a8f9-4d7e-8f45-913d893a9763-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-v8g6g\" (UID: \"607bdd08-a8f9-4d7e-8f45-913d893a9763\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" Apr 23 16:36:18.647325 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.647303 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7h9m\" (UniqueName: \"kubernetes.io/projected/568bdcdb-d09b-4f63-8775-e55efec84c8e-kube-api-access-t7h9m\") pod \"kube-state-metrics-69db897b98-m2ztp\" (UID: \"568bdcdb-d09b-4f63-8775-e55efec84c8e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.647580 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.647563 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2czz5\" (UniqueName: \"kubernetes.io/projected/607bdd08-a8f9-4d7e-8f45-913d893a9763-kube-api-access-2czz5\") pod \"openshift-state-metrics-9d44df66c-v8g6g\" (UID: 
\"607bdd08-a8f9-4d7e-8f45-913d893a9763\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" Apr 23 16:36:18.650374 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.650354 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74xbg\" (UniqueName: \"kubernetes.io/projected/ffcd3636-7eaf-487e-b56b-788842ea51bb-kube-api-access-74xbg\") pod \"node-exporter-xtq25\" (UID: \"ffcd3636-7eaf-487e-b56b-788842ea51bb\") " pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.682295 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.682275 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" Apr 23 16:36:18.697085 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.696959 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" Apr 23 16:36:18.708692 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.708672 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xtq25" Apr 23 16:36:18.728811 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:36:18.728776 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffcd3636_7eaf_487e_b56b_788842ea51bb.slice/crio-9033b1226fbbbfc8eb8b98eab4b147a641d412e175899f3ed1f7d95e723f5d5b WatchSource:0}: Error finding container 9033b1226fbbbfc8eb8b98eab4b147a641d412e175899f3ed1f7d95e723f5d5b: Status 404 returned error can't find the container with id 9033b1226fbbbfc8eb8b98eab4b147a641d412e175899f3ed1f7d95e723f5d5b Apr 23 16:36:18.813083 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.813021 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g"] Apr 23 16:36:18.817815 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:36:18.817789 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod607bdd08_a8f9_4d7e_8f45_913d893a9763.slice/crio-b1d40cd5e6d4374ea862c260280467d420408ac19a28941bc740a488d0a7903f WatchSource:0}: Error finding container b1d40cd5e6d4374ea862c260280467d420408ac19a28941bc740a488d0a7903f: Status 404 returned error can't find the container with id b1d40cd5e6d4374ea862c260280467d420408ac19a28941bc740a488d0a7903f Apr 23 16:36:18.828142 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:18.828071 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-m2ztp"] Apr 23 16:36:18.831179 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:36:18.831156 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod568bdcdb_d09b_4f63_8775_e55efec84c8e.slice/crio-4b445b0c2d27e135faeecc499232dea6104df7155f9a2cc30b56a24e33a97630 WatchSource:0}: Error finding container 
4b445b0c2d27e135faeecc499232dea6104df7155f9a2cc30b56a24e33a97630: Status 404 returned error can't find the container with id 4b445b0c2d27e135faeecc499232dea6104df7155f9a2cc30b56a24e33a97630 Apr 23 16:36:19.098672 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.098641 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtq25" event={"ID":"ffcd3636-7eaf-487e-b56b-788842ea51bb","Type":"ContainerStarted","Data":"9033b1226fbbbfc8eb8b98eab4b147a641d412e175899f3ed1f7d95e723f5d5b"} Apr 23 16:36:19.099639 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.099596 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" event={"ID":"568bdcdb-d09b-4f63-8775-e55efec84c8e","Type":"ContainerStarted","Data":"4b445b0c2d27e135faeecc499232dea6104df7155f9a2cc30b56a24e33a97630"} Apr 23 16:36:19.101128 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.101105 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" event={"ID":"607bdd08-a8f9-4d7e-8f45-913d893a9763","Type":"ContainerStarted","Data":"00324c76a9d381d3cae07ba7f7393feaa088d303d0d289a613a115480681b934"} Apr 23 16:36:19.101205 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.101132 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" event={"ID":"607bdd08-a8f9-4d7e-8f45-913d893a9763","Type":"ContainerStarted","Data":"75a51f8c5b69f733178e8709c383a6d16d835293396e2e6f214475aaf52b0ab7"} Apr 23 16:36:19.101205 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.101141 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" event={"ID":"607bdd08-a8f9-4d7e-8f45-913d893a9763","Type":"ContainerStarted","Data":"b1d40cd5e6d4374ea862c260280467d420408ac19a28941bc740a488d0a7903f"} Apr 23 16:36:19.379097 ip-10-0-134-187 
kubenswrapper[2563]: I0423 16:36:19.379067 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:36:19.391870 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.391844 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:19.395496 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.394947 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 16:36:19.395496 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.395101 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 16:36:19.395496 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.395350 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 16:36:19.395496 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.395359 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 16:36:19.395496 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.395375 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 16:36:19.396539 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.395601 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 16:36:19.396539 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.395731 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 16:36:19.396539 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.395748 2563 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 23 16:36:19.396539 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.395790 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-x8t2t\""
Apr 23 16:36:19.396539 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.395942 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 23 16:36:19.398100 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.398061 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:36:19.547999 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.547965 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.548177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.548064 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-config-volume\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.548177 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.548109 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.548323 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.548203 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.548323 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.548307 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-config-out\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.548441 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.548348 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.548441 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.548384 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.548441 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.548431 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.548590 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.548506 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.548590 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.548561 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnzsd\" (UniqueName: \"kubernetes.io/projected/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-kube-api-access-xnzsd\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.548684 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.548590 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-web-config\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.548684 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.548644 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.548684 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.548673 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.649502 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.649432 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-config-out\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.649502 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.649482 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.649702 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.649521 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.649702 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.649547 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.649702 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.649663 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.649702 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:19.649680 2563 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 23 16:36:19.649879 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.649728 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnzsd\" (UniqueName: \"kubernetes.io/projected/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-kube-api-access-xnzsd\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.649879 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:19.649759 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-main-tls podName:10b9ed9e-751b-41ff-8d50-5ab4091afc4b nodeName:}" failed. No retries permitted until 2026-04-23 16:36:20.149736435 +0000 UTC m=+53.881490849 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "10b9ed9e-751b-41ff-8d50-5ab4091afc4b") : secret "alertmanager-main-tls" not found
Apr 23 16:36:19.649879 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.649797 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-web-config\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.649879 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.649860 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.649879 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.649873 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.650118 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.649893 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.650118 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.649936 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.650118 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.650002 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-config-volume\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.650118 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.650040 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.650118 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.650086 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.652860 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.651623 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.652860 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.652542 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.655667 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.655622 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-config-volume\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.656040 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.655996 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.657213 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.656577 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.657213 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.656649 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.657213 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.656934 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-config-out\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.657213 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.657166 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.657485 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.657283 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-web-config\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.659429 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.659408 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:19.662138 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:19.662117 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnzsd\" (UniqueName: \"kubernetes.io/projected/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-kube-api-access-xnzsd\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:20.153954 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:20.153913 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:20.157395 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:20.157162 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:20.311587 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:20.311551 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:20.959165 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:20.959021 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:36:20.959933 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:36:20.959730 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10b9ed9e_751b_41ff_8d50_5ab4091afc4b.slice/crio-80cdf69b6cb74b49b26a8eba947bc518b94c30e9d1da4c1adc35591476cf7407 WatchSource:0}: Error finding container 80cdf69b6cb74b49b26a8eba947bc518b94c30e9d1da4c1adc35591476cf7407: Status 404 returned error can't find the container with id 80cdf69b6cb74b49b26a8eba947bc518b94c30e9d1da4c1adc35591476cf7407
Apr 23 16:36:21.097016 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:21.096960 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-shfp9"
Apr 23 16:36:21.110548 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:21.110311 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" event={"ID":"607bdd08-a8f9-4d7e-8f45-913d893a9763","Type":"ContainerStarted","Data":"c8892b5d025135bb545f8b5086ff1706e15ca4b33d6f7554d1e3f567027f4462"}
Apr 23 16:36:21.111815 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:21.111752 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"10b9ed9e-751b-41ff-8d50-5ab4091afc4b","Type":"ContainerStarted","Data":"80cdf69b6cb74b49b26a8eba947bc518b94c30e9d1da4c1adc35591476cf7407"}
Apr 23 16:36:21.113545 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:21.113514 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtq25" event={"ID":"ffcd3636-7eaf-487e-b56b-788842ea51bb","Type":"ContainerStarted","Data":"922cbcc116e46687489d00d11feb94caf955fa14b6cacdbddf1ae3f0f3055fb6"}
Apr 23 16:36:21.115914 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:21.115867 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" event={"ID":"568bdcdb-d09b-4f63-8775-e55efec84c8e","Type":"ContainerStarted","Data":"fc9ba59eba38d08de903825199e03a2bbba920424a54c88c0b4fafddbf33e6b9"}
Apr 23 16:36:21.115914 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:21.115893 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" event={"ID":"568bdcdb-d09b-4f63-8775-e55efec84c8e","Type":"ContainerStarted","Data":"b8747fcba8b84c2c2bd493b43be4388034ec907ecd80aaff9b72c43e41252e96"}
Apr 23 16:36:21.164325 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:21.164276 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-v8g6g" podStartSLOduration=1.374500994 podStartE2EDuration="3.164258523s" podCreationTimestamp="2026-04-23 16:36:18 +0000 UTC" firstStartedPulling="2026-04-23 16:36:19.03642035 +0000 UTC m=+52.768174761" lastFinishedPulling="2026-04-23 16:36:20.826177878 +0000 UTC m=+54.557932290" observedRunningTime="2026-04-23 16:36:21.163463807 +0000 UTC m=+54.895218240" watchObservedRunningTime="2026-04-23 16:36:21.164258523 +0000 UTC m=+54.896012957"
Apr 23 16:36:21.435415 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:21.435330 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5857d7c9bd-8qtc9"
Apr 23 16:36:21.435558 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:21.435486 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5857d7c9bd-8qtc9"
Apr 23 16:36:21.440599 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:21.440578 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5857d7c9bd-8qtc9"
Apr 23 16:36:22.120858 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:22.120787 2563 generic.go:358] "Generic (PLEG): container finished" podID="ffcd3636-7eaf-487e-b56b-788842ea51bb" containerID="922cbcc116e46687489d00d11feb94caf955fa14b6cacdbddf1ae3f0f3055fb6" exitCode=0
Apr 23 16:36:22.121322 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:22.120868 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtq25" event={"ID":"ffcd3636-7eaf-487e-b56b-788842ea51bb","Type":"ContainerDied","Data":"922cbcc116e46687489d00d11feb94caf955fa14b6cacdbddf1ae3f0f3055fb6"}
Apr 23 16:36:22.122808 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:22.122783 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" event={"ID":"568bdcdb-d09b-4f63-8775-e55efec84c8e","Type":"ContainerStarted","Data":"f9c36272fd42187dbfaaada3b41655c32f38cc94e28e6a3fd8b718cbfbb9a59d"}
Apr 23 16:36:22.126811 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:22.126796 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5857d7c9bd-8qtc9"
Apr 23 16:36:22.178085 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:22.178049 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-m2ztp" podStartSLOduration=2.185947653 podStartE2EDuration="4.178037557s" podCreationTimestamp="2026-04-23 16:36:18 +0000 UTC" firstStartedPulling="2026-04-23 16:36:18.832814184 +0000 UTC m=+52.564568594" lastFinishedPulling="2026-04-23 16:36:20.824904073 +0000 UTC m=+54.556658498" observedRunningTime="2026-04-23 16:36:22.177569392 +0000 UTC m=+55.909323850" watchObservedRunningTime="2026-04-23 16:36:22.178037557 +0000 UTC m=+55.909791990"
Apr 23 16:36:23.126345 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:23.126312 2563 generic.go:358] "Generic (PLEG): container finished" podID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerID="ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238" exitCode=0
Apr 23 16:36:23.126727 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:23.126398 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"10b9ed9e-751b-41ff-8d50-5ab4091afc4b","Type":"ContainerDied","Data":"ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238"}
Apr 23 16:36:23.128373 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:23.128352 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtq25" event={"ID":"ffcd3636-7eaf-487e-b56b-788842ea51bb","Type":"ContainerStarted","Data":"9efe393b15462714f34004ee0cf2864aae8ebe544dc99f4f549bda4f695dc106"}
Apr 23 16:36:23.128457 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:23.128380 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtq25" event={"ID":"ffcd3636-7eaf-487e-b56b-788842ea51bb","Type":"ContainerStarted","Data":"ba01adb08a62853ac738657ac93735ead348efd451e602db7e0fbb98e933ce63"}
Apr 23 16:36:23.183921 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:23.183877 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-xtq25" podStartSLOduration=3.135268432 podStartE2EDuration="5.183862438s" podCreationTimestamp="2026-04-23 16:36:18 +0000 UTC" firstStartedPulling="2026-04-23 16:36:18.733193834 +0000 UTC m=+52.464948247" lastFinishedPulling="2026-04-23 16:36:20.781787829 +0000 UTC m=+54.513542253" observedRunningTime="2026-04-23 16:36:23.183106873 +0000 UTC m=+56.914861306" watchObservedRunningTime="2026-04-23 16:36:23.183862438 +0000 UTC m=+56.915616870"
Apr 23 16:36:25.034298 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:25.034275 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hc9pq"
Apr 23 16:36:25.135187 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:25.135158 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"10b9ed9e-751b-41ff-8d50-5ab4091afc4b","Type":"ContainerStarted","Data":"8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551"}
Apr 23 16:36:26.140152 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:26.140119 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"10b9ed9e-751b-41ff-8d50-5ab4091afc4b","Type":"ContainerStarted","Data":"74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc"}
Apr 23 16:36:26.140152 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:26.140151 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"10b9ed9e-751b-41ff-8d50-5ab4091afc4b","Type":"ContainerStarted","Data":"3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74"}
Apr 23 16:36:26.140152 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:26.140160 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"10b9ed9e-751b-41ff-8d50-5ab4091afc4b","Type":"ContainerStarted","Data":"11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06"}
Apr 23 16:36:26.140569 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:26.140169 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"10b9ed9e-751b-41ff-8d50-5ab4091afc4b","Type":"ContainerStarted","Data":"bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840"}
Apr 23 16:36:27.145404 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:27.145346 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"10b9ed9e-751b-41ff-8d50-5ab4091afc4b","Type":"ContainerStarted","Data":"1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7"}
Apr 23 16:36:27.176185 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:27.176048 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.255240633 podStartE2EDuration="8.176031979s" podCreationTimestamp="2026-04-23 16:36:19 +0000 UTC" firstStartedPulling="2026-04-23 16:36:20.96161747 +0000 UTC m=+54.693371881" lastFinishedPulling="2026-04-23 16:36:26.882408813 +0000 UTC m=+60.614163227" observedRunningTime="2026-04-23 16:36:27.174205645 +0000 UTC m=+60.905960076" watchObservedRunningTime="2026-04-23 16:36:27.176031979 +0000 UTC m=+60.907786411"
Apr 23 16:36:29.947446 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:29.947413 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66cf6458f5-6d758"]
Apr 23 16:36:29.950751 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:29.950729 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:29.959484 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:29.959398 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 23 16:36:29.960593 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:29.960574 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66cf6458f5-6d758"]
Apr 23 16:36:30.040027 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.040004 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-serving-cert\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.040118 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.040038 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-trusted-ca-bundle\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.040118 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.040109 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-oauth-serving-cert\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.040187 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.040140 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzq9r\" (UniqueName: \"kubernetes.io/projected/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-kube-api-access-xzq9r\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.040220 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.040184 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-config\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.040220 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.040204 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-service-ca\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.040308 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.040281 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-oauth-config\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.140926 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.140901 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-config\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.140926 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.140928 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-service-ca\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.141045 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.140958 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-oauth-config\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.141089 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.141069 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-serving-cert\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.141137 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.141122 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-trusted-ca-bundle\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.141207 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.141194 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-oauth-serving-cert\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.141374 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.141354 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzq9r\" (UniqueName: \"kubernetes.io/projected/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-kube-api-access-xzq9r\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.141740 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.141720 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-service-ca\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.141829 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.141723 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-config\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.141928 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.141908 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-trusted-ca-bundle\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.142027 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.142007 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-oauth-serving-cert\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.143583 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.143564 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-oauth-config\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.143668 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.143643 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-serving-cert\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.151830 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.151808 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzq9r\" (UniqueName: \"kubernetes.io/projected/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-kube-api-access-xzq9r\") pod \"console-66cf6458f5-6d758\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.260060 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.260040 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:30.377015 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:30.376985 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66cf6458f5-6d758"]
Apr 23 16:36:30.379926 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:36:30.379900 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5e1e9ca_a3d0_45ca_bc07_5e504e9b56ab.slice/crio-39d3949d0bbe89237cc4f51ba4d14db85aa41b3aaa13d6e8dffe746ac8283a09 WatchSource:0}: Error finding container 39d3949d0bbe89237cc4f51ba4d14db85aa41b3aaa13d6e8dffe746ac8283a09: Status 404 returned error can't find the container with id 39d3949d0bbe89237cc4f51ba4d14db85aa41b3aaa13d6e8dffe746ac8283a09
Apr 23 16:36:31.070731 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:31.070704 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66b6dbc54f-dlrcz"
Apr 23 16:36:31.157560 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:31.157528 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66cf6458f5-6d758" event={"ID":"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab","Type":"ContainerStarted","Data":"ee8e6516e5693f7ec003e161322ad224d11619ae549584662486a9e0a8be138d"}
Apr 23 16:36:31.157712 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:31.157563 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66cf6458f5-6d758" event={"ID":"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab","Type":"ContainerStarted","Data":"39d3949d0bbe89237cc4f51ba4d14db85aa41b3aaa13d6e8dffe746ac8283a09"}
Apr 23 16:36:31.190689 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:31.190645 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66cf6458f5-6d758" podStartSLOduration=2.190631189 podStartE2EDuration="2.190631189s"
podCreationTimestamp="2026-04-23 16:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:36:31.189480152 +0000 UTC m=+64.921234599" watchObservedRunningTime="2026-04-23 16:36:31.190631189 +0000 UTC m=+64.922385624" Apr 23 16:36:32.045458 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.045427 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66cf6458f5-6d758"] Apr 23 16:36:32.086288 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.086268 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-779d45b66c-plqnq"] Apr 23 16:36:32.089455 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.089440 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.099402 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.099382 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-779d45b66c-plqnq"] Apr 23 16:36:32.157259 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.157216 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-oauth-config\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.157356 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.157265 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-serving-cert\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.157356 ip-10-0-134-187 
kubenswrapper[2563]: I0423 16:36:32.157295 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crslg\" (UniqueName: \"kubernetes.io/projected/3bad8bf0-d9c8-49db-9204-4692a18b0e26-kube-api-access-crslg\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.157356 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.157335 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-trusted-ca-bundle\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.157467 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.157395 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-config\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.157467 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.157421 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-service-ca\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.157539 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.157471 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-oauth-serving-cert\") pod 
\"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.258427 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.258401 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-trusted-ca-bundle\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.258571 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.258443 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-config\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.258571 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.258471 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-service-ca\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.258817 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.258790 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-oauth-serving-cert\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.258926 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.258905 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-oauth-config\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.258988 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.258943 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-serving-cert\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.258988 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.258976 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crslg\" (UniqueName: \"kubernetes.io/projected/3bad8bf0-d9c8-49db-9204-4692a18b0e26-kube-api-access-crslg\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.259271 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.259222 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-config\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.259380 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.259357 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-service-ca\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.259478 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.259448 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-trusted-ca-bundle\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.259594 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.259502 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-oauth-serving-cert\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.261501 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.261476 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-serving-cert\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.261501 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.261491 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-oauth-config\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.268804 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.268786 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crslg\" (UniqueName: \"kubernetes.io/projected/3bad8bf0-d9c8-49db-9204-4692a18b0e26-kube-api-access-crslg\") pod \"console-779d45b66c-plqnq\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.398175 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.398102 2563 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:32.550305 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.550273 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-779d45b66c-plqnq"] Apr 23 16:36:32.552814 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:36:32.552792 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bad8bf0_d9c8_49db_9204_4692a18b0e26.slice/crio-e2708b0bfcd20d1b86f8ba3365ac6f4425ebb72404ab4cf0a970abeef7512e1b WatchSource:0}: Error finding container e2708b0bfcd20d1b86f8ba3365ac6f4425ebb72404ab4cf0a970abeef7512e1b: Status 404 returned error can't find the container with id e2708b0bfcd20d1b86f8ba3365ac6f4425ebb72404ab4cf0a970abeef7512e1b Apr 23 16:36:32.566209 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.566187 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs\") pod \"network-metrics-daemon-kpgxm\" (UID: \"c1dab98e-8f79-4056-94f4-9185da61ca34\") " pod="openshift-multus/network-metrics-daemon-kpgxm" Apr 23 16:36:32.568898 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.568880 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 16:36:32.578765 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.578741 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1dab98e-8f79-4056-94f4-9185da61ca34-metrics-certs\") pod \"network-metrics-daemon-kpgxm\" (UID: \"c1dab98e-8f79-4056-94f4-9185da61ca34\") " pod="openshift-multus/network-metrics-daemon-kpgxm" Apr 23 16:36:32.597076 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.597059 2563 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5zkmp\"" Apr 23 16:36:32.604463 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.604442 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpgxm" Apr 23 16:36:32.667501 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.667471 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq84s\" (UniqueName: \"kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s\") pod \"network-check-target-pz92q\" (UID: \"92efbb3d-8bd0-413e-b306-331d80df0505\") " pod="openshift-network-diagnostics/network-check-target-pz92q" Apr 23 16:36:32.671596 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.671572 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 16:36:32.680703 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.680506 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 16:36:32.690633 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.690610 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq84s\" (UniqueName: \"kubernetes.io/projected/92efbb3d-8bd0-413e-b306-331d80df0505-kube-api-access-jq84s\") pod \"network-check-target-pz92q\" (UID: \"92efbb3d-8bd0-413e-b306-331d80df0505\") " pod="openshift-network-diagnostics/network-check-target-pz92q" Apr 23 16:36:32.725247 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.725210 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kpgxm"] Apr 23 16:36:32.727470 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:36:32.727443 2563 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1dab98e_8f79_4056_94f4_9185da61ca34.slice/crio-3d15417b7d86131808e801a4a245d785530a0b2b82a0ead558b310b28e1be957 WatchSource:0}: Error finding container 3d15417b7d86131808e801a4a245d785530a0b2b82a0ead558b310b28e1be957: Status 404 returned error can't find the container with id 3d15417b7d86131808e801a4a245d785530a0b2b82a0ead558b310b28e1be957 Apr 23 16:36:32.901605 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.901556 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hhjgv\"" Apr 23 16:36:32.909291 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:32.909276 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pz92q" Apr 23 16:36:33.034424 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:33.034391 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pz92q"] Apr 23 16:36:33.039056 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:36:33.039030 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92efbb3d_8bd0_413e_b306_331d80df0505.slice/crio-757d10c6b841016e524676f9045ca8123f3a87ad08c825fd4a0707efd55ade9c WatchSource:0}: Error finding container 757d10c6b841016e524676f9045ca8123f3a87ad08c825fd4a0707efd55ade9c: Status 404 returned error can't find the container with id 757d10c6b841016e524676f9045ca8123f3a87ad08c825fd4a0707efd55ade9c Apr 23 16:36:33.164211 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:33.164135 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pz92q" event={"ID":"92efbb3d-8bd0-413e-b306-331d80df0505","Type":"ContainerStarted","Data":"757d10c6b841016e524676f9045ca8123f3a87ad08c825fd4a0707efd55ade9c"} Apr 23 16:36:33.165741 
ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:33.165715 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-779d45b66c-plqnq" event={"ID":"3bad8bf0-d9c8-49db-9204-4692a18b0e26","Type":"ContainerStarted","Data":"7a6af2c2a570159fe96e80e8b89c0b14fb340aff6194ea4f549dcd5767f752f7"} Apr 23 16:36:33.165857 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:33.165751 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-779d45b66c-plqnq" event={"ID":"3bad8bf0-d9c8-49db-9204-4692a18b0e26","Type":"ContainerStarted","Data":"e2708b0bfcd20d1b86f8ba3365ac6f4425ebb72404ab4cf0a970abeef7512e1b"} Apr 23 16:36:33.167055 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:33.167032 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpgxm" event={"ID":"c1dab98e-8f79-4056-94f4-9185da61ca34","Type":"ContainerStarted","Data":"3d15417b7d86131808e801a4a245d785530a0b2b82a0ead558b310b28e1be957"} Apr 23 16:36:34.172983 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:34.172781 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpgxm" event={"ID":"c1dab98e-8f79-4056-94f4-9185da61ca34","Type":"ContainerStarted","Data":"ed38a71a4288a9b505130be9cf94bdb8e939b7a305ab45521168dbea102fbc25"} Apr 23 16:36:34.172983 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:34.172828 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpgxm" event={"ID":"c1dab98e-8f79-4056-94f4-9185da61ca34","Type":"ContainerStarted","Data":"1eca06c22128d52b7dccd82faf1ecff5200f0dd0d0c6e703b21dd3b1069ba227"} Apr 23 16:36:34.193295 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:34.193243 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kpgxm" podStartSLOduration=66.269689453 podStartE2EDuration="1m7.193207871s" podCreationTimestamp="2026-04-23 16:35:27 
+0000 UTC" firstStartedPulling="2026-04-23 16:36:32.729192965 +0000 UTC m=+66.460947377" lastFinishedPulling="2026-04-23 16:36:33.652711381 +0000 UTC m=+67.384465795" observedRunningTime="2026-04-23 16:36:34.191920891 +0000 UTC m=+67.923675319" watchObservedRunningTime="2026-04-23 16:36:34.193207871 +0000 UTC m=+67.924962285" Apr 23 16:36:34.193538 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:34.193510 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-779d45b66c-plqnq" podStartSLOduration=2.193501371 podStartE2EDuration="2.193501371s" podCreationTimestamp="2026-04-23 16:36:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:36:33.189308858 +0000 UTC m=+66.921063293" watchObservedRunningTime="2026-04-23 16:36:34.193501371 +0000 UTC m=+67.925255805" Apr 23 16:36:36.179399 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:36.179363 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pz92q" event={"ID":"92efbb3d-8bd0-413e-b306-331d80df0505","Type":"ContainerStarted","Data":"187df81b3e576033cec99d68dde962f23e3c6e415f8fee75a88f95c24bd5b359"} Apr 23 16:36:36.179715 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:36.179512 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-pz92q" Apr 23 16:36:36.200478 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:36.200440 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-pz92q" podStartSLOduration=66.275744727 podStartE2EDuration="1m9.200428603s" podCreationTimestamp="2026-04-23 16:35:27 +0000 UTC" firstStartedPulling="2026-04-23 16:36:33.040912311 +0000 UTC m=+66.772666742" lastFinishedPulling="2026-04-23 16:36:35.965596192 +0000 UTC m=+69.697350618" 
observedRunningTime="2026-04-23 16:36:36.198944595 +0000 UTC m=+69.930699041" watchObservedRunningTime="2026-04-23 16:36:36.200428603 +0000 UTC m=+69.932183068" Apr 23 16:36:40.260902 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:40.260867 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66cf6458f5-6d758" Apr 23 16:36:42.398954 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:42.398928 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:42.399354 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:42.399007 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:42.403928 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:42.403910 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:43.200115 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:43.200088 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:36:43.249673 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:43.249645 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5857d7c9bd-8qtc9"] Apr 23 16:36:58.188135 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.188070 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66cf6458f5-6d758" podUID="a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab" containerName="console" containerID="cri-o://ee8e6516e5693f7ec003e161322ad224d11619ae549584662486a9e0a8be138d" gracePeriod=15 Apr 23 16:36:58.420704 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.420681 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-66cf6458f5-6d758_a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab/console/0.log" Apr 23 16:36:58.420824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.420765 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66cf6458f5-6d758" Apr 23 16:36:58.555850 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.555821 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-trusted-ca-bundle\") pod \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " Apr 23 16:36:58.555850 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.555862 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzq9r\" (UniqueName: \"kubernetes.io/projected/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-kube-api-access-xzq9r\") pod \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " Apr 23 16:36:58.556096 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.555911 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-serving-cert\") pod \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " Apr 23 16:36:58.556096 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.556075 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-service-ca\") pod \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " Apr 23 16:36:58.556211 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.556125 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-oauth-config\") pod \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " Apr 23 16:36:58.556211 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.556188 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-config\") pod \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " Apr 23 16:36:58.556342 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.556269 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-oauth-serving-cert\") pod \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\" (UID: \"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab\") " Apr 23 16:36:58.556389 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.556275 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab" (UID: "a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:58.556480 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.556454 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-service-ca" (OuterVolumeSpecName: "service-ca") pod "a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab" (UID: "a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:36:58.556560 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.556511 2563 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-trusted-ca-bundle\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:36:58.556629 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.556607 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-config" (OuterVolumeSpecName: "console-config") pod "a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab" (UID: "a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:36:58.556838 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.556809 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab" (UID: "a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:36:58.558065 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.558045 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab" (UID: "a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:36:58.558159 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.558140 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-kube-api-access-xzq9r" (OuterVolumeSpecName: "kube-api-access-xzq9r") pod "a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab" (UID: "a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab"). InnerVolumeSpecName "kube-api-access-xzq9r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:36:58.558384 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.558363 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab" (UID: "a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:36:58.657587 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.657555 2563 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-serving-cert\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:36:58.657587 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.657583 2563 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-service-ca\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:36:58.657587 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.657594 2563 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-oauth-config\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:36:58.657802 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.657602 2563 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-console-config\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:36:58.657802 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.657612 2563 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-oauth-serving-cert\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:36:58.657802 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:58.657620 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xzq9r\" (UniqueName: \"kubernetes.io/projected/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab-kube-api-access-xzq9r\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:36:59.239965 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:59.239944 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66cf6458f5-6d758_a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab/console/0.log"
Apr 23 16:36:59.240347 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:59.239983 2563 generic.go:358] "Generic (PLEG): container finished" podID="a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab" containerID="ee8e6516e5693f7ec003e161322ad224d11619ae549584662486a9e0a8be138d" exitCode=2
Apr 23 16:36:59.240347 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:59.240017 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66cf6458f5-6d758" event={"ID":"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab","Type":"ContainerDied","Data":"ee8e6516e5693f7ec003e161322ad224d11619ae549584662486a9e0a8be138d"}
Apr 23 16:36:59.240347 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:59.240056 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66cf6458f5-6d758" event={"ID":"a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab","Type":"ContainerDied","Data":"39d3949d0bbe89237cc4f51ba4d14db85aa41b3aaa13d6e8dffe746ac8283a09"}
Apr 23 16:36:59.240347 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:59.240072 2563 scope.go:117] "RemoveContainer" containerID="ee8e6516e5693f7ec003e161322ad224d11619ae549584662486a9e0a8be138d"
Apr 23 16:36:59.240347 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:59.240076 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66cf6458f5-6d758"
Apr 23 16:36:59.247777 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:59.247759 2563 scope.go:117] "RemoveContainer" containerID="ee8e6516e5693f7ec003e161322ad224d11619ae549584662486a9e0a8be138d"
Apr 23 16:36:59.248048 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:36:59.248029 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee8e6516e5693f7ec003e161322ad224d11619ae549584662486a9e0a8be138d\": container with ID starting with ee8e6516e5693f7ec003e161322ad224d11619ae549584662486a9e0a8be138d not found: ID does not exist" containerID="ee8e6516e5693f7ec003e161322ad224d11619ae549584662486a9e0a8be138d"
Apr 23 16:36:59.248098 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:59.248056 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8e6516e5693f7ec003e161322ad224d11619ae549584662486a9e0a8be138d"} err="failed to get container status \"ee8e6516e5693f7ec003e161322ad224d11619ae549584662486a9e0a8be138d\": rpc error: code = NotFound desc = could not find container \"ee8e6516e5693f7ec003e161322ad224d11619ae549584662486a9e0a8be138d\": container with ID starting with ee8e6516e5693f7ec003e161322ad224d11619ae549584662486a9e0a8be138d not found: ID does not exist"
Apr 23 16:36:59.262797 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:59.262772 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66cf6458f5-6d758"]
Apr 23 16:36:59.269882 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:36:59.269864 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66cf6458f5-6d758"]
Apr 23 16:37:00.879329 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:00.879296 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab" path="/var/lib/kubelet/pods/a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab/volumes"
Apr 23 16:37:01.509247 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:01.509208 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_10b9ed9e-751b-41ff-8d50-5ab4091afc4b/init-config-reloader/0.log"
Apr 23 16:37:01.708723 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:01.708665 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_10b9ed9e-751b-41ff-8d50-5ab4091afc4b/alertmanager/0.log"
Apr 23 16:37:01.908026 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:01.907954 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_10b9ed9e-751b-41ff-8d50-5ab4091afc4b/config-reloader/0.log"
Apr 23 16:37:02.108144 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:02.108088 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_10b9ed9e-751b-41ff-8d50-5ab4091afc4b/kube-rbac-proxy-web/0.log"
Apr 23 16:37:02.307809 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:02.307766 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_10b9ed9e-751b-41ff-8d50-5ab4091afc4b/kube-rbac-proxy/0.log"
Apr 23 16:37:02.508250 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:02.508184 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_10b9ed9e-751b-41ff-8d50-5ab4091afc4b/kube-rbac-proxy-metric/0.log"
Apr 23 16:37:02.707895 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:02.707825 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_10b9ed9e-751b-41ff-8d50-5ab4091afc4b/prom-label-proxy/0.log"
Apr 23 16:37:03.108975 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:03.108948 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-m2ztp_568bdcdb-d09b-4f63-8775-e55efec84c8e/kube-state-metrics/0.log"
Apr 23 16:37:03.308084 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:03.308057 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-m2ztp_568bdcdb-d09b-4f63-8775-e55efec84c8e/kube-rbac-proxy-main/0.log"
Apr 23 16:37:03.508121 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:03.508095 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-m2ztp_568bdcdb-d09b-4f63-8775-e55efec84c8e/kube-rbac-proxy-self/0.log"
Apr 23 16:37:05.307987 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:05.307959 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtq25_ffcd3636-7eaf-487e-b56b-788842ea51bb/init-textfile/0.log"
Apr 23 16:37:05.508176 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:05.508153 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtq25_ffcd3636-7eaf-487e-b56b-788842ea51bb/node-exporter/0.log"
Apr 23 16:37:05.708059 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:05.707963 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtq25_ffcd3636-7eaf-487e-b56b-788842ea51bb/kube-rbac-proxy/0.log"
Apr 23 16:37:05.908477 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:05.908446 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-v8g6g_607bdd08-a8f9-4d7e-8f45-913d893a9763/kube-rbac-proxy-main/0.log"
Apr 23 16:37:06.107131 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:06.107081 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-v8g6g_607bdd08-a8f9-4d7e-8f45-913d893a9763/kube-rbac-proxy-self/0.log"
Apr 23 16:37:06.310185 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:06.310146 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-v8g6g_607bdd08-a8f9-4d7e-8f45-913d893a9763/openshift-state-metrics/0.log"
Apr 23 16:37:07.184571 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:07.184544 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-pz92q"
Apr 23 16:37:08.268535 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.268478 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5857d7c9bd-8qtc9" podUID="2d21efa6-4336-43fd-a3c1-3c469e5983e1" containerName="console" containerID="cri-o://edea0ea5d43ddecf555ad60e320665db1e0644233842b89faf686b0263de16fc" gracePeriod=15
Apr 23 16:37:08.499996 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.499976 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5857d7c9bd-8qtc9_2d21efa6-4336-43fd-a3c1-3c469e5983e1/console/0.log"
Apr 23 16:37:08.500098 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.500030 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5857d7c9bd-8qtc9"
Apr 23 16:37:08.636689 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.636615 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-config\") pod \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") "
Apr 23 16:37:08.636689 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.636675 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27sxz\" (UniqueName: \"kubernetes.io/projected/2d21efa6-4336-43fd-a3c1-3c469e5983e1-kube-api-access-27sxz\") pod \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") "
Apr 23 16:37:08.636885 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.636706 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-serving-cert\") pod \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") "
Apr 23 16:37:08.636885 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.636729 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-oauth-config\") pod \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") "
Apr 23 16:37:08.636885 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.636759 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-service-ca\") pod \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") "
Apr 23 16:37:08.636885 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.636813 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-oauth-serving-cert\") pod \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\" (UID: \"2d21efa6-4336-43fd-a3c1-3c469e5983e1\") "
Apr 23 16:37:08.637087 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.637064 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-config" (OuterVolumeSpecName: "console-config") pod "2d21efa6-4336-43fd-a3c1-3c469e5983e1" (UID: "2d21efa6-4336-43fd-a3c1-3c469e5983e1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:37:08.637221 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.637195 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-service-ca" (OuterVolumeSpecName: "service-ca") pod "2d21efa6-4336-43fd-a3c1-3c469e5983e1" (UID: "2d21efa6-4336-43fd-a3c1-3c469e5983e1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:37:08.637319 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.637256 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2d21efa6-4336-43fd-a3c1-3c469e5983e1" (UID: "2d21efa6-4336-43fd-a3c1-3c469e5983e1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:37:08.638964 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.638936 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d21efa6-4336-43fd-a3c1-3c469e5983e1-kube-api-access-27sxz" (OuterVolumeSpecName: "kube-api-access-27sxz") pod "2d21efa6-4336-43fd-a3c1-3c469e5983e1" (UID: "2d21efa6-4336-43fd-a3c1-3c469e5983e1"). InnerVolumeSpecName "kube-api-access-27sxz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:37:08.639058 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.638973 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2d21efa6-4336-43fd-a3c1-3c469e5983e1" (UID: "2d21efa6-4336-43fd-a3c1-3c469e5983e1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:37:08.639058 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.639005 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2d21efa6-4336-43fd-a3c1-3c469e5983e1" (UID: "2d21efa6-4336-43fd-a3c1-3c469e5983e1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:37:08.737515 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.737474 2563 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-oauth-serving-cert\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:37:08.737515 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.737511 2563 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-config\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:37:08.737515 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.737522 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-27sxz\" (UniqueName: \"kubernetes.io/projected/2d21efa6-4336-43fd-a3c1-3c469e5983e1-kube-api-access-27sxz\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:37:08.737701 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.737531 2563 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-serving-cert\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:37:08.737701 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.737541 2563 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d21efa6-4336-43fd-a3c1-3c469e5983e1-console-oauth-config\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:37:08.737701 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:08.737550 2563 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d21efa6-4336-43fd-a3c1-3c469e5983e1-service-ca\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:37:09.268864 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:09.268837 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5857d7c9bd-8qtc9_2d21efa6-4336-43fd-a3c1-3c469e5983e1/console/0.log"
Apr 23 16:37:09.269265 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:09.268874 2563 generic.go:358] "Generic (PLEG): container finished" podID="2d21efa6-4336-43fd-a3c1-3c469e5983e1" containerID="edea0ea5d43ddecf555ad60e320665db1e0644233842b89faf686b0263de16fc" exitCode=2
Apr 23 16:37:09.269265 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:09.268920 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5857d7c9bd-8qtc9" event={"ID":"2d21efa6-4336-43fd-a3c1-3c469e5983e1","Type":"ContainerDied","Data":"edea0ea5d43ddecf555ad60e320665db1e0644233842b89faf686b0263de16fc"}
Apr 23 16:37:09.269265 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:09.268941 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5857d7c9bd-8qtc9" event={"ID":"2d21efa6-4336-43fd-a3c1-3c469e5983e1","Type":"ContainerDied","Data":"32b3dfd9b140707b72f4cab1955d238c3d2025f2fb5ad168d807b082929b4c5d"}
Apr 23 16:37:09.269265 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:09.268942 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5857d7c9bd-8qtc9"
Apr 23 16:37:09.269265 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:09.268955 2563 scope.go:117] "RemoveContainer" containerID="edea0ea5d43ddecf555ad60e320665db1e0644233842b89faf686b0263de16fc"
Apr 23 16:37:09.276381 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:09.276364 2563 scope.go:117] "RemoveContainer" containerID="edea0ea5d43ddecf555ad60e320665db1e0644233842b89faf686b0263de16fc"
Apr 23 16:37:09.276627 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:37:09.276605 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edea0ea5d43ddecf555ad60e320665db1e0644233842b89faf686b0263de16fc\": container with ID starting with edea0ea5d43ddecf555ad60e320665db1e0644233842b89faf686b0263de16fc not found: ID does not exist" containerID="edea0ea5d43ddecf555ad60e320665db1e0644233842b89faf686b0263de16fc"
Apr 23 16:37:09.276680 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:09.276635 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edea0ea5d43ddecf555ad60e320665db1e0644233842b89faf686b0263de16fc"} err="failed to get container status \"edea0ea5d43ddecf555ad60e320665db1e0644233842b89faf686b0263de16fc\": rpc error: code = NotFound desc = could not find container \"edea0ea5d43ddecf555ad60e320665db1e0644233842b89faf686b0263de16fc\": container with ID starting with edea0ea5d43ddecf555ad60e320665db1e0644233842b89faf686b0263de16fc not found: ID does not exist"
Apr 23 16:37:09.284412 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:09.284390 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5857d7c9bd-8qtc9"]
Apr 23 16:37:09.288330 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:09.288312 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5857d7c9bd-8qtc9"]
Apr 23 16:37:10.879282 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:10.879244 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d21efa6-4336-43fd-a3c1-3c469e5983e1" path="/var/lib/kubelet/pods/2d21efa6-4336-43fd-a3c1-3c469e5983e1/volumes"
Apr 23 16:37:11.108652 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:11.108626 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-779d45b66c-plqnq_3bad8bf0-d9c8-49db-9204-4692a18b0e26/console/0.log"
Apr 23 16:37:11.908198 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:11.908168 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ckfp9_56ca6981-847e-4fce-bb14-e0fa6f8fb697/serve-healthcheck-canary/0.log"
Apr 23 16:37:37.142838 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.142800 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c585f9dfb-9njkf"]
Apr 23 16:37:37.143294 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.143048 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d21efa6-4336-43fd-a3c1-3c469e5983e1" containerName="console"
Apr 23 16:37:37.143294 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.143058 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d21efa6-4336-43fd-a3c1-3c469e5983e1" containerName="console"
Apr 23 16:37:37.143294 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.143066 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab" containerName="console"
Apr 23 16:37:37.143294 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.143073 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab" containerName="console"
Apr 23 16:37:37.143294 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.143118 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d21efa6-4336-43fd-a3c1-3c469e5983e1" containerName="console"
Apr 23 16:37:37.143294 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.143128 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="a5e1e9ca-a3d0-45ca-bc07-5e504e9b56ab" containerName="console"
Apr 23 16:37:37.145882 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.145862 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.164363 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.164340 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c585f9dfb-9njkf"]
Apr 23 16:37:37.234691 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.234657 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-oauth-config\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.234691 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.234689 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfwmd\" (UniqueName: \"kubernetes.io/projected/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-kube-api-access-jfwmd\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.234909 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.234717 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-serving-cert\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.234909 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.234802 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-config\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.234909 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.234830 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-oauth-serving-cert\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.234909 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.234859 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-trusted-ca-bundle\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.234909 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.234899 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-service-ca\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.335783 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.335748 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-trusted-ca-bundle\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.335959 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.335798 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-service-ca\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.335959 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.335857 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-oauth-config\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.335959 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.335882 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfwmd\" (UniqueName: \"kubernetes.io/projected/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-kube-api-access-jfwmd\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.335959 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.335913 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-serving-cert\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.336150 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.335973 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-config\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.336150 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.336003 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-oauth-serving-cert\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.336661 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.336630 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-service-ca\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.336800 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.336666 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-config\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.336800 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.336694 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-trusted-ca-bundle\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.336800 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.336733 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-oauth-serving-cert\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.338389 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.338362 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-serving-cert\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.338389 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.338377 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-oauth-config\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.347641 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.347619 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfwmd\" (UniqueName: \"kubernetes.io/projected/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-kube-api-access-jfwmd\") pod \"console-6c585f9dfb-9njkf\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") " pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.454556 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.454455 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:37.576284 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:37.576170 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c585f9dfb-9njkf"]
Apr 23 16:37:37.578428 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:37:37.578402 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd07bb1a7_e0b3_47c3_abc3_1f64da558ca7.slice/crio-e6a035e672e38c4497de9f16917fa9ec633a4511c64e993cd60f2164c3e7d958 WatchSource:0}: Error finding container e6a035e672e38c4497de9f16917fa9ec633a4511c64e993cd60f2164c3e7d958: Status 404 returned error can't find the container with id e6a035e672e38c4497de9f16917fa9ec633a4511c64e993cd60f2164c3e7d958
Apr 23 16:37:38.350248 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:38.350191 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c585f9dfb-9njkf" event={"ID":"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7","Type":"ContainerStarted","Data":"cf25362e71f1358e899960a452271d6213856e08ce1d579f3bf4bbeedd7c0d16"}
Apr 23 16:37:38.350248 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:38.350242 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c585f9dfb-9njkf" event={"ID":"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7","Type":"ContainerStarted","Data":"e6a035e672e38c4497de9f16917fa9ec633a4511c64e993cd60f2164c3e7d958"}
Apr 23 16:37:38.370502 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:38.370418 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c585f9dfb-9njkf" podStartSLOduration=1.370399878 podStartE2EDuration="1.370399878s" podCreationTimestamp="2026-04-23 16:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:37:38.3697984 +0000 UTC m=+132.101552834" watchObservedRunningTime="2026-04-23 16:37:38.370399878 +0000 UTC m=+132.102154314"
Apr 23 16:37:38.771323 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:38.771288 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:37:38.771824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:38.771770 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="alertmanager" containerID="cri-o://8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551" gracePeriod=120
Apr 23 16:37:38.771966 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:38.771864 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="kube-rbac-proxy-web" containerID="cri-o://11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06" gracePeriod=120
Apr 23 16:37:38.771966 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:38.771894 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="kube-rbac-proxy" containerID="cri-o://3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74" gracePeriod=120
Apr 23 16:37:38.771966 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:38.771896 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="prom-label-proxy" containerID="cri-o://1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7" gracePeriod=120
Apr 23 16:37:38.771966 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:38.771908 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="config-reloader" containerID="cri-o://bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840" gracePeriod=120
Apr 23 16:37:38.771966 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:38.771840 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="kube-rbac-proxy-metric" containerID="cri-o://74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc" gracePeriod=120
Apr 23 16:37:39.355090 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:39.355060 2563 generic.go:358] "Generic (PLEG): container finished" podID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerID="1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7" exitCode=0
Apr 23 16:37:39.355090 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:39.355083 2563 generic.go:358] "Generic (PLEG): container finished" podID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerID="3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74" exitCode=0
Apr 23 16:37:39.355090 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:39.355090 2563 generic.go:358] "Generic (PLEG): container finished" podID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerID="bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840" exitCode=0
Apr 23 16:37:39.355090 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:39.355096 2563 generic.go:358] "Generic (PLEG): container finished" podID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerID="8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551" exitCode=0
Apr 23 16:37:39.355536 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:39.355133 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0"
event={"ID":"10b9ed9e-751b-41ff-8d50-5ab4091afc4b","Type":"ContainerDied","Data":"1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7"} Apr 23 16:37:39.355536 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:39.355165 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"10b9ed9e-751b-41ff-8d50-5ab4091afc4b","Type":"ContainerDied","Data":"3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74"} Apr 23 16:37:39.355536 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:39.355177 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"10b9ed9e-751b-41ff-8d50-5ab4091afc4b","Type":"ContainerDied","Data":"bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840"} Apr 23 16:37:39.355536 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:39.355186 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"10b9ed9e-751b-41ff-8d50-5ab4091afc4b","Type":"ContainerDied","Data":"8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551"} Apr 23 16:37:40.001180 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.001157 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:40.055572 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.055549 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-config-volume\") pod \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " Apr 23 16:37:40.055705 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.055601 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " Apr 23 16:37:40.055705 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.055629 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-tls-assets\") pod \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " Apr 23 16:37:40.055705 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.055660 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-config-out\") pod \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " Apr 23 16:37:40.055705 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.055687 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-alertmanager-main-db\") pod \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " Apr 23 
16:37:40.055917 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.055716 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-web-config\") pod \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " Apr 23 16:37:40.055917 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.055753 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy-web\") pod \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " Apr 23 16:37:40.055917 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.055779 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-metrics-client-ca\") pod \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " Apr 23 16:37:40.055917 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.055819 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-main-tls\") pod \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " Apr 23 16:37:40.055917 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.055846 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy\") pod \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " Apr 23 16:37:40.055917 
ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.055874 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-alertmanager-trusted-ca-bundle\") pod \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " Apr 23 16:37:40.055917 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.055908 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-cluster-tls-config\") pod \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " Apr 23 16:37:40.056293 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.055971 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnzsd\" (UniqueName: \"kubernetes.io/projected/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-kube-api-access-xnzsd\") pod \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\" (UID: \"10b9ed9e-751b-41ff-8d50-5ab4091afc4b\") " Apr 23 16:37:40.056350 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.056288 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "10b9ed9e-751b-41ff-8d50-5ab4091afc4b" (UID: "10b9ed9e-751b-41ff-8d50-5ab4091afc4b"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:37:40.057010 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.056776 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "10b9ed9e-751b-41ff-8d50-5ab4091afc4b" (UID: "10b9ed9e-751b-41ff-8d50-5ab4091afc4b"). 
InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:37:40.057010 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.056912 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "10b9ed9e-751b-41ff-8d50-5ab4091afc4b" (UID: "10b9ed9e-751b-41ff-8d50-5ab4091afc4b"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:37:40.058411 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.058380 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "10b9ed9e-751b-41ff-8d50-5ab4091afc4b" (UID: "10b9ed9e-751b-41ff-8d50-5ab4091afc4b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:37:40.058799 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.058754 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "10b9ed9e-751b-41ff-8d50-5ab4091afc4b" (UID: "10b9ed9e-751b-41ff-8d50-5ab4091afc4b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:37:40.058973 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.058934 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "10b9ed9e-751b-41ff-8d50-5ab4091afc4b" (UID: "10b9ed9e-751b-41ff-8d50-5ab4091afc4b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:37:40.059581 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.059540 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-kube-api-access-xnzsd" (OuterVolumeSpecName: "kube-api-access-xnzsd") pod "10b9ed9e-751b-41ff-8d50-5ab4091afc4b" (UID: "10b9ed9e-751b-41ff-8d50-5ab4091afc4b"). InnerVolumeSpecName "kube-api-access-xnzsd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:37:40.060039 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.059994 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-config-volume" (OuterVolumeSpecName: "config-volume") pod "10b9ed9e-751b-41ff-8d50-5ab4091afc4b" (UID: "10b9ed9e-751b-41ff-8d50-5ab4091afc4b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:37:40.060133 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.060085 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "10b9ed9e-751b-41ff-8d50-5ab4091afc4b" (UID: "10b9ed9e-751b-41ff-8d50-5ab4091afc4b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:37:40.060428 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.060398 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "10b9ed9e-751b-41ff-8d50-5ab4091afc4b" (UID: "10b9ed9e-751b-41ff-8d50-5ab4091afc4b"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:37:40.060428 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.060415 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-config-out" (OuterVolumeSpecName: "config-out") pod "10b9ed9e-751b-41ff-8d50-5ab4091afc4b" (UID: "10b9ed9e-751b-41ff-8d50-5ab4091afc4b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:37:40.064441 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.064418 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "10b9ed9e-751b-41ff-8d50-5ab4091afc4b" (UID: "10b9ed9e-751b-41ff-8d50-5ab4091afc4b"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:37:40.069482 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.069461 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-web-config" (OuterVolumeSpecName: "web-config") pod "10b9ed9e-751b-41ff-8d50-5ab4091afc4b" (UID: "10b9ed9e-751b-41ff-8d50-5ab4091afc4b"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:37:40.157142 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.157092 2563 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-config-volume\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:37:40.157142 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.157111 2563 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:37:40.157142 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.157121 2563 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-tls-assets\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:37:40.157142 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.157129 2563 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-config-out\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:37:40.157142 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.157139 2563 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-alertmanager-main-db\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:37:40.157357 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.157147 2563 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-web-config\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:37:40.157357 ip-10-0-134-187 
kubenswrapper[2563]: I0423 16:37:40.157156 2563 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:37:40.157357 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.157165 2563 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-metrics-client-ca\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:37:40.157357 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.157175 2563 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-main-tls\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:37:40.157357 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.157183 2563 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:37:40.157357 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.157193 2563 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:37:40.157357 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.157202 2563 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-cluster-tls-config\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 
16:37:40.157357 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.157210 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnzsd\" (UniqueName: \"kubernetes.io/projected/10b9ed9e-751b-41ff-8d50-5ab4091afc4b-kube-api-access-xnzsd\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:37:40.360657 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.360628 2563 generic.go:358] "Generic (PLEG): container finished" podID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerID="74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc" exitCode=0 Apr 23 16:37:40.360657 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.360652 2563 generic.go:358] "Generic (PLEG): container finished" podID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerID="11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06" exitCode=0 Apr 23 16:37:40.361003 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.360715 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"10b9ed9e-751b-41ff-8d50-5ab4091afc4b","Type":"ContainerDied","Data":"74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc"} Apr 23 16:37:40.361003 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.360730 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:40.361003 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.360753 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"10b9ed9e-751b-41ff-8d50-5ab4091afc4b","Type":"ContainerDied","Data":"11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06"} Apr 23 16:37:40.361003 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.360763 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"10b9ed9e-751b-41ff-8d50-5ab4091afc4b","Type":"ContainerDied","Data":"80cdf69b6cb74b49b26a8eba947bc518b94c30e9d1da4c1adc35591476cf7407"} Apr 23 16:37:40.361003 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.360779 2563 scope.go:117] "RemoveContainer" containerID="1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7" Apr 23 16:37:40.368434 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.368415 2563 scope.go:117] "RemoveContainer" containerID="74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc" Apr 23 16:37:40.375246 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.375217 2563 scope.go:117] "RemoveContainer" containerID="3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74" Apr 23 16:37:40.381365 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.381348 2563 scope.go:117] "RemoveContainer" containerID="11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06" Apr 23 16:37:40.387394 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.387348 2563 scope.go:117] "RemoveContainer" containerID="bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840" Apr 23 16:37:40.388673 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.388650 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:37:40.393422 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.393404 2563 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:37:40.395057 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.395038 2563 scope.go:117] "RemoveContainer" containerID="8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551" Apr 23 16:37:40.401966 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.401864 2563 scope.go:117] "RemoveContainer" containerID="ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238" Apr 23 16:37:40.408453 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.408439 2563 scope.go:117] "RemoveContainer" containerID="1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7" Apr 23 16:37:40.408699 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:37:40.408680 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7\": container with ID starting with 1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7 not found: ID does not exist" containerID="1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7" Apr 23 16:37:40.408744 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.408707 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7"} err="failed to get container status \"1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7\": rpc error: code = NotFound desc = could not find container \"1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7\": container with ID starting with 1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7 not found: ID does not exist" Apr 23 16:37:40.408744 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.408725 2563 scope.go:117] "RemoveContainer" containerID="74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc" Apr 23 
16:37:40.408941 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:37:40.408924 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc\": container with ID starting with 74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc not found: ID does not exist" containerID="74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc" Apr 23 16:37:40.409002 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.408950 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc"} err="failed to get container status \"74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc\": rpc error: code = NotFound desc = could not find container \"74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc\": container with ID starting with 74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc not found: ID does not exist" Apr 23 16:37:40.409002 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.408974 2563 scope.go:117] "RemoveContainer" containerID="3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74" Apr 23 16:37:40.409209 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:37:40.409193 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74\": container with ID starting with 3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74 not found: ID does not exist" containerID="3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74" Apr 23 16:37:40.409260 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.409212 2563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74"} err="failed to get container status \"3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74\": rpc error: code = NotFound desc = could not find container \"3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74\": container with ID starting with 3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74 not found: ID does not exist" Apr 23 16:37:40.409260 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.409238 2563 scope.go:117] "RemoveContainer" containerID="11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06" Apr 23 16:37:40.409402 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:37:40.409389 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06\": container with ID starting with 11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06 not found: ID does not exist" containerID="11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06" Apr 23 16:37:40.409442 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.409406 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06"} err="failed to get container status \"11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06\": rpc error: code = NotFound desc = could not find container \"11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06\": container with ID starting with 11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06 not found: ID does not exist" Apr 23 16:37:40.409442 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.409417 2563 scope.go:117] "RemoveContainer" containerID="bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840" Apr 23 16:37:40.409564 ip-10-0-134-187 
kubenswrapper[2563]: E0423 16:37:40.409551 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840\": container with ID starting with bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840 not found: ID does not exist" containerID="bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840"
Apr 23 16:37:40.409605 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.409565 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840"} err="failed to get container status \"bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840\": rpc error: code = NotFound desc = could not find container \"bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840\": container with ID starting with bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840 not found: ID does not exist"
Apr 23 16:37:40.409605 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.409575 2563 scope.go:117] "RemoveContainer" containerID="8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551"
Apr 23 16:37:40.409734 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:37:40.409715 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551\": container with ID starting with 8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551 not found: ID does not exist" containerID="8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551"
Apr 23 16:37:40.409774 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.409738 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551"} err="failed to get container status \"8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551\": rpc error: code = NotFound desc = could not find container \"8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551\": container with ID starting with 8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551 not found: ID does not exist"
Apr 23 16:37:40.409774 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.409750 2563 scope.go:117] "RemoveContainer" containerID="ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238"
Apr 23 16:37:40.409922 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:37:40.409907 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238\": container with ID starting with ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238 not found: ID does not exist" containerID="ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238"
Apr 23 16:37:40.409961 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.409930 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238"} err="failed to get container status \"ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238\": rpc error: code = NotFound desc = could not find container \"ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238\": container with ID starting with ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238 not found: ID does not exist"
Apr 23 16:37:40.409961 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.409945 2563 scope.go:117] "RemoveContainer" containerID="1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7"
Apr 23 16:37:40.410116 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.410102 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7"} err="failed to get container status \"1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7\": rpc error: code = NotFound desc = could not find container \"1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7\": container with ID starting with 1e6e1c1e7d1936c72472291365a3cd821900fc9d151f0d4893c0344f655f57a7 not found: ID does not exist"
Apr 23 16:37:40.410155 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.410116 2563 scope.go:117] "RemoveContainer" containerID="74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc"
Apr 23 16:37:40.410353 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.410335 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc"} err="failed to get container status \"74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc\": rpc error: code = NotFound desc = could not find container \"74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc\": container with ID starting with 74fb2aa9385c0ea5e1e275f383b3da301a166e737519a9bef2cb10e65c2140fc not found: ID does not exist"
Apr 23 16:37:40.410432 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.410354 2563 scope.go:117] "RemoveContainer" containerID="3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74"
Apr 23 16:37:40.410614 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.410596 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74"} err="failed to get container status \"3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74\": rpc error: code = NotFound desc = could not find container \"3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74\": container with ID starting with 3ca4d6d6aa2d24e20802fdda87920e4f8b0931096664ae6bba94da61bffb4d74 not found: ID does not exist"
Apr 23 16:37:40.410662 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.410616 2563 scope.go:117] "RemoveContainer" containerID="11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06"
Apr 23 16:37:40.410819 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.410799 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06"} err="failed to get container status \"11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06\": rpc error: code = NotFound desc = could not find container \"11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06\": container with ID starting with 11b57b40962bda89550be775e68f5f99e731cf0c0f72382b038730d82b46ec06 not found: ID does not exist"
Apr 23 16:37:40.410864 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.410820 2563 scope.go:117] "RemoveContainer" containerID="bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840"
Apr 23 16:37:40.411013 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.410993 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840"} err="failed to get container status \"bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840\": rpc error: code = NotFound desc = could not find container \"bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840\": container with ID starting with bf3035b4ade2cdbbbee2e61c874a69e39f854ea129885a947e9e862694d64840 not found: ID does not exist"
Apr 23 16:37:40.411072 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.411014 2563 scope.go:117] "RemoveContainer" containerID="8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551"
Apr 23 16:37:40.411255 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.411220 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551"} err="failed to get container status \"8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551\": rpc error: code = NotFound desc = could not find container \"8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551\": container with ID starting with 8b3c785cedc6f209e0fe0a19f1f8081f470171f2c6391fca214d0c9393ca7551 not found: ID does not exist"
Apr 23 16:37:40.411326 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.411255 2563 scope.go:117] "RemoveContainer" containerID="ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238"
Apr 23 16:37:40.411463 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.411445 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238"} err="failed to get container status \"ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238\": rpc error: code = NotFound desc = could not find container \"ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238\": container with ID starting with ec4fe82072e8db8afd1b5e9ed46b971620689062b217300f9c51e8387dd40238 not found: ID does not exist"
Apr 23 16:37:40.425294 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425273 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:37:40.425513 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425501 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="prom-label-proxy"
Apr 23 16:37:40.425556 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425515 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="prom-label-proxy"
Apr 23 16:37:40.425556 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425522 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="alertmanager"
Apr 23 16:37:40.425556 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425528 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="alertmanager"
Apr 23 16:37:40.425556 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425537 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="init-config-reloader"
Apr 23 16:37:40.425556 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425546 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="init-config-reloader"
Apr 23 16:37:40.425708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425562 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="kube-rbac-proxy"
Apr 23 16:37:40.425708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425567 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="kube-rbac-proxy"
Apr 23 16:37:40.425708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425574 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="config-reloader"
Apr 23 16:37:40.425708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425580 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="config-reloader"
Apr 23 16:37:40.425708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425585 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="kube-rbac-proxy-metric"
Apr 23 16:37:40.425708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425590 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="kube-rbac-proxy-metric"
Apr 23 16:37:40.425708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425603 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="kube-rbac-proxy-web"
Apr 23 16:37:40.425708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425610 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="kube-rbac-proxy-web"
Apr 23 16:37:40.425708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425652 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="kube-rbac-proxy-web"
Apr 23 16:37:40.425708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425662 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="kube-rbac-proxy"
Apr 23 16:37:40.425708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425668 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="kube-rbac-proxy-metric"
Apr 23 16:37:40.425708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425676 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="alertmanager"
Apr 23 16:37:40.425708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425685 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="config-reloader"
Apr 23 16:37:40.425708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.425694 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" containerName="prom-label-proxy"
Apr 23 16:37:40.430708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.430692 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.433510 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.433489 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 23 16:37:40.433604 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.433497 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 23 16:37:40.433604 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.433528 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 23 16:37:40.433604 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.433539 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-x8t2t\""
Apr 23 16:37:40.433604 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.433549 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 23 16:37:40.433788 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.433643 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 23 16:37:40.434344 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.434328 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 23 16:37:40.434397 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.434385 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 23 16:37:40.434438 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.434407 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 23 16:37:40.442371 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.442349 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 23 16:37:40.447433 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.447411 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:37:40.459292 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.459269 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-config-volume\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.459379 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.459298 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.459379 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.459315 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-web-config\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.459379 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.459333 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60c8e366-8801-431c-8939-d761d6b95fcc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.459544 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.459442 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60c8e366-8801-431c-8939-d761d6b95fcc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.459544 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.459506 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/60c8e366-8801-431c-8939-d761d6b95fcc-config-out\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.459644 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.459539 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm827\" (UniqueName: \"kubernetes.io/projected/60c8e366-8801-431c-8939-d761d6b95fcc-kube-api-access-rm827\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.459644 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.459571 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/60c8e366-8801-431c-8939-d761d6b95fcc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.459644 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.459602 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.459644 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.459628 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.459828 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.459662 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/60c8e366-8801-431c-8939-d761d6b95fcc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.459828 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.459689 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.459828 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.459738 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.560681 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.560658 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-config-volume\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.560797 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.560688 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.560797 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.560708 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-web-config\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.560797 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.560726 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60c8e366-8801-431c-8939-d761d6b95fcc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.560797 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.560763 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60c8e366-8801-431c-8939-d761d6b95fcc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.560978 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.560813 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/60c8e366-8801-431c-8939-d761d6b95fcc-config-out\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.560978 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.560841 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm827\" (UniqueName: \"kubernetes.io/projected/60c8e366-8801-431c-8939-d761d6b95fcc-kube-api-access-rm827\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.560978 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.560865 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/60c8e366-8801-431c-8939-d761d6b95fcc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.560978 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.560892 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.560978 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.560914 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.561250 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.561211 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/60c8e366-8801-431c-8939-d761d6b95fcc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.561352 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.561273 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.561352 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.561317 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.562297 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.561967 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60c8e366-8801-431c-8939-d761d6b95fcc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.562297 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.562084 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60c8e366-8801-431c-8939-d761d6b95fcc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.563796 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.562632 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/60c8e366-8801-431c-8939-d761d6b95fcc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.563796 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.563784 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/60c8e366-8801-431c-8939-d761d6b95fcc-config-out\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.564141 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.564116 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.564565 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.564524 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.564684 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.564568 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-config-volume\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.564684 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.564552 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/60c8e366-8801-431c-8939-d761d6b95fcc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.564684 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.564642 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.564684 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.564659 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-web-config\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.565026 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.565008 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.565380 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.565363 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/60c8e366-8801-431c-8939-d761d6b95fcc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.570330 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.570308 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm827\" (UniqueName: \"kubernetes.io/projected/60c8e366-8801-431c-8939-d761d6b95fcc-kube-api-access-rm827\") pod \"alertmanager-main-0\" (UID: \"60c8e366-8801-431c-8939-d761d6b95fcc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.739405 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.739378 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:37:40.879583 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.879546 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10b9ed9e-751b-41ff-8d50-5ab4091afc4b" path="/var/lib/kubelet/pods/10b9ed9e-751b-41ff-8d50-5ab4091afc4b/volumes"
Apr 23 16:37:40.888516 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:40.888490 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:37:40.891296 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:37:40.891260 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60c8e366_8801_431c_8939_d761d6b95fcc.slice/crio-86e5682fafbaa3c8a3af19164ae886daee089c46c56a50d86a791d2c14557e8b WatchSource:0}: Error finding container 86e5682fafbaa3c8a3af19164ae886daee089c46c56a50d86a791d2c14557e8b: Status 404 returned error can't find the container with id 86e5682fafbaa3c8a3af19164ae886daee089c46c56a50d86a791d2c14557e8b
Apr 23 16:37:41.365098 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:41.365057 2563 generic.go:358] "Generic (PLEG): container finished" podID="60c8e366-8801-431c-8939-d761d6b95fcc" containerID="3a6c61f2471e186e253651c4c5802c950318492f3bfdeed61f9205d1bf1efe3c" exitCode=0
Apr 23 16:37:41.365474 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:41.365139 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"60c8e366-8801-431c-8939-d761d6b95fcc","Type":"ContainerDied","Data":"3a6c61f2471e186e253651c4c5802c950318492f3bfdeed61f9205d1bf1efe3c"}
Apr 23 16:37:41.365474 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:41.365175 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"60c8e366-8801-431c-8939-d761d6b95fcc","Type":"ContainerStarted","Data":"86e5682fafbaa3c8a3af19164ae886daee089c46c56a50d86a791d2c14557e8b"}
Apr 23 16:37:42.371925 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:42.371891 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"60c8e366-8801-431c-8939-d761d6b95fcc","Type":"ContainerStarted","Data":"048de65c138b02c8bee1aa06ebc3bd7b718f43cbd0449f88e12d33361aaceb6d"}
Apr 23 16:37:42.371925 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:42.371926 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"60c8e366-8801-431c-8939-d761d6b95fcc","Type":"ContainerStarted","Data":"0fb2098e2bd40698e5092627c6302c272e960ff819ce2fdb9698e52cdfdae77c"}
Apr 23 16:37:42.372402 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:42.371941 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"60c8e366-8801-431c-8939-d761d6b95fcc","Type":"ContainerStarted","Data":"a6385593d1a9c6b8bae392a9ef8099057aa765f317611e2defb268adb7c41234"}
Apr 23 16:37:42.372402 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:42.371954 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"60c8e366-8801-431c-8939-d761d6b95fcc","Type":"ContainerStarted","Data":"e31e8290aba7f5feb2ffe692beb6420ed270dcc448ca4950beba206b4da92242"}
Apr 23 16:37:42.372402 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:42.371964 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"60c8e366-8801-431c-8939-d761d6b95fcc","Type":"ContainerStarted","Data":"1ec171ac5e1274e8ff81216b56f04ec5bf9abf33b37a9c64635ca76b380ed030"}
Apr 23 16:37:42.372402 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:42.371974 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"60c8e366-8801-431c-8939-d761d6b95fcc","Type":"ContainerStarted","Data":"45f1e1d4238c0e59f53d67b134215fa87e12f06214aba6aa3866568e152123cc"}
Apr 23 16:37:42.416460 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:42.416411 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.416396455 podStartE2EDuration="2.416396455s" podCreationTimestamp="2026-04-23 16:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:37:42.413843591 +0000 UTC m=+136.145598036" watchObservedRunningTime="2026-04-23 16:37:42.416396455 +0000 UTC m=+136.148150946"
Apr 23 16:37:47.454855 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:47.454814 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:47.454855 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:47.454862 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:47.459675 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:47.459653 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:48.392706 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:48.392678 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:37:48.454249 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:37:48.454203 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-779d45b66c-plqnq"]
Apr 23 16:38:08.952067 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:08.951985 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-jkzhw"]
Apr 23 16:38:08.955293 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:08.955273 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jkzhw"
Apr 23 16:38:08.960346 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:08.960325 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 16:38:08.966300 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:08.966276 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jkzhw"]
Apr 23 16:38:09.076311 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:09.076277 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d0e44b1d-5f76-4a34-9664-f73402be5e2d-kubelet-config\") pod \"global-pull-secret-syncer-jkzhw\" (UID: \"d0e44b1d-5f76-4a34-9664-f73402be5e2d\") " pod="kube-system/global-pull-secret-syncer-jkzhw"
Apr 23 16:38:09.076311 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:09.076309 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d0e44b1d-5f76-4a34-9664-f73402be5e2d-dbus\") pod \"global-pull-secret-syncer-jkzhw\" (UID: \"d0e44b1d-5f76-4a34-9664-f73402be5e2d\") " pod="kube-system/global-pull-secret-syncer-jkzhw"
Apr 23 16:38:09.076515 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:09.076338 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0e44b1d-5f76-4a34-9664-f73402be5e2d-original-pull-secret\") pod \"global-pull-secret-syncer-jkzhw\" (UID: \"d0e44b1d-5f76-4a34-9664-f73402be5e2d\") " pod="kube-system/global-pull-secret-syncer-jkzhw"
Apr 23 16:38:09.177466 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:09.177431 2563 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d0e44b1d-5f76-4a34-9664-f73402be5e2d-kubelet-config\") pod \"global-pull-secret-syncer-jkzhw\" (UID: \"d0e44b1d-5f76-4a34-9664-f73402be5e2d\") " pod="kube-system/global-pull-secret-syncer-jkzhw" Apr 23 16:38:09.177466 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:09.177465 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d0e44b1d-5f76-4a34-9664-f73402be5e2d-dbus\") pod \"global-pull-secret-syncer-jkzhw\" (UID: \"d0e44b1d-5f76-4a34-9664-f73402be5e2d\") " pod="kube-system/global-pull-secret-syncer-jkzhw" Apr 23 16:38:09.177650 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:09.177492 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0e44b1d-5f76-4a34-9664-f73402be5e2d-original-pull-secret\") pod \"global-pull-secret-syncer-jkzhw\" (UID: \"d0e44b1d-5f76-4a34-9664-f73402be5e2d\") " pod="kube-system/global-pull-secret-syncer-jkzhw" Apr 23 16:38:09.177650 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:09.177558 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d0e44b1d-5f76-4a34-9664-f73402be5e2d-kubelet-config\") pod \"global-pull-secret-syncer-jkzhw\" (UID: \"d0e44b1d-5f76-4a34-9664-f73402be5e2d\") " pod="kube-system/global-pull-secret-syncer-jkzhw" Apr 23 16:38:09.177650 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:09.177628 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d0e44b1d-5f76-4a34-9664-f73402be5e2d-dbus\") pod \"global-pull-secret-syncer-jkzhw\" (UID: \"d0e44b1d-5f76-4a34-9664-f73402be5e2d\") " pod="kube-system/global-pull-secret-syncer-jkzhw" Apr 23 16:38:09.179707 ip-10-0-134-187 kubenswrapper[2563]: I0423 
16:38:09.179682 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0e44b1d-5f76-4a34-9664-f73402be5e2d-original-pull-secret\") pod \"global-pull-secret-syncer-jkzhw\" (UID: \"d0e44b1d-5f76-4a34-9664-f73402be5e2d\") " pod="kube-system/global-pull-secret-syncer-jkzhw" Apr 23 16:38:09.263806 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:09.263781 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jkzhw" Apr 23 16:38:09.377976 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:09.377946 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jkzhw"] Apr 23 16:38:09.380443 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:38:09.380419 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0e44b1d_5f76_4a34_9664_f73402be5e2d.slice/crio-e27823d1641c0b99d3da1d5b08a819a345ce5dcecaee1881a4f6b1dd5456cd68 WatchSource:0}: Error finding container e27823d1641c0b99d3da1d5b08a819a345ce5dcecaee1881a4f6b1dd5456cd68: Status 404 returned error can't find the container with id e27823d1641c0b99d3da1d5b08a819a345ce5dcecaee1881a4f6b1dd5456cd68 Apr 23 16:38:09.451026 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:09.450995 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jkzhw" event={"ID":"d0e44b1d-5f76-4a34-9664-f73402be5e2d","Type":"ContainerStarted","Data":"e27823d1641c0b99d3da1d5b08a819a345ce5dcecaee1881a4f6b1dd5456cd68"} Apr 23 16:38:13.472801 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.472732 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-779d45b66c-plqnq" podUID="3bad8bf0-d9c8-49db-9204-4692a18b0e26" containerName="console" 
containerID="cri-o://7a6af2c2a570159fe96e80e8b89c0b14fb340aff6194ea4f549dcd5767f752f7" gracePeriod=15 Apr 23 16:38:13.757128 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.757105 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-779d45b66c-plqnq_3bad8bf0-d9c8-49db-9204-4692a18b0e26/console/0.log" Apr 23 16:38:13.757262 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.757168 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:38:13.814513 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.814480 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crslg\" (UniqueName: \"kubernetes.io/projected/3bad8bf0-d9c8-49db-9204-4692a18b0e26-kube-api-access-crslg\") pod \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " Apr 23 16:38:13.814674 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.814528 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-config\") pod \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " Apr 23 16:38:13.814674 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.814573 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-oauth-config\") pod \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " Apr 23 16:38:13.814674 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.814599 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-serving-cert\") pod 
\"3bad8bf0-d9c8-49db-9204-4692a18b0e26\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " Apr 23 16:38:13.814674 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.814643 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-service-ca\") pod \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " Apr 23 16:38:13.814674 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.814666 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-oauth-serving-cert\") pod \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " Apr 23 16:38:13.814935 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.814709 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-trusted-ca-bundle\") pod \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\" (UID: \"3bad8bf0-d9c8-49db-9204-4692a18b0e26\") " Apr 23 16:38:13.814989 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.814967 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-config" (OuterVolumeSpecName: "console-config") pod "3bad8bf0-d9c8-49db-9204-4692a18b0e26" (UID: "3bad8bf0-d9c8-49db-9204-4692a18b0e26"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:13.815259 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.815212 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3bad8bf0-d9c8-49db-9204-4692a18b0e26" (UID: "3bad8bf0-d9c8-49db-9204-4692a18b0e26"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:13.815259 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.815244 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-service-ca" (OuterVolumeSpecName: "service-ca") pod "3bad8bf0-d9c8-49db-9204-4692a18b0e26" (UID: "3bad8bf0-d9c8-49db-9204-4692a18b0e26"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:13.815414 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.815302 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3bad8bf0-d9c8-49db-9204-4692a18b0e26" (UID: "3bad8bf0-d9c8-49db-9204-4692a18b0e26"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:13.816713 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.816679 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3bad8bf0-d9c8-49db-9204-4692a18b0e26" (UID: "3bad8bf0-d9c8-49db-9204-4692a18b0e26"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:38:13.816800 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.816741 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3bad8bf0-d9c8-49db-9204-4692a18b0e26" (UID: "3bad8bf0-d9c8-49db-9204-4692a18b0e26"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:38:13.817245 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.817199 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bad8bf0-d9c8-49db-9204-4692a18b0e26-kube-api-access-crslg" (OuterVolumeSpecName: "kube-api-access-crslg") pod "3bad8bf0-d9c8-49db-9204-4692a18b0e26" (UID: "3bad8bf0-d9c8-49db-9204-4692a18b0e26"). InnerVolumeSpecName "kube-api-access-crslg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:38:13.915742 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.915704 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-crslg\" (UniqueName: \"kubernetes.io/projected/3bad8bf0-d9c8-49db-9204-4692a18b0e26-kube-api-access-crslg\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:38:13.915742 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.915743 2563 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-config\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:38:13.915934 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.915761 2563 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-oauth-config\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:38:13.915934 
ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.915776 2563 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3bad8bf0-d9c8-49db-9204-4692a18b0e26-console-serving-cert\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:38:13.915934 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.915790 2563 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-service-ca\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:38:13.915934 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.915805 2563 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-oauth-serving-cert\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:38:13.915934 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:13.915817 2563 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bad8bf0-d9c8-49db-9204-4692a18b0e26-trusted-ca-bundle\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\"" Apr 23 16:38:14.468504 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:14.468479 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-779d45b66c-plqnq_3bad8bf0-d9c8-49db-9204-4692a18b0e26/console/0.log" Apr 23 16:38:14.468709 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:14.468519 2563 generic.go:358] "Generic (PLEG): container finished" podID="3bad8bf0-d9c8-49db-9204-4692a18b0e26" containerID="7a6af2c2a570159fe96e80e8b89c0b14fb340aff6194ea4f549dcd5767f752f7" exitCode=2 Apr 23 16:38:14.468709 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:14.468553 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-779d45b66c-plqnq" 
event={"ID":"3bad8bf0-d9c8-49db-9204-4692a18b0e26","Type":"ContainerDied","Data":"7a6af2c2a570159fe96e80e8b89c0b14fb340aff6194ea4f549dcd5767f752f7"} Apr 23 16:38:14.468709 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:14.468593 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-779d45b66c-plqnq" event={"ID":"3bad8bf0-d9c8-49db-9204-4692a18b0e26","Type":"ContainerDied","Data":"e2708b0bfcd20d1b86f8ba3365ac6f4425ebb72404ab4cf0a970abeef7512e1b"} Apr 23 16:38:14.468709 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:14.468593 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-779d45b66c-plqnq" Apr 23 16:38:14.468709 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:14.468629 2563 scope.go:117] "RemoveContainer" containerID="7a6af2c2a570159fe96e80e8b89c0b14fb340aff6194ea4f549dcd5767f752f7" Apr 23 16:38:14.469911 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:14.469885 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jkzhw" event={"ID":"d0e44b1d-5f76-4a34-9664-f73402be5e2d","Type":"ContainerStarted","Data":"d9b48083acffbdea7a47d092f150936f3c702bcb321a9bcc51eb261d496943aa"} Apr 23 16:38:14.476492 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:14.476271 2563 scope.go:117] "RemoveContainer" containerID="7a6af2c2a570159fe96e80e8b89c0b14fb340aff6194ea4f549dcd5767f752f7" Apr 23 16:38:14.476722 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:38:14.476557 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a6af2c2a570159fe96e80e8b89c0b14fb340aff6194ea4f549dcd5767f752f7\": container with ID starting with 7a6af2c2a570159fe96e80e8b89c0b14fb340aff6194ea4f549dcd5767f752f7 not found: ID does not exist" containerID="7a6af2c2a570159fe96e80e8b89c0b14fb340aff6194ea4f549dcd5767f752f7" Apr 23 16:38:14.476722 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:14.476582 
2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a6af2c2a570159fe96e80e8b89c0b14fb340aff6194ea4f549dcd5767f752f7"} err="failed to get container status \"7a6af2c2a570159fe96e80e8b89c0b14fb340aff6194ea4f549dcd5767f752f7\": rpc error: code = NotFound desc = could not find container \"7a6af2c2a570159fe96e80e8b89c0b14fb340aff6194ea4f549dcd5767f752f7\": container with ID starting with 7a6af2c2a570159fe96e80e8b89c0b14fb340aff6194ea4f549dcd5767f752f7 not found: ID does not exist" Apr 23 16:38:14.506168 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:14.506125 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-jkzhw" podStartSLOduration=2.222392586 podStartE2EDuration="6.506114291s" podCreationTimestamp="2026-04-23 16:38:08 +0000 UTC" firstStartedPulling="2026-04-23 16:38:09.382472117 +0000 UTC m=+163.114226528" lastFinishedPulling="2026-04-23 16:38:13.666193811 +0000 UTC m=+167.397948233" observedRunningTime="2026-04-23 16:38:14.491721484 +0000 UTC m=+168.223475918" watchObservedRunningTime="2026-04-23 16:38:14.506114291 +0000 UTC m=+168.237868724" Apr 23 16:38:14.506307 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:14.506293 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-779d45b66c-plqnq"] Apr 23 16:38:14.510624 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:14.510604 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-779d45b66c-plqnq"] Apr 23 16:38:14.879139 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:38:14.879103 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bad8bf0-d9c8-49db-9204-4692a18b0e26" path="/var/lib/kubelet/pods/3bad8bf0-d9c8-49db-9204-4692a18b0e26/volumes" Apr 23 16:39:24.297553 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.297520 2563 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd"] Apr 23 16:39:24.297983 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.297786 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bad8bf0-d9c8-49db-9204-4692a18b0e26" containerName="console" Apr 23 16:39:24.297983 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.297796 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bad8bf0-d9c8-49db-9204-4692a18b0e26" containerName="console" Apr 23 16:39:24.297983 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.297835 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="3bad8bf0-d9c8-49db-9204-4692a18b0e26" containerName="console" Apr 23 16:39:24.300777 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.300760 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" Apr 23 16:39:24.303453 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.303433 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 16:39:24.304660 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.304643 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lrqtm\"" Apr 23 16:39:24.304774 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.304746 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 16:39:24.308858 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.308827 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd"] Apr 23 16:39:24.386031 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.385998 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19f6f7ba-9935-420d-b5e5-e0f2202d0730-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd\" (UID: \"19f6f7ba-9935-420d-b5e5-e0f2202d0730\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" Apr 23 16:39:24.386167 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.386047 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19f6f7ba-9935-420d-b5e5-e0f2202d0730-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd\" (UID: \"19f6f7ba-9935-420d-b5e5-e0f2202d0730\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" Apr 23 16:39:24.386167 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.386072 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttbpg\" (UniqueName: \"kubernetes.io/projected/19f6f7ba-9935-420d-b5e5-e0f2202d0730-kube-api-access-ttbpg\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd\" (UID: \"19f6f7ba-9935-420d-b5e5-e0f2202d0730\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" Apr 23 16:39:24.487353 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.487326 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19f6f7ba-9935-420d-b5e5-e0f2202d0730-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd\" (UID: \"19f6f7ba-9935-420d-b5e5-e0f2202d0730\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" Apr 23 16:39:24.487455 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.487372 2563 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19f6f7ba-9935-420d-b5e5-e0f2202d0730-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd\" (UID: \"19f6f7ba-9935-420d-b5e5-e0f2202d0730\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" Apr 23 16:39:24.487568 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.487550 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttbpg\" (UniqueName: \"kubernetes.io/projected/19f6f7ba-9935-420d-b5e5-e0f2202d0730-kube-api-access-ttbpg\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd\" (UID: \"19f6f7ba-9935-420d-b5e5-e0f2202d0730\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" Apr 23 16:39:24.487714 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.487695 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19f6f7ba-9935-420d-b5e5-e0f2202d0730-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd\" (UID: \"19f6f7ba-9935-420d-b5e5-e0f2202d0730\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" Apr 23 16:39:24.487778 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.487743 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19f6f7ba-9935-420d-b5e5-e0f2202d0730-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd\" (UID: \"19f6f7ba-9935-420d-b5e5-e0f2202d0730\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" Apr 23 16:39:24.496349 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.496328 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttbpg\" (UniqueName: 
\"kubernetes.io/projected/19f6f7ba-9935-420d-b5e5-e0f2202d0730-kube-api-access-ttbpg\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd\" (UID: \"19f6f7ba-9935-420d-b5e5-e0f2202d0730\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" Apr 23 16:39:24.610461 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.610411 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" Apr 23 16:39:24.728881 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:24.728852 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd"] Apr 23 16:39:24.731897 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:39:24.731870 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19f6f7ba_9935_420d_b5e5_e0f2202d0730.slice/crio-cfb6d68143b56a7a997168f18e25d38f3dd0d3ef38fdef368d3c475b5cb3e89e WatchSource:0}: Error finding container cfb6d68143b56a7a997168f18e25d38f3dd0d3ef38fdef368d3c475b5cb3e89e: Status 404 returned error can't find the container with id cfb6d68143b56a7a997168f18e25d38f3dd0d3ef38fdef368d3c475b5cb3e89e Apr 23 16:39:25.653983 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:25.653942 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" event={"ID":"19f6f7ba-9935-420d-b5e5-e0f2202d0730","Type":"ContainerStarted","Data":"cfb6d68143b56a7a997168f18e25d38f3dd0d3ef38fdef368d3c475b5cb3e89e"} Apr 23 16:39:30.671114 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:30.671083 2563 generic.go:358] "Generic (PLEG): container finished" podID="19f6f7ba-9935-420d-b5e5-e0f2202d0730" containerID="6573b16e78919385e20a52cf1462d399464306e0aa0163c02f730e5c9cdd4fb9" exitCode=0 Apr 
23 16:39:30.671500 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:30.671168 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" event={"ID":"19f6f7ba-9935-420d-b5e5-e0f2202d0730","Type":"ContainerDied","Data":"6573b16e78919385e20a52cf1462d399464306e0aa0163c02f730e5c9cdd4fb9"}
Apr 23 16:39:33.681513 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:33.681479 2563 generic.go:358] "Generic (PLEG): container finished" podID="19f6f7ba-9935-420d-b5e5-e0f2202d0730" containerID="f5562da98e092a9535c800ff0e8cc86901d7efa3c8bec1881409588096397ca8" exitCode=0
Apr 23 16:39:33.681862 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:33.681524 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" event={"ID":"19f6f7ba-9935-420d-b5e5-e0f2202d0730","Type":"ContainerDied","Data":"f5562da98e092a9535c800ff0e8cc86901d7efa3c8bec1881409588096397ca8"}
Apr 23 16:39:40.702428 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:40.702394 2563 generic.go:358] "Generic (PLEG): container finished" podID="19f6f7ba-9935-420d-b5e5-e0f2202d0730" containerID="4eae8d73369cb0bd5bc56a060338b719dc0ba938cddd13c8f43ec44b23f7b03d" exitCode=0
Apr 23 16:39:40.702791 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:40.702475 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" event={"ID":"19f6f7ba-9935-420d-b5e5-e0f2202d0730","Type":"ContainerDied","Data":"4eae8d73369cb0bd5bc56a060338b719dc0ba938cddd13c8f43ec44b23f7b03d"}
Apr 23 16:39:41.815661 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:41.815640 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd"
Apr 23 16:39:41.920734 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:41.920708 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19f6f7ba-9935-420d-b5e5-e0f2202d0730-util\") pod \"19f6f7ba-9935-420d-b5e5-e0f2202d0730\" (UID: \"19f6f7ba-9935-420d-b5e5-e0f2202d0730\") "
Apr 23 16:39:41.920834 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:41.920742 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19f6f7ba-9935-420d-b5e5-e0f2202d0730-bundle\") pod \"19f6f7ba-9935-420d-b5e5-e0f2202d0730\" (UID: \"19f6f7ba-9935-420d-b5e5-e0f2202d0730\") "
Apr 23 16:39:41.920834 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:41.920789 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttbpg\" (UniqueName: \"kubernetes.io/projected/19f6f7ba-9935-420d-b5e5-e0f2202d0730-kube-api-access-ttbpg\") pod \"19f6f7ba-9935-420d-b5e5-e0f2202d0730\" (UID: \"19f6f7ba-9935-420d-b5e5-e0f2202d0730\") "
Apr 23 16:39:41.921339 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:41.921302 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f6f7ba-9935-420d-b5e5-e0f2202d0730-bundle" (OuterVolumeSpecName: "bundle") pod "19f6f7ba-9935-420d-b5e5-e0f2202d0730" (UID: "19f6f7ba-9935-420d-b5e5-e0f2202d0730"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:39:41.922878 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:41.922856 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f6f7ba-9935-420d-b5e5-e0f2202d0730-kube-api-access-ttbpg" (OuterVolumeSpecName: "kube-api-access-ttbpg") pod "19f6f7ba-9935-420d-b5e5-e0f2202d0730" (UID: "19f6f7ba-9935-420d-b5e5-e0f2202d0730"). InnerVolumeSpecName "kube-api-access-ttbpg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:39:41.926005 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:41.925978 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f6f7ba-9935-420d-b5e5-e0f2202d0730-util" (OuterVolumeSpecName: "util") pod "19f6f7ba-9935-420d-b5e5-e0f2202d0730" (UID: "19f6f7ba-9935-420d-b5e5-e0f2202d0730"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:39:42.022066 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:42.022046 2563 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19f6f7ba-9935-420d-b5e5-e0f2202d0730-bundle\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:39:42.022164 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:42.022068 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ttbpg\" (UniqueName: \"kubernetes.io/projected/19f6f7ba-9935-420d-b5e5-e0f2202d0730-kube-api-access-ttbpg\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:39:42.022164 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:42.022078 2563 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19f6f7ba-9935-420d-b5e5-e0f2202d0730-util\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:39:42.709494 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:42.709459 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd" event={"ID":"19f6f7ba-9935-420d-b5e5-e0f2202d0730","Type":"ContainerDied","Data":"cfb6d68143b56a7a997168f18e25d38f3dd0d3ef38fdef368d3c475b5cb3e89e"}
Apr 23 16:39:42.709494 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:42.709498 2563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfb6d68143b56a7a997168f18e25d38f3dd0d3ef38fdef368d3c475b5cb3e89e"
Apr 23 16:39:42.709728 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:42.709467 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvk7bd"
Apr 23 16:39:46.236407 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.236374 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq"]
Apr 23 16:39:46.236801 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.236622 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19f6f7ba-9935-420d-b5e5-e0f2202d0730" containerName="extract"
Apr 23 16:39:46.236801 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.236633 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f6f7ba-9935-420d-b5e5-e0f2202d0730" containerName="extract"
Apr 23 16:39:46.236801 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.236655 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19f6f7ba-9935-420d-b5e5-e0f2202d0730" containerName="util"
Apr 23 16:39:46.236801 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.236660 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f6f7ba-9935-420d-b5e5-e0f2202d0730" containerName="util"
Apr 23 16:39:46.236801 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.236666 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19f6f7ba-9935-420d-b5e5-e0f2202d0730" containerName="pull"
Apr 23 16:39:46.236801 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.236672 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f6f7ba-9935-420d-b5e5-e0f2202d0730" containerName="pull"
Apr 23 16:39:46.236801 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.236710 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="19f6f7ba-9935-420d-b5e5-e0f2202d0730" containerName="extract"
Apr 23 16:39:46.238809 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.238794 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq"
Apr 23 16:39:46.241479 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.241454 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-r9z2r\""
Apr 23 16:39:46.241581 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.241494 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 23 16:39:46.241581 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.241561 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 23 16:39:46.241581 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.241559 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 23 16:39:46.251680 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.251658 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq"]
Apr 23 16:39:46.352552 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.352512 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbn49\" (UniqueName: \"kubernetes.io/projected/99535bba-5fb0-474c-b901-b6a16ae651ec-kube-api-access-fbn49\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq\" (UID: \"99535bba-5fb0-474c-b901-b6a16ae651ec\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq"
Apr 23 16:39:46.352725 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.352598 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/99535bba-5fb0-474c-b901-b6a16ae651ec-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq\" (UID: \"99535bba-5fb0-474c-b901-b6a16ae651ec\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq"
Apr 23 16:39:46.452946 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.452916 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/99535bba-5fb0-474c-b901-b6a16ae651ec-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq\" (UID: \"99535bba-5fb0-474c-b901-b6a16ae651ec\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq"
Apr 23 16:39:46.453137 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.452963 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbn49\" (UniqueName: \"kubernetes.io/projected/99535bba-5fb0-474c-b901-b6a16ae651ec-kube-api-access-fbn49\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq\" (UID: \"99535bba-5fb0-474c-b901-b6a16ae651ec\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq"
Apr 23 16:39:46.455343 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.455318 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/99535bba-5fb0-474c-b901-b6a16ae651ec-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq\" (UID: \"99535bba-5fb0-474c-b901-b6a16ae651ec\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq"
Apr 23 16:39:46.461491 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.461468 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbn49\" (UniqueName: \"kubernetes.io/projected/99535bba-5fb0-474c-b901-b6a16ae651ec-kube-api-access-fbn49\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq\" (UID: \"99535bba-5fb0-474c-b901-b6a16ae651ec\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq"
Apr 23 16:39:46.548850 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.548768 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq"
Apr 23 16:39:46.671170 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.671132 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq"]
Apr 23 16:39:46.675073 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:39:46.675044 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99535bba_5fb0_474c_b901_b6a16ae651ec.slice/crio-c8b69fdc68e3526254e2f8a4274b5bf125bf6c096dbe024a567d7480f839f833 WatchSource:0}: Error finding container c8b69fdc68e3526254e2f8a4274b5bf125bf6c096dbe024a567d7480f839f833: Status 404 returned error can't find the container with id c8b69fdc68e3526254e2f8a4274b5bf125bf6c096dbe024a567d7480f839f833
Apr 23 16:39:46.726422 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:46.726386 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq" event={"ID":"99535bba-5fb0-474c-b901-b6a16ae651ec","Type":"ContainerStarted","Data":"c8b69fdc68e3526254e2f8a4274b5bf125bf6c096dbe024a567d7480f839f833"}
Apr 23 16:39:50.533185 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.533146 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ffv29"]
Apr 23 16:39:50.535319 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.535294 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-ffv29"
Apr 23 16:39:50.538569 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.538550 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 23 16:39:50.538569 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.538560 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-2n8nr\""
Apr 23 16:39:50.538736 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.538556 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 23 16:39:50.553084 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.553060 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ffv29"]
Apr 23 16:39:50.585153 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.585111 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f8bde9bd-af78-417f-a38a-e6e93856d47d-certificates\") pod \"keda-operator-ffbb595cb-ffv29\" (UID: \"f8bde9bd-af78-417f-a38a-e6e93856d47d\") " pod="openshift-keda/keda-operator-ffbb595cb-ffv29"
Apr 23 16:39:50.585348 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.585239 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/f8bde9bd-af78-417f-a38a-e6e93856d47d-cabundle0\") pod \"keda-operator-ffbb595cb-ffv29\" (UID: \"f8bde9bd-af78-417f-a38a-e6e93856d47d\") " pod="openshift-keda/keda-operator-ffbb595cb-ffv29"
Apr 23 16:39:50.585348 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.585283 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2d4l\" (UniqueName: \"kubernetes.io/projected/f8bde9bd-af78-417f-a38a-e6e93856d47d-kube-api-access-s2d4l\") pod \"keda-operator-ffbb595cb-ffv29\" (UID: \"f8bde9bd-af78-417f-a38a-e6e93856d47d\") " pod="openshift-keda/keda-operator-ffbb595cb-ffv29"
Apr 23 16:39:50.685930 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.685898 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/f8bde9bd-af78-417f-a38a-e6e93856d47d-cabundle0\") pod \"keda-operator-ffbb595cb-ffv29\" (UID: \"f8bde9bd-af78-417f-a38a-e6e93856d47d\") " pod="openshift-keda/keda-operator-ffbb595cb-ffv29"
Apr 23 16:39:50.685930 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.685936 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2d4l\" (UniqueName: \"kubernetes.io/projected/f8bde9bd-af78-417f-a38a-e6e93856d47d-kube-api-access-s2d4l\") pod \"keda-operator-ffbb595cb-ffv29\" (UID: \"f8bde9bd-af78-417f-a38a-e6e93856d47d\") " pod="openshift-keda/keda-operator-ffbb595cb-ffv29"
Apr 23 16:39:50.686191 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.685968 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f8bde9bd-af78-417f-a38a-e6e93856d47d-certificates\") pod \"keda-operator-ffbb595cb-ffv29\" (UID: \"f8bde9bd-af78-417f-a38a-e6e93856d47d\") " pod="openshift-keda/keda-operator-ffbb595cb-ffv29"
Apr 23 16:39:50.686191 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:50.686066 2563 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 23 16:39:50.686191 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:50.686082 2563 secret.go:281] references non-existent secret key: ca.crt
Apr 23 16:39:50.686191 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:50.686089 2563 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 16:39:50.686191 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:50.686102 2563 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ffv29: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 23 16:39:50.686191 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:50.686155 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8bde9bd-af78-417f-a38a-e6e93856d47d-certificates podName:f8bde9bd-af78-417f-a38a-e6e93856d47d nodeName:}" failed. No retries permitted until 2026-04-23 16:39:51.186137654 +0000 UTC m=+264.917892068 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f8bde9bd-af78-417f-a38a-e6e93856d47d-certificates") pod "keda-operator-ffbb595cb-ffv29" (UID: "f8bde9bd-af78-417f-a38a-e6e93856d47d") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 23 16:39:50.686563 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.686543 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/f8bde9bd-af78-417f-a38a-e6e93856d47d-cabundle0\") pod \"keda-operator-ffbb595cb-ffv29\" (UID: \"f8bde9bd-af78-417f-a38a-e6e93856d47d\") " pod="openshift-keda/keda-operator-ffbb595cb-ffv29"
Apr 23 16:39:50.703283 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.703250 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2d4l\" (UniqueName: \"kubernetes.io/projected/f8bde9bd-af78-417f-a38a-e6e93856d47d-kube-api-access-s2d4l\") pod \"keda-operator-ffbb595cb-ffv29\" (UID: \"f8bde9bd-af78-417f-a38a-e6e93856d47d\") " pod="openshift-keda/keda-operator-ffbb595cb-ffv29"
Apr 23 16:39:50.739703 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.739673 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq" event={"ID":"99535bba-5fb0-474c-b901-b6a16ae651ec","Type":"ContainerStarted","Data":"0c86156db23a87d516383907dc2c943cd9dedb59a42052a71f834efc0cd53361"}
Apr 23 16:39:50.739866 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.739791 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq"
Apr 23 16:39:50.761200 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.761151 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq" podStartSLOduration=1.5015469430000001 podStartE2EDuration="4.761137363s" podCreationTimestamp="2026-04-23 16:39:46 +0000 UTC" firstStartedPulling="2026-04-23 16:39:46.676652714 +0000 UTC m=+260.408407124" lastFinishedPulling="2026-04-23 16:39:49.936243133 +0000 UTC m=+263.667997544" observedRunningTime="2026-04-23 16:39:50.76060572 +0000 UTC m=+264.492360163" watchObservedRunningTime="2026-04-23 16:39:50.761137363 +0000 UTC m=+264.492891795"
Apr 23 16:39:50.845527 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.845451 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl"]
Apr 23 16:39:50.847611 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.847593 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl"
Apr 23 16:39:50.850188 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.850168 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 23 16:39:50.857019 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.856998 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl"]
Apr 23 16:39:50.989283 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.989248 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8zkfl\" (UID: \"ea6c6f6f-9978-422b-998b-b6d0bb182ba7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl"
Apr 23 16:39:50.989431 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.989410 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whssl\" (UniqueName: \"kubernetes.io/projected/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-kube-api-access-whssl\") pod \"keda-metrics-apiserver-7c9f485588-8zkfl\" (UID: \"ea6c6f6f-9978-422b-998b-b6d0bb182ba7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl"
Apr 23 16:39:50.989488 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:50.989459 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8zkfl\" (UID: \"ea6c6f6f-9978-422b-998b-b6d0bb182ba7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl"
Apr 23 16:39:51.062884 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.062849 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-x652r"]
Apr 23 16:39:51.065020 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.064990 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-x652r"
Apr 23 16:39:51.068557 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.068535 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 23 16:39:51.078939 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.075900 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-x652r"]
Apr 23 16:39:51.090102 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.090067 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whssl\" (UniqueName: \"kubernetes.io/projected/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-kube-api-access-whssl\") pod \"keda-metrics-apiserver-7c9f485588-8zkfl\" (UID: \"ea6c6f6f-9978-422b-998b-b6d0bb182ba7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl"
Apr 23 16:39:51.090102 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.090105 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8zkfl\" (UID: \"ea6c6f6f-9978-422b-998b-b6d0bb182ba7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl"
Apr 23 16:39:51.090324 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.090146 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8zkfl\" (UID: \"ea6c6f6f-9978-422b-998b-b6d0bb182ba7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl"
Apr 23 16:39:51.090324 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:51.090296 2563 secret.go:281] references non-existent secret key: tls.crt
Apr 23 16:39:51.090324 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:51.090316 2563 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 16:39:51.090473 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:51.090337 2563 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl: references non-existent secret key: tls.crt
Apr 23 16:39:51.090473 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:51.090401 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-certificates podName:ea6c6f6f-9978-422b-998b-b6d0bb182ba7 nodeName:}" failed. No retries permitted until 2026-04-23 16:39:51.590382346 +0000 UTC m=+265.322136765 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-certificates") pod "keda-metrics-apiserver-7c9f485588-8zkfl" (UID: "ea6c6f6f-9978-422b-998b-b6d0bb182ba7") : references non-existent secret key: tls.crt
Apr 23 16:39:51.090595 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.090478 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8zkfl\" (UID: \"ea6c6f6f-9978-422b-998b-b6d0bb182ba7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl"
Apr 23 16:39:51.104785 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.104723 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whssl\" (UniqueName: \"kubernetes.io/projected/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-kube-api-access-whssl\") pod \"keda-metrics-apiserver-7c9f485588-8zkfl\" (UID: \"ea6c6f6f-9978-422b-998b-b6d0bb182ba7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl"
Apr 23 16:39:51.191513 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.191460 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f8bde9bd-af78-417f-a38a-e6e93856d47d-certificates\") pod \"keda-operator-ffbb595cb-ffv29\" (UID: \"f8bde9bd-af78-417f-a38a-e6e93856d47d\") " pod="openshift-keda/keda-operator-ffbb595cb-ffv29"
Apr 23 16:39:51.191708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.191541 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0928f513-1c0a-4dab-9ffb-474512079aa6-certificates\") pod \"keda-admission-cf49989db-x652r\" (UID: \"0928f513-1c0a-4dab-9ffb-474512079aa6\") " pod="openshift-keda/keda-admission-cf49989db-x652r"
Apr 23 16:39:51.191708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.191613 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk9lx\" (UniqueName: \"kubernetes.io/projected/0928f513-1c0a-4dab-9ffb-474512079aa6-kube-api-access-rk9lx\") pod \"keda-admission-cf49989db-x652r\" (UID: \"0928f513-1c0a-4dab-9ffb-474512079aa6\") " pod="openshift-keda/keda-admission-cf49989db-x652r"
Apr 23 16:39:51.191708 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:51.191618 2563 secret.go:281] references non-existent secret key: ca.crt
Apr 23 16:39:51.191708 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:51.191642 2563 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 16:39:51.191708 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:51.191654 2563 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ffv29: references non-existent secret key: ca.crt
Apr 23 16:39:51.191708 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:51.191708 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8bde9bd-af78-417f-a38a-e6e93856d47d-certificates podName:f8bde9bd-af78-417f-a38a-e6e93856d47d nodeName:}" failed. No retries permitted until 2026-04-23 16:39:52.191687717 +0000 UTC m=+265.923442132 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f8bde9bd-af78-417f-a38a-e6e93856d47d-certificates") pod "keda-operator-ffbb595cb-ffv29" (UID: "f8bde9bd-af78-417f-a38a-e6e93856d47d") : references non-existent secret key: ca.crt
Apr 23 16:39:51.292726 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.292688 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rk9lx\" (UniqueName: \"kubernetes.io/projected/0928f513-1c0a-4dab-9ffb-474512079aa6-kube-api-access-rk9lx\") pod \"keda-admission-cf49989db-x652r\" (UID: \"0928f513-1c0a-4dab-9ffb-474512079aa6\") " pod="openshift-keda/keda-admission-cf49989db-x652r"
Apr 23 16:39:51.292905 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.292770 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0928f513-1c0a-4dab-9ffb-474512079aa6-certificates\") pod \"keda-admission-cf49989db-x652r\" (UID: \"0928f513-1c0a-4dab-9ffb-474512079aa6\") " pod="openshift-keda/keda-admission-cf49989db-x652r"
Apr 23 16:39:51.292905 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:51.292877 2563 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 23 16:39:51.292905 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:51.292895 2563 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-x652r: secret "keda-admission-webhooks-certs" not found
Apr 23 16:39:51.293007 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:51.292946 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0928f513-1c0a-4dab-9ffb-474512079aa6-certificates podName:0928f513-1c0a-4dab-9ffb-474512079aa6 nodeName:}" failed. No retries permitted until 2026-04-23 16:39:51.79292952 +0000 UTC m=+265.524683933 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0928f513-1c0a-4dab-9ffb-474512079aa6-certificates") pod "keda-admission-cf49989db-x652r" (UID: "0928f513-1c0a-4dab-9ffb-474512079aa6") : secret "keda-admission-webhooks-certs" not found
Apr 23 16:39:51.302899 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.302870 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk9lx\" (UniqueName: \"kubernetes.io/projected/0928f513-1c0a-4dab-9ffb-474512079aa6-kube-api-access-rk9lx\") pod \"keda-admission-cf49989db-x652r\" (UID: \"0928f513-1c0a-4dab-9ffb-474512079aa6\") " pod="openshift-keda/keda-admission-cf49989db-x652r"
Apr 23 16:39:51.595329 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.595296 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8zkfl\" (UID: \"ea6c6f6f-9978-422b-998b-b6d0bb182ba7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl"
Apr 23 16:39:51.595799 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:51.595437 2563 secret.go:281] references non-existent secret key: tls.crt
Apr 23 16:39:51.595799 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:51.595461 2563 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 16:39:51.595799 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:51.595483 2563 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl: references non-existent secret key: tls.crt
Apr 23 16:39:51.595799 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:51.595552 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-certificates podName:ea6c6f6f-9978-422b-998b-b6d0bb182ba7 nodeName:}" failed. No retries permitted until 2026-04-23 16:39:52.595531601 +0000 UTC m=+266.327286016 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-certificates") pod "keda-metrics-apiserver-7c9f485588-8zkfl" (UID: "ea6c6f6f-9978-422b-998b-b6d0bb182ba7") : references non-existent secret key: tls.crt
Apr 23 16:39:51.797029 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.796986 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0928f513-1c0a-4dab-9ffb-474512079aa6-certificates\") pod \"keda-admission-cf49989db-x652r\" (UID: \"0928f513-1c0a-4dab-9ffb-474512079aa6\") " pod="openshift-keda/keda-admission-cf49989db-x652r"
Apr 23 16:39:51.799575 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.799554 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0928f513-1c0a-4dab-9ffb-474512079aa6-certificates\") pod \"keda-admission-cf49989db-x652r\" (UID: \"0928f513-1c0a-4dab-9ffb-474512079aa6\") " pod="openshift-keda/keda-admission-cf49989db-x652r"
Apr 23 16:39:51.980895 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:51.980864 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-x652r"
Apr 23 16:39:52.102645 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:52.102612 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-x652r"]
Apr 23 16:39:52.106463 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:39:52.106427 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0928f513_1c0a_4dab_9ffb_474512079aa6.slice/crio-ff26e06e222898849d90d8edd914659e7d632f4d00b7a4cd436fff0602542ab5 WatchSource:0}: Error finding container ff26e06e222898849d90d8edd914659e7d632f4d00b7a4cd436fff0602542ab5: Status 404 returned error can't find the container with id ff26e06e222898849d90d8edd914659e7d632f4d00b7a4cd436fff0602542ab5
Apr 23 16:39:52.201879 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:52.201829 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f8bde9bd-af78-417f-a38a-e6e93856d47d-certificates\") pod \"keda-operator-ffbb595cb-ffv29\" (UID: \"f8bde9bd-af78-417f-a38a-e6e93856d47d\") " pod="openshift-keda/keda-operator-ffbb595cb-ffv29"
Apr 23 16:39:52.202060 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:52.202005 2563 secret.go:281] references non-existent secret key: ca.crt
Apr 23 16:39:52.202060 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:52.202028 2563 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 16:39:52.202060 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:52.202040 2563 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ffv29: references non-existent secret key: ca.crt
Apr 23 16:39:52.202172 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:52.202106 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8bde9bd-af78-417f-a38a-e6e93856d47d-certificates podName:f8bde9bd-af78-417f-a38a-e6e93856d47d nodeName:}" failed. No retries permitted until 2026-04-23 16:39:54.202087076 +0000 UTC m=+267.933841492 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f8bde9bd-af78-417f-a38a-e6e93856d47d-certificates") pod "keda-operator-ffbb595cb-ffv29" (UID: "f8bde9bd-af78-417f-a38a-e6e93856d47d") : references non-existent secret key: ca.crt
Apr 23 16:39:52.604691 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:52.604657 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8zkfl\" (UID: \"ea6c6f6f-9978-422b-998b-b6d0bb182ba7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl"
Apr 23 16:39:52.605160 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:52.604842 2563 secret.go:281] references non-existent secret key: tls.crt
Apr 23 16:39:52.605160 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:52.604866 2563 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 16:39:52.605160 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:52.604893 2563 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl: references non-existent secret key: tls.crt
Apr 23 16:39:52.605160 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:39:52.604981 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-certificates podName:ea6c6f6f-9978-422b-998b-b6d0bb182ba7 nodeName:}" failed. No retries permitted until 2026-04-23 16:39:54.604957013 +0000 UTC m=+268.336711639 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-certificates") pod "keda-metrics-apiserver-7c9f485588-8zkfl" (UID: "ea6c6f6f-9978-422b-998b-b6d0bb182ba7") : references non-existent secret key: tls.crt
Apr 23 16:39:52.745560 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:52.745522 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-x652r" event={"ID":"0928f513-1c0a-4dab-9ffb-474512079aa6","Type":"ContainerStarted","Data":"ff26e06e222898849d90d8edd914659e7d632f4d00b7a4cd436fff0602542ab5"}
Apr 23 16:39:53.749819 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:53.749784 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-x652r" event={"ID":"0928f513-1c0a-4dab-9ffb-474512079aa6","Type":"ContainerStarted","Data":"4552f3176200a69106d643ac3bafa1b157ea78ba7eeb2e960a47c53f34f09568"}
Apr 23 16:39:53.750174 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:53.749840 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-x652r"
Apr 23 16:39:53.769449 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:53.769404 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-x652r" podStartSLOduration=1.288529344 podStartE2EDuration="2.769390729s" podCreationTimestamp="2026-04-23 16:39:51 +0000 UTC" firstStartedPulling="2026-04-23 16:39:52.108486724 +0000 UTC m=+265.840241134" lastFinishedPulling="2026-04-23 16:39:53.589348102 +0000 UTC m=+267.321102519" observedRunningTime="2026-04-23 16:39:53.768061501 +0000 UTC m=+267.499815956" watchObservedRunningTime="2026-04-23 16:39:53.769390729 +0000 UTC m=+267.501145212"
Apr 23 16:39:54.219241 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:54.219203 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"certificates\" (UniqueName: \"kubernetes.io/projected/f8bde9bd-af78-417f-a38a-e6e93856d47d-certificates\") pod \"keda-operator-ffbb595cb-ffv29\" (UID: \"f8bde9bd-af78-417f-a38a-e6e93856d47d\") " pod="openshift-keda/keda-operator-ffbb595cb-ffv29" Apr 23 16:39:54.221537 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:54.221520 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f8bde9bd-af78-417f-a38a-e6e93856d47d-certificates\") pod \"keda-operator-ffbb595cb-ffv29\" (UID: \"f8bde9bd-af78-417f-a38a-e6e93856d47d\") " pod="openshift-keda/keda-operator-ffbb595cb-ffv29" Apr 23 16:39:54.444765 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:54.444729 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-ffv29" Apr 23 16:39:54.577269 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:54.577216 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ffv29"] Apr 23 16:39:54.579265 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:39:54.579236 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8bde9bd_af78_417f_a38a_e6e93856d47d.slice/crio-eda336b6212cde816ce0067d78004f64f1265fe25a9509c702846c3dd252be3f WatchSource:0}: Error finding container eda336b6212cde816ce0067d78004f64f1265fe25a9509c702846c3dd252be3f: Status 404 returned error can't find the container with id eda336b6212cde816ce0067d78004f64f1265fe25a9509c702846c3dd252be3f Apr 23 16:39:54.622554 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:54.622533 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8zkfl\" (UID: \"ea6c6f6f-9978-422b-998b-b6d0bb182ba7\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl" Apr 23 16:39:54.624916 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:54.624890 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ea6c6f6f-9978-422b-998b-b6d0bb182ba7-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8zkfl\" (UID: \"ea6c6f6f-9978-422b-998b-b6d0bb182ba7\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl" Apr 23 16:39:54.753919 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:54.753836 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-ffv29" event={"ID":"f8bde9bd-af78-417f-a38a-e6e93856d47d","Type":"ContainerStarted","Data":"eda336b6212cde816ce0067d78004f64f1265fe25a9509c702846c3dd252be3f"} Apr 23 16:39:54.758254 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:54.758238 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl" Apr 23 16:39:54.880097 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:54.880077 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl"] Apr 23 16:39:54.882080 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:39:54.882052 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6c6f6f_9978_422b_998b_b6d0bb182ba7.slice/crio-4d0a1968d058bbe593f684c6ba3cf417cc482540e873eebe4b134e35f11a7e3a WatchSource:0}: Error finding container 4d0a1968d058bbe593f684c6ba3cf417cc482540e873eebe4b134e35f11a7e3a: Status 404 returned error can't find the container with id 4d0a1968d058bbe593f684c6ba3cf417cc482540e873eebe4b134e35f11a7e3a Apr 23 16:39:55.758806 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:55.758762 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl" 
event={"ID":"ea6c6f6f-9978-422b-998b-b6d0bb182ba7","Type":"ContainerStarted","Data":"4d0a1968d058bbe593f684c6ba3cf417cc482540e873eebe4b134e35f11a7e3a"} Apr 23 16:39:58.768307 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:58.768281 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-ffv29" event={"ID":"f8bde9bd-af78-417f-a38a-e6e93856d47d","Type":"ContainerStarted","Data":"64ef001af4e4f1f986f4595faa9ad8102a1aa2e848b134efeeec47341c2c5b2f"} Apr 23 16:39:58.768653 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:58.768524 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-ffv29" Apr 23 16:39:58.769755 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:58.769733 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl" event={"ID":"ea6c6f6f-9978-422b-998b-b6d0bb182ba7","Type":"ContainerStarted","Data":"49bf1fa7c9aab0ab56d78f78cb647b2f7c02b5a274e7ff6dac0387fa51e0f794"} Apr 23 16:39:58.769905 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:58.769874 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl" Apr 23 16:39:58.787457 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:39:58.787414 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-ffv29" podStartSLOduration=4.660612039 podStartE2EDuration="8.787401264s" podCreationTimestamp="2026-04-23 16:39:50 +0000 UTC" firstStartedPulling="2026-04-23 16:39:54.580659127 +0000 UTC m=+268.312413552" lastFinishedPulling="2026-04-23 16:39:58.707448366 +0000 UTC m=+272.439202777" observedRunningTime="2026-04-23 16:39:58.785418448 +0000 UTC m=+272.517172911" watchObservedRunningTime="2026-04-23 16:39:58.787401264 +0000 UTC m=+272.519155698" Apr 23 16:39:58.805746 ip-10-0-134-187 kubenswrapper[2563]: I0423 
16:39:58.805689 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl" podStartSLOduration=4.99061284 podStartE2EDuration="8.80567571s" podCreationTimestamp="2026-04-23 16:39:50 +0000 UTC" firstStartedPulling="2026-04-23 16:39:54.883431785 +0000 UTC m=+268.615186196" lastFinishedPulling="2026-04-23 16:39:58.69849465 +0000 UTC m=+272.430249066" observedRunningTime="2026-04-23 16:39:58.805440445 +0000 UTC m=+272.537194880" watchObservedRunningTime="2026-04-23 16:39:58.80567571 +0000 UTC m=+272.537430144" Apr 23 16:40:09.776681 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:09.776650 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8zkfl" Apr 23 16:40:11.745028 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:11.744997 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qfwzq" Apr 23 16:40:14.756035 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:14.756007 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-x652r" Apr 23 16:40:19.774939 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:19.774911 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-ffv29" Apr 23 16:40:26.766993 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:26.766967 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 16:40:26.767559 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:26.767538 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 16:40:26.770119 ip-10-0-134-187 
kubenswrapper[2563]: I0423 16:40:26.770102 2563 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 16:40:59.808746 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.808718 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-6b94ff949c-rjwr6"] Apr 23 16:40:59.811094 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.811080 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6b94ff949c-rjwr6" Apr 23 16:40:59.817008 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.816985 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-v5rxh"] Apr 23 16:40:59.817254 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.817078 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 16:40:59.817492 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.817474 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-fzwgz\"" Apr 23 16:40:59.818439 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.818420 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 16:40:59.819294 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.819278 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" Apr 23 16:40:59.847446 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.847421 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-m2sch\"" Apr 23 16:40:59.847938 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.847924 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 23 16:40:59.853098 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.853079 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6b94ff949c-rjwr6"] Apr 23 16:40:59.855362 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.855350 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 23 16:40:59.878811 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.878787 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-7whst"] Apr 23 16:40:59.880996 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.880980 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-7whst" Apr 23 16:40:59.885430 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.885411 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 16:40:59.900210 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.900190 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-v5rxh"] Apr 23 16:40:59.904463 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.904443 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx8p6\" (UniqueName: \"kubernetes.io/projected/07341306-029c-4f82-8c30-b46c907ce7ac-kube-api-access-fx8p6\") pod \"llmisvc-controller-manager-6b94ff949c-rjwr6\" (UID: \"07341306-029c-4f82-8c30-b46c907ce7ac\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-rjwr6" Apr 23 16:40:59.904576 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.904488 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07341306-029c-4f82-8c30-b46c907ce7ac-cert\") pod \"llmisvc-controller-manager-6b94ff949c-rjwr6\" (UID: \"07341306-029c-4f82-8c30-b46c907ce7ac\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-rjwr6" Apr 23 16:40:59.904646 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.904569 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1733561-5659-4b34-b8ca-1def4211f7b3-cert\") pod \"kserve-controller-manager-5b898d7b9d-v5rxh\" (UID: \"b1733561-5659-4b34-b8ca-1def4211f7b3\") " pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" Apr 23 16:40:59.904646 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.904598 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt592\" 
(UniqueName: \"kubernetes.io/projected/b1733561-5659-4b34-b8ca-1def4211f7b3-kube-api-access-xt592\") pod \"kserve-controller-manager-5b898d7b9d-v5rxh\" (UID: \"b1733561-5659-4b34-b8ca-1def4211f7b3\") " pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" Apr 23 16:40:59.911708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.911689 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-5mftf\"" Apr 23 16:40:59.927528 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:40:59.925971 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-7whst"] Apr 23 16:41:00.005649 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.005622 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07341306-029c-4f82-8c30-b46c907ce7ac-cert\") pod \"llmisvc-controller-manager-6b94ff949c-rjwr6\" (UID: \"07341306-029c-4f82-8c30-b46c907ce7ac\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-rjwr6" Apr 23 16:41:00.005764 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.005660 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1733561-5659-4b34-b8ca-1def4211f7b3-cert\") pod \"kserve-controller-manager-5b898d7b9d-v5rxh\" (UID: \"b1733561-5659-4b34-b8ca-1def4211f7b3\") " pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" Apr 23 16:41:00.005764 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.005690 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt592\" (UniqueName: \"kubernetes.io/projected/b1733561-5659-4b34-b8ca-1def4211f7b3-kube-api-access-xt592\") pod \"kserve-controller-manager-5b898d7b9d-v5rxh\" (UID: \"b1733561-5659-4b34-b8ca-1def4211f7b3\") " pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" Apr 23 16:41:00.005764 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.005720 2563 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/eef38706-0912-4929-ab16-545eb88d2cb8-data\") pod \"seaweedfs-86cc847c5c-7whst\" (UID: \"eef38706-0912-4929-ab16-545eb88d2cb8\") " pod="kserve/seaweedfs-86cc847c5c-7whst" Apr 23 16:41:00.005885 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.005765 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7bwx\" (UniqueName: \"kubernetes.io/projected/eef38706-0912-4929-ab16-545eb88d2cb8-kube-api-access-r7bwx\") pod \"seaweedfs-86cc847c5c-7whst\" (UID: \"eef38706-0912-4929-ab16-545eb88d2cb8\") " pod="kserve/seaweedfs-86cc847c5c-7whst" Apr 23 16:41:00.005885 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:41:00.005789 2563 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 23 16:41:00.005885 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.005802 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fx8p6\" (UniqueName: \"kubernetes.io/projected/07341306-029c-4f82-8c30-b46c907ce7ac-kube-api-access-fx8p6\") pod \"llmisvc-controller-manager-6b94ff949c-rjwr6\" (UID: \"07341306-029c-4f82-8c30-b46c907ce7ac\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-rjwr6" Apr 23 16:41:00.005885 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:41:00.005867 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1733561-5659-4b34-b8ca-1def4211f7b3-cert podName:b1733561-5659-4b34-b8ca-1def4211f7b3 nodeName:}" failed. No retries permitted until 2026-04-23 16:41:00.505847702 +0000 UTC m=+334.237602115 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b1733561-5659-4b34-b8ca-1def4211f7b3-cert") pod "kserve-controller-manager-5b898d7b9d-v5rxh" (UID: "b1733561-5659-4b34-b8ca-1def4211f7b3") : secret "kserve-webhook-server-cert" not found Apr 23 16:41:00.008022 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.008001 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07341306-029c-4f82-8c30-b46c907ce7ac-cert\") pod \"llmisvc-controller-manager-6b94ff949c-rjwr6\" (UID: \"07341306-029c-4f82-8c30-b46c907ce7ac\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-rjwr6" Apr 23 16:41:00.016995 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.016974 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt592\" (UniqueName: \"kubernetes.io/projected/b1733561-5659-4b34-b8ca-1def4211f7b3-kube-api-access-xt592\") pod \"kserve-controller-manager-5b898d7b9d-v5rxh\" (UID: \"b1733561-5659-4b34-b8ca-1def4211f7b3\") " pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" Apr 23 16:41:00.017900 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.017883 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx8p6\" (UniqueName: \"kubernetes.io/projected/07341306-029c-4f82-8c30-b46c907ce7ac-kube-api-access-fx8p6\") pod \"llmisvc-controller-manager-6b94ff949c-rjwr6\" (UID: \"07341306-029c-4f82-8c30-b46c907ce7ac\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-rjwr6" Apr 23 16:41:00.106848 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.106791 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7bwx\" (UniqueName: \"kubernetes.io/projected/eef38706-0912-4929-ab16-545eb88d2cb8-kube-api-access-r7bwx\") pod \"seaweedfs-86cc847c5c-7whst\" (UID: \"eef38706-0912-4929-ab16-545eb88d2cb8\") " pod="kserve/seaweedfs-86cc847c5c-7whst" Apr 23 16:41:00.106956 
ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.106940 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/eef38706-0912-4929-ab16-545eb88d2cb8-data\") pod \"seaweedfs-86cc847c5c-7whst\" (UID: \"eef38706-0912-4929-ab16-545eb88d2cb8\") " pod="kserve/seaweedfs-86cc847c5c-7whst" Apr 23 16:41:00.107222 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.107209 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/eef38706-0912-4929-ab16-545eb88d2cb8-data\") pod \"seaweedfs-86cc847c5c-7whst\" (UID: \"eef38706-0912-4929-ab16-545eb88d2cb8\") " pod="kserve/seaweedfs-86cc847c5c-7whst" Apr 23 16:41:00.117882 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.117865 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7bwx\" (UniqueName: \"kubernetes.io/projected/eef38706-0912-4929-ab16-545eb88d2cb8-kube-api-access-r7bwx\") pod \"seaweedfs-86cc847c5c-7whst\" (UID: \"eef38706-0912-4929-ab16-545eb88d2cb8\") " pod="kserve/seaweedfs-86cc847c5c-7whst" Apr 23 16:41:00.121739 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.121704 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6b94ff949c-rjwr6" Apr 23 16:41:00.189380 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.189354 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-7whst" Apr 23 16:41:00.247498 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.247471 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6b94ff949c-rjwr6"] Apr 23 16:41:00.250109 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:41:00.250076 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod07341306_029c_4f82_8c30_b46c907ce7ac.slice/crio-a790b66e87ae00754d3bf9f1f178ed1bd8f13925a9ea339a185433aeda7fb069 WatchSource:0}: Error finding container a790b66e87ae00754d3bf9f1f178ed1bd8f13925a9ea339a185433aeda7fb069: Status 404 returned error can't find the container with id a790b66e87ae00754d3bf9f1f178ed1bd8f13925a9ea339a185433aeda7fb069 Apr 23 16:41:00.251601 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.251580 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:41:00.305653 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.305633 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-7whst"] Apr 23 16:41:00.307293 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:41:00.307272 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeef38706_0912_4929_ab16_545eb88d2cb8.slice/crio-bc537cbadf6e684d6a60cd950a0bd29e5bb48faffbc9a94577dac0005c6a7096 WatchSource:0}: Error finding container bc537cbadf6e684d6a60cd950a0bd29e5bb48faffbc9a94577dac0005c6a7096: Status 404 returned error can't find the container with id bc537cbadf6e684d6a60cd950a0bd29e5bb48faffbc9a94577dac0005c6a7096 Apr 23 16:41:00.509978 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.509943 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1733561-5659-4b34-b8ca-1def4211f7b3-cert\") pod 
\"kserve-controller-manager-5b898d7b9d-v5rxh\" (UID: \"b1733561-5659-4b34-b8ca-1def4211f7b3\") " pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" Apr 23 16:41:00.512195 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.512176 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1733561-5659-4b34-b8ca-1def4211f7b3-cert\") pod \"kserve-controller-manager-5b898d7b9d-v5rxh\" (UID: \"b1733561-5659-4b34-b8ca-1def4211f7b3\") " pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" Apr 23 16:41:00.729393 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.729364 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" Apr 23 16:41:00.905447 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.905422 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-v5rxh"] Apr 23 16:41:00.907818 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:41:00.907790 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1733561_5659_4b34_b8ca_1def4211f7b3.slice/crio-da0aa8a56cd185567b5f90b58017cfc5e9a28d2aa34df7e4d15f028bf917738d WatchSource:0}: Error finding container da0aa8a56cd185567b5f90b58017cfc5e9a28d2aa34df7e4d15f028bf917738d: Status 404 returned error can't find the container with id da0aa8a56cd185567b5f90b58017cfc5e9a28d2aa34df7e4d15f028bf917738d Apr 23 16:41:00.934221 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.934178 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" event={"ID":"b1733561-5659-4b34-b8ca-1def4211f7b3","Type":"ContainerStarted","Data":"da0aa8a56cd185567b5f90b58017cfc5e9a28d2aa34df7e4d15f028bf917738d"} Apr 23 16:41:00.935377 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.935346 2563 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve/seaweedfs-86cc847c5c-7whst" event={"ID":"eef38706-0912-4929-ab16-545eb88d2cb8","Type":"ContainerStarted","Data":"bc537cbadf6e684d6a60cd950a0bd29e5bb48faffbc9a94577dac0005c6a7096"} Apr 23 16:41:00.936507 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:00.936476 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6b94ff949c-rjwr6" event={"ID":"07341306-029c-4f82-8c30-b46c907ce7ac","Type":"ContainerStarted","Data":"a790b66e87ae00754d3bf9f1f178ed1bd8f13925a9ea339a185433aeda7fb069"} Apr 23 16:41:05.955369 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:05.955333 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" event={"ID":"b1733561-5659-4b34-b8ca-1def4211f7b3","Type":"ContainerStarted","Data":"291479e84c45f3751e1c9db026284b23fe05b6d0177215d692c5b3a35c71d422"} Apr 23 16:41:05.955790 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:05.955490 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" Apr 23 16:41:05.956708 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:05.956690 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-7whst" event={"ID":"eef38706-0912-4929-ab16-545eb88d2cb8","Type":"ContainerStarted","Data":"02065251e6d89e9b90c50d3d8ac6076950f2fbe3dccb45a9599137a9c8674cf7"} Apr 23 16:41:05.956780 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:05.956752 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-7whst" Apr 23 16:41:05.957830 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:05.957810 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6b94ff949c-rjwr6" event={"ID":"07341306-029c-4f82-8c30-b46c907ce7ac","Type":"ContainerStarted","Data":"0dde7f3c3c037f301ecebc10ec0528fe54c9537bebe86854e2339c2a293b1e27"} Apr 23 
16:41:05.957952 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:05.957942 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-6b94ff949c-rjwr6"
Apr 23 16:41:05.975476 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:05.975427 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" podStartSLOduration=2.614268116 podStartE2EDuration="6.975414884s" podCreationTimestamp="2026-04-23 16:40:59 +0000 UTC" firstStartedPulling="2026-04-23 16:41:00.909304633 +0000 UTC m=+334.641059059" lastFinishedPulling="2026-04-23 16:41:05.270451413 +0000 UTC m=+339.002205827" observedRunningTime="2026-04-23 16:41:05.975251487 +0000 UTC m=+339.707005911" watchObservedRunningTime="2026-04-23 16:41:05.975414884 +0000 UTC m=+339.707169316"
Apr 23 16:41:05.994257 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:05.994205 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-7whst" podStartSLOduration=1.961272434 podStartE2EDuration="6.994194729s" podCreationTimestamp="2026-04-23 16:40:59 +0000 UTC" firstStartedPulling="2026-04-23 16:41:00.308575663 +0000 UTC m=+334.040330075" lastFinishedPulling="2026-04-23 16:41:05.341497956 +0000 UTC m=+339.073252370" observedRunningTime="2026-04-23 16:41:05.992695618 +0000 UTC m=+339.724450063" watchObservedRunningTime="2026-04-23 16:41:05.994194729 +0000 UTC m=+339.725949159"
Apr 23 16:41:06.011168 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:06.011124 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-6b94ff949c-rjwr6" podStartSLOduration=1.957739517 podStartE2EDuration="7.011112081s" podCreationTimestamp="2026-04-23 16:40:59 +0000 UTC" firstStartedPulling="2026-04-23 16:41:00.251760997 +0000 UTC m=+333.983515418" lastFinishedPulling="2026-04-23 16:41:05.30513356 +0000 UTC m=+339.036887982" observedRunningTime="2026-04-23 16:41:06.009949326 +0000 UTC m=+339.741703759" watchObservedRunningTime="2026-04-23 16:41:06.011112081 +0000 UTC m=+339.742866557"
Apr 23 16:41:11.963047 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:11.963017 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-7whst"
Apr 23 16:41:36.962524 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:36.962498 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-6b94ff949c-rjwr6"
Apr 23 16:41:36.965545 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:36.965525 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh"
Apr 23 16:41:38.169562 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.169531 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-v5rxh"]
Apr 23 16:41:38.169948 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.169737 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" podUID="b1733561-5659-4b34-b8ca-1def4211f7b3" containerName="manager" containerID="cri-o://291479e84c45f3751e1c9db026284b23fe05b6d0177215d692c5b3a35c71d422" gracePeriod=10
Apr 23 16:41:38.188735 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.188710 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-m8hq4"]
Apr 23 16:41:38.247132 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.247108 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-m8hq4"]
Apr 23 16:41:38.247252 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.247211 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-m8hq4"
Apr 23 16:41:38.286796 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.286773 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77ac9d03-a6ea-42ce-80ed-03a398854281-cert\") pod \"kserve-controller-manager-5b898d7b9d-m8hq4\" (UID: \"77ac9d03-a6ea-42ce-80ed-03a398854281\") " pod="kserve/kserve-controller-manager-5b898d7b9d-m8hq4"
Apr 23 16:41:38.286894 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.286806 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tg7q\" (UniqueName: \"kubernetes.io/projected/77ac9d03-a6ea-42ce-80ed-03a398854281-kube-api-access-7tg7q\") pod \"kserve-controller-manager-5b898d7b9d-m8hq4\" (UID: \"77ac9d03-a6ea-42ce-80ed-03a398854281\") " pod="kserve/kserve-controller-manager-5b898d7b9d-m8hq4"
Apr 23 16:41:38.387758 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.387730 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77ac9d03-a6ea-42ce-80ed-03a398854281-cert\") pod \"kserve-controller-manager-5b898d7b9d-m8hq4\" (UID: \"77ac9d03-a6ea-42ce-80ed-03a398854281\") " pod="kserve/kserve-controller-manager-5b898d7b9d-m8hq4"
Apr 23 16:41:38.387905 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.387777 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tg7q\" (UniqueName: \"kubernetes.io/projected/77ac9d03-a6ea-42ce-80ed-03a398854281-kube-api-access-7tg7q\") pod \"kserve-controller-manager-5b898d7b9d-m8hq4\" (UID: \"77ac9d03-a6ea-42ce-80ed-03a398854281\") " pod="kserve/kserve-controller-manager-5b898d7b9d-m8hq4"
Apr 23 16:41:38.390239 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.390205 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77ac9d03-a6ea-42ce-80ed-03a398854281-cert\") pod \"kserve-controller-manager-5b898d7b9d-m8hq4\" (UID: \"77ac9d03-a6ea-42ce-80ed-03a398854281\") " pod="kserve/kserve-controller-manager-5b898d7b9d-m8hq4"
Apr 23 16:41:38.396430 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.396409 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tg7q\" (UniqueName: \"kubernetes.io/projected/77ac9d03-a6ea-42ce-80ed-03a398854281-kube-api-access-7tg7q\") pod \"kserve-controller-manager-5b898d7b9d-m8hq4\" (UID: \"77ac9d03-a6ea-42ce-80ed-03a398854281\") " pod="kserve/kserve-controller-manager-5b898d7b9d-m8hq4"
Apr 23 16:41:38.422471 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.422427 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh"
Apr 23 16:41:38.488695 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.488669 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt592\" (UniqueName: \"kubernetes.io/projected/b1733561-5659-4b34-b8ca-1def4211f7b3-kube-api-access-xt592\") pod \"b1733561-5659-4b34-b8ca-1def4211f7b3\" (UID: \"b1733561-5659-4b34-b8ca-1def4211f7b3\") "
Apr 23 16:41:38.488824 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.488712 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1733561-5659-4b34-b8ca-1def4211f7b3-cert\") pod \"b1733561-5659-4b34-b8ca-1def4211f7b3\" (UID: \"b1733561-5659-4b34-b8ca-1def4211f7b3\") "
Apr 23 16:41:38.490616 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.490591 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1733561-5659-4b34-b8ca-1def4211f7b3-cert" (OuterVolumeSpecName: "cert") pod "b1733561-5659-4b34-b8ca-1def4211f7b3" (UID: "b1733561-5659-4b34-b8ca-1def4211f7b3"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:41:38.490699 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.490624 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1733561-5659-4b34-b8ca-1def4211f7b3-kube-api-access-xt592" (OuterVolumeSpecName: "kube-api-access-xt592") pod "b1733561-5659-4b34-b8ca-1def4211f7b3" (UID: "b1733561-5659-4b34-b8ca-1def4211f7b3"). InnerVolumeSpecName "kube-api-access-xt592". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:41:38.589730 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.589708 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xt592\" (UniqueName: \"kubernetes.io/projected/b1733561-5659-4b34-b8ca-1def4211f7b3-kube-api-access-xt592\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:41:38.589730 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.589727 2563 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1733561-5659-4b34-b8ca-1def4211f7b3-cert\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:41:38.620207 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.620186 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-m8hq4"
Apr 23 16:41:38.736305 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:38.736284 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-m8hq4"]
Apr 23 16:41:38.738848 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:41:38.738821 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77ac9d03_a6ea_42ce_80ed_03a398854281.slice/crio-1b60d71a3e17fa94c324d423b264ac17b6d541f477798816001c71a44efdc9a4 WatchSource:0}: Error finding container 1b60d71a3e17fa94c324d423b264ac17b6d541f477798816001c71a44efdc9a4: Status 404 returned error can't find the container with id 1b60d71a3e17fa94c324d423b264ac17b6d541f477798816001c71a44efdc9a4
Apr 23 16:41:39.065414 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:39.065385 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-m8hq4" event={"ID":"77ac9d03-a6ea-42ce-80ed-03a398854281","Type":"ContainerStarted","Data":"1b60d71a3e17fa94c324d423b264ac17b6d541f477798816001c71a44efdc9a4"}
Apr 23 16:41:39.066396 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:39.066366 2563 generic.go:358] "Generic (PLEG): container finished" podID="b1733561-5659-4b34-b8ca-1def4211f7b3" containerID="291479e84c45f3751e1c9db026284b23fe05b6d0177215d692c5b3a35c71d422" exitCode=0
Apr 23 16:41:39.066504 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:39.066430 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh"
Apr 23 16:41:39.066546 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:39.066427 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" event={"ID":"b1733561-5659-4b34-b8ca-1def4211f7b3","Type":"ContainerDied","Data":"291479e84c45f3751e1c9db026284b23fe05b6d0177215d692c5b3a35c71d422"}
Apr 23 16:41:39.066588 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:39.066541 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-v5rxh" event={"ID":"b1733561-5659-4b34-b8ca-1def4211f7b3","Type":"ContainerDied","Data":"da0aa8a56cd185567b5f90b58017cfc5e9a28d2aa34df7e4d15f028bf917738d"}
Apr 23 16:41:39.066588 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:39.066576 2563 scope.go:117] "RemoveContainer" containerID="291479e84c45f3751e1c9db026284b23fe05b6d0177215d692c5b3a35c71d422"
Apr 23 16:41:39.073741 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:39.073724 2563 scope.go:117] "RemoveContainer" containerID="291479e84c45f3751e1c9db026284b23fe05b6d0177215d692c5b3a35c71d422"
Apr 23 16:41:39.073989 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:41:39.073972 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"291479e84c45f3751e1c9db026284b23fe05b6d0177215d692c5b3a35c71d422\": container with ID starting with 291479e84c45f3751e1c9db026284b23fe05b6d0177215d692c5b3a35c71d422 not found: ID does not exist" containerID="291479e84c45f3751e1c9db026284b23fe05b6d0177215d692c5b3a35c71d422"
Apr 23 16:41:39.074040 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:39.074004 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"291479e84c45f3751e1c9db026284b23fe05b6d0177215d692c5b3a35c71d422"} err="failed to get container status \"291479e84c45f3751e1c9db026284b23fe05b6d0177215d692c5b3a35c71d422\": rpc error: code = NotFound desc = could not find container \"291479e84c45f3751e1c9db026284b23fe05b6d0177215d692c5b3a35c71d422\": container with ID starting with 291479e84c45f3751e1c9db026284b23fe05b6d0177215d692c5b3a35c71d422 not found: ID does not exist"
Apr 23 16:41:39.081765 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:39.081725 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-v5rxh"]
Apr 23 16:41:39.083677 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:39.083656 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-v5rxh"]
Apr 23 16:41:40.070350 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:40.070318 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-m8hq4" event={"ID":"77ac9d03-a6ea-42ce-80ed-03a398854281","Type":"ContainerStarted","Data":"0122d5ba0b68ebc90063eb84a48936ff4a2ac90d2c42273a37776be48727dca1"}
Apr 23 16:41:40.070757 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:40.070360 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-5b898d7b9d-m8hq4"
Apr 23 16:41:40.087720 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:40.087666 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-5b898d7b9d-m8hq4" podStartSLOduration=1.6394684160000002 podStartE2EDuration="2.087648668s" podCreationTimestamp="2026-04-23 16:41:38 +0000 UTC" firstStartedPulling="2026-04-23 16:41:38.740161506 +0000 UTC m=+372.471915917" lastFinishedPulling="2026-04-23 16:41:39.188341748 +0000 UTC m=+372.920096169" observedRunningTime="2026-04-23 16:41:40.086326141 +0000 UTC m=+373.818080575" watchObservedRunningTime="2026-04-23 16:41:40.087648668 +0000 UTC m=+373.819403102"
Apr 23 16:41:40.879477 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:40.879449 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1733561-5659-4b34-b8ca-1def4211f7b3" path="/var/lib/kubelet/pods/b1733561-5659-4b34-b8ca-1def4211f7b3/volumes"
Apr 23 16:41:46.053799 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:41:46.053762 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c585f9dfb-9njkf"]
Apr 23 16:42:11.075930 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.075884 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c585f9dfb-9njkf" podUID="d07bb1a7-e0b3-47c3-abc3-1f64da558ca7" containerName="console" containerID="cri-o://cf25362e71f1358e899960a452271d6213856e08ce1d579f3bf4bbeedd7c0d16" gracePeriod=15
Apr 23 16:42:11.079175 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.079154 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-5b898d7b9d-m8hq4"
Apr 23 16:42:11.315387 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.315366 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c585f9dfb-9njkf_d07bb1a7-e0b3-47c3-abc3-1f64da558ca7/console/0.log"
Apr 23 16:42:11.315512 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.315422 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:42:11.410197 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.410124 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-config\") pod \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") "
Apr 23 16:42:11.410197 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.410170 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-oauth-config\") pod \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") "
Apr 23 16:42:11.410197 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.410187 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-serving-cert\") pod \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") "
Apr 23 16:42:11.410448 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.410247 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-trusted-ca-bundle\") pod \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") "
Apr 23 16:42:11.410448 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.410281 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfwmd\" (UniqueName: \"kubernetes.io/projected/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-kube-api-access-jfwmd\") pod \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") "
Apr 23 16:42:11.410448 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.410322 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-oauth-serving-cert\") pod \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") "
Apr 23 16:42:11.410448 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.410347 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-service-ca\") pod \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\" (UID: \"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7\") "
Apr 23 16:42:11.410658 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.410538 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-config" (OuterVolumeSpecName: "console-config") pod "d07bb1a7-e0b3-47c3-abc3-1f64da558ca7" (UID: "d07bb1a7-e0b3-47c3-abc3-1f64da558ca7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:42:11.410764 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.410741 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d07bb1a7-e0b3-47c3-abc3-1f64da558ca7" (UID: "d07bb1a7-e0b3-47c3-abc3-1f64da558ca7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:42:11.410851 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.410823 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-service-ca" (OuterVolumeSpecName: "service-ca") pod "d07bb1a7-e0b3-47c3-abc3-1f64da558ca7" (UID: "d07bb1a7-e0b3-47c3-abc3-1f64da558ca7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:42:11.410851 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.410843 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d07bb1a7-e0b3-47c3-abc3-1f64da558ca7" (UID: "d07bb1a7-e0b3-47c3-abc3-1f64da558ca7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:42:11.412551 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.412527 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d07bb1a7-e0b3-47c3-abc3-1f64da558ca7" (UID: "d07bb1a7-e0b3-47c3-abc3-1f64da558ca7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:42:11.412647 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.412548 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-kube-api-access-jfwmd" (OuterVolumeSpecName: "kube-api-access-jfwmd") pod "d07bb1a7-e0b3-47c3-abc3-1f64da558ca7" (UID: "d07bb1a7-e0b3-47c3-abc3-1f64da558ca7"). InnerVolumeSpecName "kube-api-access-jfwmd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:42:11.412647 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.412554 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d07bb1a7-e0b3-47c3-abc3-1f64da558ca7" (UID: "d07bb1a7-e0b3-47c3-abc3-1f64da558ca7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:42:11.510943 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.510922 2563 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-config\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:42:11.510943 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.510942 2563 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-oauth-config\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:42:11.511066 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.510953 2563 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-console-serving-cert\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:42:11.511066 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.510963 2563 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-trusted-ca-bundle\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:42:11.511066 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.510971 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jfwmd\" (UniqueName: \"kubernetes.io/projected/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-kube-api-access-jfwmd\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:42:11.511066 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.510980 2563 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-oauth-serving-cert\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:42:11.511066 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:11.510988 2563 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7-service-ca\") on node \"ip-10-0-134-187.ec2.internal\" DevicePath \"\""
Apr 23 16:42:12.168318 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.168285 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c585f9dfb-9njkf_d07bb1a7-e0b3-47c3-abc3-1f64da558ca7/console/0.log"
Apr 23 16:42:12.168703 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.168323 2563 generic.go:358] "Generic (PLEG): container finished" podID="d07bb1a7-e0b3-47c3-abc3-1f64da558ca7" containerID="cf25362e71f1358e899960a452271d6213856e08ce1d579f3bf4bbeedd7c0d16" exitCode=2
Apr 23 16:42:12.168703 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.168360 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c585f9dfb-9njkf" event={"ID":"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7","Type":"ContainerDied","Data":"cf25362e71f1358e899960a452271d6213856e08ce1d579f3bf4bbeedd7c0d16"}
Apr 23 16:42:12.168703 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.168407 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c585f9dfb-9njkf" event={"ID":"d07bb1a7-e0b3-47c3-abc3-1f64da558ca7","Type":"ContainerDied","Data":"e6a035e672e38c4497de9f16917fa9ec633a4511c64e993cd60f2164c3e7d958"}
Apr 23 16:42:12.168703 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.168427 2563 scope.go:117] "RemoveContainer" containerID="cf25362e71f1358e899960a452271d6213856e08ce1d579f3bf4bbeedd7c0d16"
Apr 23 16:42:12.168703 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.168429 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c585f9dfb-9njkf"
Apr 23 16:42:12.177375 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.177352 2563 scope.go:117] "RemoveContainer" containerID="cf25362e71f1358e899960a452271d6213856e08ce1d579f3bf4bbeedd7c0d16"
Apr 23 16:42:12.177630 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:42:12.177611 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf25362e71f1358e899960a452271d6213856e08ce1d579f3bf4bbeedd7c0d16\": container with ID starting with cf25362e71f1358e899960a452271d6213856e08ce1d579f3bf4bbeedd7c0d16 not found: ID does not exist" containerID="cf25362e71f1358e899960a452271d6213856e08ce1d579f3bf4bbeedd7c0d16"
Apr 23 16:42:12.177696 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.177643 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf25362e71f1358e899960a452271d6213856e08ce1d579f3bf4bbeedd7c0d16"} err="failed to get container status \"cf25362e71f1358e899960a452271d6213856e08ce1d579f3bf4bbeedd7c0d16\": rpc error: code = NotFound desc = could not find container \"cf25362e71f1358e899960a452271d6213856e08ce1d579f3bf4bbeedd7c0d16\": container with ID starting with cf25362e71f1358e899960a452271d6213856e08ce1d579f3bf4bbeedd7c0d16 not found: ID does not exist"
Apr 23 16:42:12.210908 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.210878 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-5vdhd"]
Apr 23 16:42:12.211217 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.211203 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d07bb1a7-e0b3-47c3-abc3-1f64da558ca7" containerName="console"
Apr 23 16:42:12.211275 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.211220 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d07bb1a7-e0b3-47c3-abc3-1f64da558ca7" containerName="console"
Apr 23 16:42:12.211275 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.211242 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1733561-5659-4b34-b8ca-1def4211f7b3" containerName="manager"
Apr 23 16:42:12.211275 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.211250 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1733561-5659-4b34-b8ca-1def4211f7b3" containerName="manager"
Apr 23 16:42:12.211380 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.211317 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1733561-5659-4b34-b8ca-1def4211f7b3" containerName="manager"
Apr 23 16:42:12.211380 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.211328 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="d07bb1a7-e0b3-47c3-abc3-1f64da558ca7" containerName="console"
Apr 23 16:42:12.215499 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.215482 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-5vdhd"
Apr 23 16:42:12.219815 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.219794 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 23 16:42:12.219945 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.219896 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-7spks\""
Apr 23 16:42:12.220638 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.220613 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c585f9dfb-9njkf"]
Apr 23 16:42:12.225546 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.225526 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c585f9dfb-9njkf"]
Apr 23 16:42:12.232942 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.232923 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-5vdhd"]
Apr 23 16:42:12.317947 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.317915 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq2ct\" (UniqueName: \"kubernetes.io/projected/9bdf9eb5-c37b-44dc-aada-7afaa7bacfcf-kube-api-access-cq2ct\") pod \"odh-model-controller-696fc77849-5vdhd\" (UID: \"9bdf9eb5-c37b-44dc-aada-7afaa7bacfcf\") " pod="kserve/odh-model-controller-696fc77849-5vdhd"
Apr 23 16:42:12.318076 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.317960 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9bdf9eb5-c37b-44dc-aada-7afaa7bacfcf-cert\") pod \"odh-model-controller-696fc77849-5vdhd\" (UID: \"9bdf9eb5-c37b-44dc-aada-7afaa7bacfcf\") " pod="kserve/odh-model-controller-696fc77849-5vdhd"
Apr 23 16:42:12.418427 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.418360 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9bdf9eb5-c37b-44dc-aada-7afaa7bacfcf-cert\") pod \"odh-model-controller-696fc77849-5vdhd\" (UID: \"9bdf9eb5-c37b-44dc-aada-7afaa7bacfcf\") " pod="kserve/odh-model-controller-696fc77849-5vdhd"
Apr 23 16:42:12.418427 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.418424 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cq2ct\" (UniqueName: \"kubernetes.io/projected/9bdf9eb5-c37b-44dc-aada-7afaa7bacfcf-kube-api-access-cq2ct\") pod \"odh-model-controller-696fc77849-5vdhd\" (UID: \"9bdf9eb5-c37b-44dc-aada-7afaa7bacfcf\") " pod="kserve/odh-model-controller-696fc77849-5vdhd"
Apr 23 16:42:12.420554 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.420536 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9bdf9eb5-c37b-44dc-aada-7afaa7bacfcf-cert\") pod \"odh-model-controller-696fc77849-5vdhd\" (UID: \"9bdf9eb5-c37b-44dc-aada-7afaa7bacfcf\") " pod="kserve/odh-model-controller-696fc77849-5vdhd"
Apr 23 16:42:12.427322 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.427297 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq2ct\" (UniqueName: \"kubernetes.io/projected/9bdf9eb5-c37b-44dc-aada-7afaa7bacfcf-kube-api-access-cq2ct\") pod \"odh-model-controller-696fc77849-5vdhd\" (UID: \"9bdf9eb5-c37b-44dc-aada-7afaa7bacfcf\") " pod="kserve/odh-model-controller-696fc77849-5vdhd"
Apr 23 16:42:12.526401 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.526377 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-5vdhd"
Apr 23 16:42:12.643901 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.643878 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-5vdhd"]
Apr 23 16:42:12.645875 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:42:12.645844 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bdf9eb5_c37b_44dc_aada_7afaa7bacfcf.slice/crio-6fcfe5972518882c7047c8522b548b4bb28f774c6b41772971731ccfe0e9ab29 WatchSource:0}: Error finding container 6fcfe5972518882c7047c8522b548b4bb28f774c6b41772971731ccfe0e9ab29: Status 404 returned error can't find the container with id 6fcfe5972518882c7047c8522b548b4bb28f774c6b41772971731ccfe0e9ab29
Apr 23 16:42:12.879010 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:12.878978 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d07bb1a7-e0b3-47c3-abc3-1f64da558ca7" path="/var/lib/kubelet/pods/d07bb1a7-e0b3-47c3-abc3-1f64da558ca7/volumes"
Apr 23 16:42:13.172388 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:13.172312 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-5vdhd" event={"ID":"9bdf9eb5-c37b-44dc-aada-7afaa7bacfcf","Type":"ContainerStarted","Data":"6fcfe5972518882c7047c8522b548b4bb28f774c6b41772971731ccfe0e9ab29"}
Apr 23 16:42:16.184523 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:16.184490 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-5vdhd" event={"ID":"9bdf9eb5-c37b-44dc-aada-7afaa7bacfcf","Type":"ContainerStarted","Data":"ec5f1d11a61d89087b944ce1f60048e347827a4f789d4dad2ff66e67d5ed863a"}
Apr 23 16:42:16.184895 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:16.184595 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-5vdhd"
Apr 23 16:42:16.209549 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:16.209501 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-5vdhd" podStartSLOduration=1.626532821 podStartE2EDuration="4.209485321s" podCreationTimestamp="2026-04-23 16:42:12 +0000 UTC" firstStartedPulling="2026-04-23 16:42:12.647147968 +0000 UTC m=+406.378902382" lastFinishedPulling="2026-04-23 16:42:15.230100454 +0000 UTC m=+408.961854882" observedRunningTime="2026-04-23 16:42:16.2078348 +0000 UTC m=+409.939589256" watchObservedRunningTime="2026-04-23 16:42:16.209485321 +0000 UTC m=+409.941239757"
Apr 23 16:42:27.192607 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:27.192537 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-5vdhd"
Apr 23 16:42:46.272903 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.272820 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c"]
Apr 23 16:42:46.276054 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.276033 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c"
Apr 23 16:42:46.279617 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.279595 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 23 16:42:46.280915 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.280893 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\""
Apr 23 16:42:46.286160 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.286139 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c"]
Apr 23 16:42:46.356997 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.356968 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfjst\" (UniqueName: \"kubernetes.io/projected/1f6534cf-b9cf-44a5-9b3b-e421e7721237-kube-api-access-gfjst\") pod \"seaweedfs-tls-custom-5c88b85bb7-2zm9c\" (UID: \"1f6534cf-b9cf-44a5-9b3b-e421e7721237\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c"
Apr 23 16:42:46.357143 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.357000 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1f6534cf-b9cf-44a5-9b3b-e421e7721237-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-2zm9c\" (UID: \"1f6534cf-b9cf-44a5-9b3b-e421e7721237\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c"
Apr 23 16:42:46.357143 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.357018 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/1f6534cf-b9cf-44a5-9b3b-e421e7721237-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-2zm9c\" (UID: \"1f6534cf-b9cf-44a5-9b3b-e421e7721237\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c"
Apr 23 16:42:46.458397 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.458367 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfjst\" (UniqueName: \"kubernetes.io/projected/1f6534cf-b9cf-44a5-9b3b-e421e7721237-kube-api-access-gfjst\") pod \"seaweedfs-tls-custom-5c88b85bb7-2zm9c\" (UID: \"1f6534cf-b9cf-44a5-9b3b-e421e7721237\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c"
Apr 23 16:42:46.458513 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.458412 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1f6534cf-b9cf-44a5-9b3b-e421e7721237-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-2zm9c\" (UID: \"1f6534cf-b9cf-44a5-9b3b-e421e7721237\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c"
Apr 23 16:42:46.458513 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.458440 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/1f6534cf-b9cf-44a5-9b3b-e421e7721237-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-2zm9c\" (UID: \"1f6534cf-b9cf-44a5-9b3b-e421e7721237\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c"
Apr 23 16:42:46.458788 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.458765 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1f6534cf-b9cf-44a5-9b3b-e421e7721237-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-2zm9c\" (UID: \"1f6534cf-b9cf-44a5-9b3b-e421e7721237\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c"
Apr 23 16:42:46.460814 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.460797 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/1f6534cf-b9cf-44a5-9b3b-e421e7721237-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-2zm9c\" (UID: \"1f6534cf-b9cf-44a5-9b3b-e421e7721237\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c"
Apr 23 16:42:46.467314 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.467297 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfjst\" (UniqueName: \"kubernetes.io/projected/1f6534cf-b9cf-44a5-9b3b-e421e7721237-kube-api-access-gfjst\") pod \"seaweedfs-tls-custom-5c88b85bb7-2zm9c\" (UID: \"1f6534cf-b9cf-44a5-9b3b-e421e7721237\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c"
Apr 23 16:42:46.585412 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.585333 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c"
Apr 23 16:42:46.700287 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:46.700256 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c"]
Apr 23 16:42:46.702522 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:42:46.702499 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f6534cf_b9cf_44a5_9b3b_e421e7721237.slice/crio-f6e626f56e1973c73b28e65db278cec53d32981604d11da4e821319bc1d23210 WatchSource:0}: Error finding container f6e626f56e1973c73b28e65db278cec53d32981604d11da4e821319bc1d23210: Status 404 returned error can't find the container with id f6e626f56e1973c73b28e65db278cec53d32981604d11da4e821319bc1d23210
Apr 23 16:42:47.286942 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:47.286912 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c" event={"ID":"1f6534cf-b9cf-44a5-9b3b-e421e7721237","Type":"ContainerStarted","Data":"f0d7ce7dfb3284fbda679134ba43b032d4c6d320b15578db35a71ce4b3f95a47"}
Apr 23 16:42:47.286942 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:47.286947 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c" event={"ID":"1f6534cf-b9cf-44a5-9b3b-e421e7721237","Type":"ContainerStarted","Data":"f6e626f56e1973c73b28e65db278cec53d32981604d11da4e821319bc1d23210"} Apr 23 16:42:47.305827 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:42:47.305788 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2zm9c" podStartSLOduration=1.034459281 podStartE2EDuration="1.305774595s" podCreationTimestamp="2026-04-23 16:42:46 +0000 UTC" firstStartedPulling="2026-04-23 16:42:46.703768614 +0000 UTC m=+440.435523024" lastFinishedPulling="2026-04-23 16:42:46.975083927 +0000 UTC m=+440.706838338" observedRunningTime="2026-04-23 16:42:47.303449275 +0000 UTC m=+441.035203707" watchObservedRunningTime="2026-04-23 16:42:47.305774595 +0000 UTC m=+441.037529027" Apr 23 16:45:26.795208 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:45:26.795175 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 16:45:26.797447 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:45:26.797426 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 16:46:26.107360 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:46:26.107331 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg"] Apr 23 16:46:26.110365 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:46:26.110347 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg" Apr 23 16:46:26.112906 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:46:26.112887 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rtrj5\"" Apr 23 16:46:26.116554 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:46:26.116532 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg"] Apr 23 16:46:26.120054 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:46:26.120035 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg" Apr 23 16:46:26.237920 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:46:26.237899 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg"] Apr 23 16:46:26.240067 ip-10-0-134-187 kubenswrapper[2563]: W0423 16:46:26.240033 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7251a60b_d9fb_406f_8853_52bdd357600b.slice/crio-df682155b96f355de3266cba9c2e5f285556ce8d3526e61a349b1dfc5c1a63e9 WatchSource:0}: Error finding container df682155b96f355de3266cba9c2e5f285556ce8d3526e61a349b1dfc5c1a63e9: Status 404 returned error can't find the container with id df682155b96f355de3266cba9c2e5f285556ce8d3526e61a349b1dfc5c1a63e9 Apr 23 16:46:26.241996 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:46:26.241978 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:46:26.957010 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:46:26.956970 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg" 
event={"ID":"7251a60b-d9fb-406f-8853-52bdd357600b","Type":"ContainerStarted","Data":"df682155b96f355de3266cba9c2e5f285556ce8d3526e61a349b1dfc5c1a63e9"} Apr 23 16:46:27.961216 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:46:27.961143 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg" event={"ID":"7251a60b-d9fb-406f-8853-52bdd357600b","Type":"ContainerStarted","Data":"9a58856c28d1937df394bb634f7219a854498068fb9b975546bac4e0d2b7440f"} Apr 23 16:46:27.961582 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:46:27.961305 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg" Apr 23 16:46:27.962869 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:46:27.962848 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg" Apr 23 16:46:27.976581 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:46:27.976542 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg" podStartSLOduration=0.97457473 podStartE2EDuration="1.976529954s" podCreationTimestamp="2026-04-23 16:46:26 +0000 UTC" firstStartedPulling="2026-04-23 16:46:26.242109559 +0000 UTC m=+659.973863968" lastFinishedPulling="2026-04-23 16:46:27.244064767 +0000 UTC m=+660.975819192" observedRunningTime="2026-04-23 16:46:27.975694898 +0000 UTC m=+661.707449331" watchObservedRunningTime="2026-04-23 16:46:27.976529954 +0000 UTC m=+661.708284385" Apr 23 16:48:01.230494 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:48:01.230463 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-7f66cccfb6-mmpxg_7251a60b-d9fb-406f-8853-52bdd357600b/kserve-container/0.log" Apr 23 16:48:01.519385 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:48:01.519309 2563 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg"] Apr 23 16:48:01.519539 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:48:01.519519 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg" podUID="7251a60b-d9fb-406f-8853-52bdd357600b" containerName="kserve-container" containerID="cri-o://9a58856c28d1937df394bb634f7219a854498068fb9b975546bac4e0d2b7440f" gracePeriod=30 Apr 23 16:48:01.757739 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:48:01.757718 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg" Apr 23 16:48:02.257339 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:48:02.257299 2563 generic.go:358] "Generic (PLEG): container finished" podID="7251a60b-d9fb-406f-8853-52bdd357600b" containerID="9a58856c28d1937df394bb634f7219a854498068fb9b975546bac4e0d2b7440f" exitCode=2 Apr 23 16:48:02.257696 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:48:02.257388 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg" event={"ID":"7251a60b-d9fb-406f-8853-52bdd357600b","Type":"ContainerDied","Data":"9a58856c28d1937df394bb634f7219a854498068fb9b975546bac4e0d2b7440f"} Apr 23 16:48:02.257696 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:48:02.257428 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg" event={"ID":"7251a60b-d9fb-406f-8853-52bdd357600b","Type":"ContainerDied","Data":"df682155b96f355de3266cba9c2e5f285556ce8d3526e61a349b1dfc5c1a63e9"} Apr 23 16:48:02.257696 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:48:02.257444 2563 scope.go:117] "RemoveContainer" containerID="9a58856c28d1937df394bb634f7219a854498068fb9b975546bac4e0d2b7440f" Apr 23 16:48:02.257696 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:48:02.257399 
2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg" Apr 23 16:48:02.267487 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:48:02.267472 2563 scope.go:117] "RemoveContainer" containerID="9a58856c28d1937df394bb634f7219a854498068fb9b975546bac4e0d2b7440f" Apr 23 16:48:02.267714 ip-10-0-134-187 kubenswrapper[2563]: E0423 16:48:02.267695 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a58856c28d1937df394bb634f7219a854498068fb9b975546bac4e0d2b7440f\": container with ID starting with 9a58856c28d1937df394bb634f7219a854498068fb9b975546bac4e0d2b7440f not found: ID does not exist" containerID="9a58856c28d1937df394bb634f7219a854498068fb9b975546bac4e0d2b7440f" Apr 23 16:48:02.267770 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:48:02.267718 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a58856c28d1937df394bb634f7219a854498068fb9b975546bac4e0d2b7440f"} err="failed to get container status \"9a58856c28d1937df394bb634f7219a854498068fb9b975546bac4e0d2b7440f\": rpc error: code = NotFound desc = could not find container \"9a58856c28d1937df394bb634f7219a854498068fb9b975546bac4e0d2b7440f\": container with ID starting with 9a58856c28d1937df394bb634f7219a854498068fb9b975546bac4e0d2b7440f not found: ID does not exist" Apr 23 16:48:02.280832 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:48:02.280803 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg"] Apr 23 16:48:02.284218 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:48:02.284198 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-mmpxg"] Apr 23 16:48:02.879968 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:48:02.879931 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7251a60b-d9fb-406f-8853-52bdd357600b" path="/var/lib/kubelet/pods/7251a60b-d9fb-406f-8853-52bdd357600b/volumes" Apr 23 16:50:26.818985 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:50:26.818902 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 16:50:26.821844 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:50:26.821823 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 16:55:26.840196 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:55:26.840164 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 16:55:26.844521 ip-10-0-134-187 kubenswrapper[2563]: I0423 16:55:26.844501 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:00:26.861347 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:00:26.861314 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:00:26.865885 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:00:26.865863 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:05:26.887994 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:05:26.887961 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:05:26.892721 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:05:26.892699 2563 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:10:26.908770 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:10:26.908738 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:10:26.915488 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:10:26.915471 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:15:26.929102 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:15:26.928997 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:15:26.936622 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:15:26.936596 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:20:26.949880 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:20:26.949765 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:20:26.958522 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:20:26.958501 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:25:26.971882 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:25:26.971759 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:25:26.980394 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:25:26.980369 2563 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:30:26.991878 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:30:26.991851 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:30:27.001898 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:30:27.001876 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:35:27.012200 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:35:27.012088 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:35:27.024341 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:35:27.024320 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log" Apr 23 17:37:36.104349 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:36.104281 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-jkzhw_d0e44b1d-5f76-4a34-9664-f73402be5e2d/global-pull-secret-syncer/0.log" Apr 23 17:37:36.314176 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:36.314146 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-khsql_765656a2-d2b1-490f-b1db-11ff6b259036/konnectivity-agent/0.log" Apr 23 17:37:36.363785 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:36.363702 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-187.ec2.internal_9f2167486c68501ab6cd4222066784e7/haproxy/0.log" Apr 23 17:37:39.952675 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:39.952641 2563 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_60c8e366-8801-431c-8939-d761d6b95fcc/alertmanager/0.log" Apr 23 17:37:39.991587 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:39.991554 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_60c8e366-8801-431c-8939-d761d6b95fcc/config-reloader/0.log" Apr 23 17:37:40.042372 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:40.042338 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_60c8e366-8801-431c-8939-d761d6b95fcc/kube-rbac-proxy-web/0.log" Apr 23 17:37:40.077354 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:40.077309 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_60c8e366-8801-431c-8939-d761d6b95fcc/kube-rbac-proxy/0.log" Apr 23 17:37:40.120311 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:40.120261 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_60c8e366-8801-431c-8939-d761d6b95fcc/kube-rbac-proxy-metric/0.log" Apr 23 17:37:40.161075 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:40.161053 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_60c8e366-8801-431c-8939-d761d6b95fcc/prom-label-proxy/0.log" Apr 23 17:37:40.202189 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:40.202141 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_60c8e366-8801-431c-8939-d761d6b95fcc/init-config-reloader/0.log" Apr 23 17:37:40.299552 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:40.299516 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-m2ztp_568bdcdb-d09b-4f63-8775-e55efec84c8e/kube-state-metrics/0.log" Apr 23 17:37:40.338330 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:40.338285 2563 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-m2ztp_568bdcdb-d09b-4f63-8775-e55efec84c8e/kube-rbac-proxy-main/0.log" Apr 23 17:37:40.377352 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:40.377325 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-m2ztp_568bdcdb-d09b-4f63-8775-e55efec84c8e/kube-rbac-proxy-self/0.log" Apr 23 17:37:40.661597 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:40.661511 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtq25_ffcd3636-7eaf-487e-b56b-788842ea51bb/node-exporter/0.log" Apr 23 17:37:40.700818 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:40.700785 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtq25_ffcd3636-7eaf-487e-b56b-788842ea51bb/kube-rbac-proxy/0.log" Apr 23 17:37:40.738061 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:40.738034 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtq25_ffcd3636-7eaf-487e-b56b-788842ea51bb/init-textfile/0.log" Apr 23 17:37:40.781209 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:40.781178 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-v8g6g_607bdd08-a8f9-4d7e-8f45-913d893a9763/kube-rbac-proxy-main/0.log" Apr 23 17:37:40.823281 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:40.823250 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-v8g6g_607bdd08-a8f9-4d7e-8f45-913d893a9763/kube-rbac-proxy-self/0.log" Apr 23 17:37:40.874532 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:40.874494 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-v8g6g_607bdd08-a8f9-4d7e-8f45-913d893a9763/openshift-state-metrics/0.log" Apr 
23 17:37:43.151500 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.151442 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf"] Apr 23 17:37:43.151977 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.151852 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7251a60b-d9fb-406f-8853-52bdd357600b" containerName="kserve-container" Apr 23 17:37:43.151977 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.151867 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="7251a60b-d9fb-406f-8853-52bdd357600b" containerName="kserve-container" Apr 23 17:37:43.151977 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.151931 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="7251a60b-d9fb-406f-8853-52bdd357600b" containerName="kserve-container" Apr 23 17:37:43.154843 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.154826 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf" Apr 23 17:37:43.157485 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.157464 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lzvhv\"/\"openshift-service-ca.crt\"" Apr 23 17:37:43.158607 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.158591 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-lzvhv\"/\"default-dockercfg-xx7s6\"" Apr 23 17:37:43.158701 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.158608 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lzvhv\"/\"kube-root-ca.crt\"" Apr 23 17:37:43.163673 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.163651 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf"] Apr 23 17:37:43.317807 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.317773 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a0338e2-aca4-4428-8b3f-a66da7a41bab-lib-modules\") pod \"perf-node-gather-daemonset-qjgzf\" (UID: \"1a0338e2-aca4-4428-8b3f-a66da7a41bab\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf" Apr 23 17:37:43.318013 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.317826 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1a0338e2-aca4-4428-8b3f-a66da7a41bab-podres\") pod \"perf-node-gather-daemonset-qjgzf\" (UID: \"1a0338e2-aca4-4428-8b3f-a66da7a41bab\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf" Apr 23 17:37:43.318013 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.317845 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1a0338e2-aca4-4428-8b3f-a66da7a41bab-proc\") pod \"perf-node-gather-daemonset-qjgzf\" (UID: \"1a0338e2-aca4-4428-8b3f-a66da7a41bab\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf" Apr 23 17:37:43.318013 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.317864 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4b5r\" (UniqueName: \"kubernetes.io/projected/1a0338e2-aca4-4428-8b3f-a66da7a41bab-kube-api-access-q4b5r\") pod \"perf-node-gather-daemonset-qjgzf\" (UID: \"1a0338e2-aca4-4428-8b3f-a66da7a41bab\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf" Apr 23 17:37:43.318013 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.317954 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a0338e2-aca4-4428-8b3f-a66da7a41bab-sys\") pod \"perf-node-gather-daemonset-qjgzf\" (UID: \"1a0338e2-aca4-4428-8b3f-a66da7a41bab\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf" Apr 23 17:37:43.419181 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.419090 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a0338e2-aca4-4428-8b3f-a66da7a41bab-sys\") pod \"perf-node-gather-daemonset-qjgzf\" (UID: \"1a0338e2-aca4-4428-8b3f-a66da7a41bab\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf" Apr 23 17:37:43.419356 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.419211 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a0338e2-aca4-4428-8b3f-a66da7a41bab-sys\") pod \"perf-node-gather-daemonset-qjgzf\" (UID: \"1a0338e2-aca4-4428-8b3f-a66da7a41bab\") " 
pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf"
Apr 23 17:37:43.419356 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.419258 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a0338e2-aca4-4428-8b3f-a66da7a41bab-lib-modules\") pod \"perf-node-gather-daemonset-qjgzf\" (UID: \"1a0338e2-aca4-4428-8b3f-a66da7a41bab\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf"
Apr 23 17:37:43.419356 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.419312 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1a0338e2-aca4-4428-8b3f-a66da7a41bab-podres\") pod \"perf-node-gather-daemonset-qjgzf\" (UID: \"1a0338e2-aca4-4428-8b3f-a66da7a41bab\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf"
Apr 23 17:37:43.419356 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.419331 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1a0338e2-aca4-4428-8b3f-a66da7a41bab-proc\") pod \"perf-node-gather-daemonset-qjgzf\" (UID: \"1a0338e2-aca4-4428-8b3f-a66da7a41bab\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf"
Apr 23 17:37:43.419356 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.419356 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4b5r\" (UniqueName: \"kubernetes.io/projected/1a0338e2-aca4-4428-8b3f-a66da7a41bab-kube-api-access-q4b5r\") pod \"perf-node-gather-daemonset-qjgzf\" (UID: \"1a0338e2-aca4-4428-8b3f-a66da7a41bab\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf"
Apr 23 17:37:43.419565 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.419396 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1a0338e2-aca4-4428-8b3f-a66da7a41bab-podres\") pod \"perf-node-gather-daemonset-qjgzf\" (UID: \"1a0338e2-aca4-4428-8b3f-a66da7a41bab\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf"
Apr 23 17:37:43.419565 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.419422 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a0338e2-aca4-4428-8b3f-a66da7a41bab-lib-modules\") pod \"perf-node-gather-daemonset-qjgzf\" (UID: \"1a0338e2-aca4-4428-8b3f-a66da7a41bab\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf"
Apr 23 17:37:43.419565 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.419443 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1a0338e2-aca4-4428-8b3f-a66da7a41bab-proc\") pod \"perf-node-gather-daemonset-qjgzf\" (UID: \"1a0338e2-aca4-4428-8b3f-a66da7a41bab\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf"
Apr 23 17:37:43.427552 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.427529 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4b5r\" (UniqueName: \"kubernetes.io/projected/1a0338e2-aca4-4428-8b3f-a66da7a41bab-kube-api-access-q4b5r\") pod \"perf-node-gather-daemonset-qjgzf\" (UID: \"1a0338e2-aca4-4428-8b3f-a66da7a41bab\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf"
Apr 23 17:37:43.465454 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.465423 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf"
Apr 23 17:37:43.585415 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.585389 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf"]
Apr 23 17:37:43.587839 ip-10-0-134-187 kubenswrapper[2563]: W0423 17:37:43.587812 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1a0338e2_aca4_4428_8b3f_a66da7a41bab.slice/crio-39ed77f872712aac84ae4eff4fc40436f3fa3a7257edeb8fd1c3ff8129a3463c WatchSource:0}: Error finding container 39ed77f872712aac84ae4eff4fc40436f3fa3a7257edeb8fd1c3ff8129a3463c: Status 404 returned error can't find the container with id 39ed77f872712aac84ae4eff4fc40436f3fa3a7257edeb8fd1c3ff8129a3463c
Apr 23 17:37:43.589537 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.589520 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:37:43.819730 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.819691 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf" event={"ID":"1a0338e2-aca4-4428-8b3f-a66da7a41bab","Type":"ContainerStarted","Data":"8fe1a6de21130446b69e7dd5c7099b7aabc17694a5e365b680fb6deef99aebe3"}
Apr 23 17:37:43.819887 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.819736 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf" event={"ID":"1a0338e2-aca4-4428-8b3f-a66da7a41bab","Type":"ContainerStarted","Data":"39ed77f872712aac84ae4eff4fc40436f3fa3a7257edeb8fd1c3ff8129a3463c"}
Apr 23 17:37:43.819887 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.819768 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf"
Apr 23 17:37:43.838677 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:43.838617 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf" podStartSLOduration=0.838601188 podStartE2EDuration="838.601188ms" podCreationTimestamp="2026-04-23 17:37:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:37:43.837390516 +0000 UTC m=+3737.569144946" watchObservedRunningTime="2026-04-23 17:37:43.838601188 +0000 UTC m=+3737.570355620"
Apr 23 17:37:44.494397 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:44.494364 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-shfp9_75c16a06-c713-441e-ba98-548b432943dd/dns/0.log"
Apr 23 17:37:44.519430 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:44.519400 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-shfp9_75c16a06-c713-441e-ba98-548b432943dd/kube-rbac-proxy/0.log"
Apr 23 17:37:44.656099 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:44.656064 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w2bxv_ffb20d28-4839-4bfe-aa6f-83380eb3d9be/dns-node-resolver/0.log"
Apr 23 17:37:45.085516 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:45.085476 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-66b6dbc54f-dlrcz_f2764ea6-6412-43c9-9d81-1d51c9d17fe1/registry/0.log"
Apr 23 17:37:45.115805 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:45.115773 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9896x_455d942e-c133-4e0e-9c9c-c8f16c4d5e30/node-ca/0.log"
Apr 23 17:37:46.340485 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:46.340453 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ckfp9_56ca6981-847e-4fce-bb14-e0fa6f8fb697/serve-healthcheck-canary/0.log"
Apr 23 17:37:46.841178 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:46.841154 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f2c8s_a2d74b77-3933-4c89-8d3d-f2c3486cfb85/kube-rbac-proxy/0.log"
Apr 23 17:37:46.870837 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:46.870815 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f2c8s_a2d74b77-3933-4c89-8d3d-f2c3486cfb85/exporter/0.log"
Apr 23 17:37:46.895136 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:46.895105 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f2c8s_a2d74b77-3933-4c89-8d3d-f2c3486cfb85/extractor/0.log"
Apr 23 17:37:49.067048 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:49.067017 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-5b898d7b9d-m8hq4_77ac9d03-a6ea-42ce-80ed-03a398854281/manager/0.log"
Apr 23 17:37:49.091513 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:49.091482 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-6b94ff949c-rjwr6_07341306-029c-4f82-8c30-b46c907ce7ac/manager/0.log"
Apr 23 17:37:49.487181 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:49.487152 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-5vdhd_9bdf9eb5-c37b-44dc-aada-7afaa7bacfcf/manager/0.log"
Apr 23 17:37:49.599616 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:49.599582 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-7whst_eef38706-0912-4929-ab16-545eb88d2cb8/seaweedfs/0.log"
Apr 23 17:37:49.626203 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:49.626168 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-2zm9c_1f6534cf-b9cf-44a5-9b3b-e421e7721237/seaweedfs-tls-custom/0.log"
Apr 23 17:37:49.832819 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:49.832747 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-qjgzf"
Apr 23 17:37:55.260462 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:55.260433 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4k96t_9db0f82d-4208-44c8-a818-ed7fcbd374fa/kube-multus-additional-cni-plugins/0.log"
Apr 23 17:37:55.282288 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:55.282262 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4k96t_9db0f82d-4208-44c8-a818-ed7fcbd374fa/egress-router-binary-copy/0.log"
Apr 23 17:37:55.305416 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:55.305387 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4k96t_9db0f82d-4208-44c8-a818-ed7fcbd374fa/cni-plugins/0.log"
Apr 23 17:37:55.331121 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:55.331096 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4k96t_9db0f82d-4208-44c8-a818-ed7fcbd374fa/bond-cni-plugin/0.log"
Apr 23 17:37:55.353537 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:55.353517 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4k96t_9db0f82d-4208-44c8-a818-ed7fcbd374fa/routeoverride-cni/0.log"
Apr 23 17:37:55.376457 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:55.376432 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4k96t_9db0f82d-4208-44c8-a818-ed7fcbd374fa/whereabouts-cni-bincopy/0.log"
Apr 23 17:37:55.399476 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:55.399449 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4k96t_9db0f82d-4208-44c8-a818-ed7fcbd374fa/whereabouts-cni/0.log"
Apr 23 17:37:55.829690 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:55.829638 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zghhl_47257c1b-a9bb-4228-abc5-2ba95fa73db4/kube-multus/0.log"
Apr 23 17:37:55.945889 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:55.945861 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kpgxm_c1dab98e-8f79-4056-94f4-9185da61ca34/network-metrics-daemon/0.log"
Apr 23 17:37:55.968999 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:55.968971 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kpgxm_c1dab98e-8f79-4056-94f4-9185da61ca34/kube-rbac-proxy/0.log"
Apr 23 17:37:57.156067 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:57.156028 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-controller/0.log"
Apr 23 17:37:57.176161 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:57.176132 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/0.log"
Apr 23 17:37:57.210177 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:57.210143 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovn-acl-logging/1.log"
Apr 23 17:37:57.233774 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:57.233744 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/kube-rbac-proxy-node/0.log"
Apr 23 17:37:57.260060 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:57.260034 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 17:37:57.283645 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:57.283621 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/northd/0.log"
Apr 23 17:37:57.315347 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:57.315315 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/nbdb/0.log"
Apr 23 17:37:57.341117 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:57.341080 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/sbdb/0.log"
Apr 23 17:37:57.514502 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:57.514476 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc9pq_2f90e3aa-3501-4d70-8aed-0b0959ac4c07/ovnkube-controller/0.log"
Apr 23 17:37:59.069682 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:37:59.069644 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-pz92q_92efbb3d-8bd0-413e-b306-331d80df0505/network-check-target-container/0.log"
Apr 23 17:38:00.106242 ip-10-0-134-187 kubenswrapper[2563]: I0423 17:38:00.106195 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qbf55_51af9790-dfdc-4e37-824c-072fa2141017/iptables-alerter/0.log"