Apr 24 21:28:57.937649 ip-10-0-135-27 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:28:58.302811 ip-10-0-135-27 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:28:58.302811 ip-10-0-135-27 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:28:58.302811 ip-10-0-135-27 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:28:58.302811 ip-10-0-135-27 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:28:58.302811 ip-10-0-135-27 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
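The deprecation warnings above all point to the same remedy: move the flags into the KubeletConfiguration file named by --config. A minimal sketch of what that migration could look like for the flags warned about here (field values are illustrative assumptions, not taken from this node's actual config):

```yaml
# Sketch of /etc/kubernetes/kubelet.conf (KubeletConfiguration) replacing the
# deprecated flags from the log above. Values are placeholders, not this node's.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (example reservations)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: "100Mi"
```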
Apr 24 21:28:58.304860 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.304775 2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:28:58.307630 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307616 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:28:58.307630 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307630 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307634 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307637 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307640 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307643 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307646 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307648 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307651 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307654 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307657 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307659 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307662 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307664 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307667 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307679 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307682 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307686 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307689 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307691 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:28:58.307693 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307694 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307697 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307701 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307704 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307708 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307712 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307715 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307718 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307720 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307723 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307726 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307728 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307731 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307734 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307736 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307739 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307742 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307744 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307747 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307750 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:28:58.308170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307752 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307755 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307757 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307760 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307762 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307765 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307767 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307770 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307772 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307775 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307777 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307780 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307782 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307786 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307789 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307792 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307795 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307797 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307800 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307803 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:28:58.308677 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307806 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307808 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307811 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307814 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307816 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307819 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307822 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307825 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307830 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307833 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307836 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307839 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307842 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307845 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307848 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307851 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307854 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307856 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307859 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307861 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:28:58.309170 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307864 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307866 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307869 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307872 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307874 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.307877 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308231 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308235 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308238 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308241 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308244 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308247 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308250 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308252 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308255 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308257 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308260 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308263 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308265 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308268 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:28:58.309681 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308270 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308273 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308275 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308278 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308281 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308283 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308286 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308288 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308291 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308293 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308296 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308299 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308301 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308304 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308306 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308309 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308311 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308314 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308317 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308320 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:28:58.310186 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308323 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308326 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308328 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308331 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308334 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308336 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308338 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308341 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308344 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308360 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308363 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308365 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308368 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308371 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308373 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308377 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308380 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308383 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308385 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308388 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:28:58.310720 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308391 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308393 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308396 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308398 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308402 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308405 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308408 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308411 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308413 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308416 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308419 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308424 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
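The long runs of `unrecognized feature gate` warnings above come from OpenShift-specific gate names being handed to the kubelet's embedded Kubernetes feature-gate registry, which only knows upstream gates; it warns and continues rather than failing startup. A minimal sketch of that warn-and-skip behavior (illustrative only, not the actual `feature_gate.go` implementation; the gate names in `knownGates` are assumptions):

```go
package main

import "fmt"

// knownGates stands in for the kubelet's registered feature gates.
// The entries are illustrative, not the real upstream registry.
var knownGates = map[string]bool{
	"KMSv1":                          true,
	"ServiceAccountTokenNodeBinding": true,
}

// setGates applies the requested gates, warning (instead of failing)
// on names the registry does not recognize, mirroring the
// warn-and-continue behavior seen in the log above.
func setGates(requested map[string]bool) map[string]bool {
	applied := map[string]bool{}
	for name, val := range requested {
		if !knownGates[name] {
			fmt.Printf("W feature_gate: unrecognized feature gate: %s\n", name)
			continue // skipped, kubelet startup proceeds
		}
		applied[name] = val
	}
	return applied
}

func main() {
	applied := setGates(map[string]bool{
		"KMSv1":               true,
		"RouteAdvertisements": true, // OpenShift-only gate: warned and skipped
	})
	fmt.Println(len(applied)) // prints 1
}
```

Because unknown gates are only warned about, these messages are noise from the kubelet's perspective: the OpenShift-specific gates are consumed by other operators, not by the kubelet itself.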
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308427 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308430 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308433 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308436 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308438 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308441 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308444 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:28:58.311203 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308446 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308449 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308452 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308455 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308457 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308460 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308463 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308465 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308468 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308472 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308475 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308478 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.308480 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309250 2566 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309258 2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309265 2566 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309270 2566 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309274 2566 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309278 2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309282 2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309286 2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:28:58.311700 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309289 2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309292 2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309296 2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309300 2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309303 2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309306 2566 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309309 2566 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309312 2566 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309315 2566 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309318 2566 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309321 2566 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309325 2566 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309327 2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309331 2566 flags.go:64] FLAG: --config-dir=""
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309333 2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309337 2566 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309341 2566 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309344 2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309360 2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309364 2566 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309367 2566 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309371 2566 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309374 2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309377 2566 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309381 2566 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:28:58.312210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309385 2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309388 2566 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309391 2566 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309394 2566 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309397 2566 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309400 2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309405 2566 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309408 2566 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309412 2566 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309415 2566 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309418 2566 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309422 2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309427 2566 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309430 2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309433 2566 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309436 2566 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309439 2566 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309442 2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309445 2566 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309448 2566 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309451 2566 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309454 2566 flags.go:64] FLAG: --feature-gates=""
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309457 2566 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309460 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309464 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 21:28:58.312831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309467 2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309471 2566 flags.go:64] FLAG: --healthz-port="10248"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309474 2566 flags.go:64] FLAG: --help="false"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309477 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-135-27.ec2.internal"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309480 2566 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309483 2566 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309486 2566 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309489 2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309493 2566 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309496 2566 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309499 2566 flags.go:64] FLAG: --image-service-endpoint=""
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309502 2566 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309505 2566 flags.go:64] FLAG: --kube-api-burst="100"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309508 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309511 2566 flags.go:64] FLAG: --kube-api-qps="50"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309514 2566 flags.go:64] FLAG: --kube-reserved=""
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309517 2566 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309520 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309523 2566 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309526 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309529 2566 flags.go:64] FLAG: --lock-file=""
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309532 2566 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309535 2566 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309538 2566 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 24 21:28:58.313483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309543 2566 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309546 2566 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309549 2566 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309552 2566 flags.go:64] FLAG: --logging-format="text"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309555 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309558 2566 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309561 2566 flags.go:64] FLAG: --manifest-url=""
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309569 2566 flags.go:64] FLAG: --manifest-url-header=""
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309574 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309577 2566 flags.go:64] FLAG: --max-open-files="1000000"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309581 2566 flags.go:64] FLAG: --max-pods="110"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309584 2566 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309587 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309590 2566 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309593 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309596 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309599 2566 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309602 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309610 2566 flags.go:64] FLAG: --node-status-max-images="50"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309614 2566 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309617 2566 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309622 2566 flags.go:64] FLAG: --pod-cidr=""
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309625 2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 24 21:28:58.314076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309630 2566 flags.go:64] FLAG: --pod-manifest-path=""
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309633 2566 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309636 2566 flags.go:64] FLAG: --pods-per-core="0"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309639 2566 flags.go:64] FLAG: --port="10250"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309642 2566 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309645 2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0792fa76877353769"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309648 2566 flags.go:64] FLAG: --qos-reserved=""
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309651 2566 flags.go:64] FLAG: --read-only-port="10255"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309654 2566 flags.go:64] FLAG: --register-node="true"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309657 2566 flags.go:64] FLAG: --register-schedulable="true"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309660 2566 flags.go:64] FLAG: --register-with-taints=""
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309669 2566 flags.go:64] FLAG: --registry-burst="10"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309672 2566 flags.go:64] FLAG: --registry-qps="5"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309675 2566 flags.go:64] FLAG: --reserved-cpus=""
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309677 2566 flags.go:64] FLAG: --reserved-memory=""
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309681 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309686 2566 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309689 2566 flags.go:64] FLAG: --rotate-certificates="false"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309693 2566 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309696 2566 flags.go:64] FLAG: --runonce="false"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309699 2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309702 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309705 2566 flags.go:64] FLAG: --seccomp-default="false"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309708 2566 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309711 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309714 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 24 21:28:58.314638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309717 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309720 2566 flags.go:64] FLAG: --storage-driver-password="root"
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309723 2566 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309726 2566 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309729 2566 flags.go:64] FLAG: --storage-driver-user="root"
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309732 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309735 2566 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309738 2566 flags.go:64] FLAG: --system-cgroups=""
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309741 2566 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309747 2566 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309750 2566 flags.go:64] FLAG: --tls-cert-file=""
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309752 2566 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309756 2566 flags.go:64] FLAG: --tls-min-version=""
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309759 2566 flags.go:64] FLAG: --tls-private-key-file=""
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309762 2566 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309764 2566 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309767 2566 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309770 2566 flags.go:64] FLAG: --v="2"
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309777 2566 flags.go:64] FLAG: --version="false"
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309781 2566 flags.go:64] FLAG: --vmodule=""
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309785 2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.309788 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309878 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309881 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:28:58.315408 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309884 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309887 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309890 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309892 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309895 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309897 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309900 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309903 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309905 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309908 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309910 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309913 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309916 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309919 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309921 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309924 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309927 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309929 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309932 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309934 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:28:58.316001 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309937 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309939 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309942 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309944 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309947 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309949 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309952 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309954 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309965 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309968 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309972 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309975 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309977 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309980 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309983 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309985 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309988 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309991 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309993 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309996 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:28:58.316556 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.309999 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310002 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310005 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310007 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310010 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310014 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310018 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310022 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310024 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310027 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310030 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310033 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310035 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310038 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310040 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310043 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310045 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310048 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310050 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:28:58.317045 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310052 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310055 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310058 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310061 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310064 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310067 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310069 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310072 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310075 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310077 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310080 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310082 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310085 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310087 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310091 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310093 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310096 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310099 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310103 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310105 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:28:58.317772 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310108 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:28:58.318543 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310110 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:28:58.318543 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310113 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:28:58.318543 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310115 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:28:58.318543 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.310118 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:28:58.318543 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.310891 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:28:58.318543 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.317130 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:28:58.318543 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.317149 2566 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:28:58.318543 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317362 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:28:58.318543 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317371 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:28:58.318543 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317377 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:28:58.318543 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317381 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:28:58.318543 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317386 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:28:58.318543 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317397 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:28:58.318543 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317402 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:28:58.318543 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317406 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317410 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317414 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317419 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317423 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317427 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317432 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317436 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317440 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317445 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317454 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317460 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317466 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317472 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317476 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317480 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317485 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317489 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317493 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:28:58.318922 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317497 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317504 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317510 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317520 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317525 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317531 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317536 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317540 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317544 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317549 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317553 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317557 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317562 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317566 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317570 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317575 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317584 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317588 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317593 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:28:58.319424 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317597 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317601 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317606 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317610 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317614 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317619 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317630 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317635 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317639 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317643 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317652 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317656 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317661 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317665 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317669 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317674 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317678 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317683 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317688 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317693 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:28:58.319889 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317697 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317701 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317711 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317715 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317720 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317724 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317728 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317732 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317736 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317741 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317745 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317749 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317753 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317757 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317761 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317770 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317774 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317779 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317783 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317788 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:28:58.320430 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.317792 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:28:58.320923 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.317801 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:28:58.320923 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318300 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:28:58.320923 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318313 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:28:58.320923 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318317 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:28:58.320923 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318320 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:28:58.320923 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318323 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:28:58.320923 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318325 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:28:58.320923 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318329 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:28:58.320923 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318332 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:28:58.320923 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318335 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:28:58.320923 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318337 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:28:58.320923 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318340 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:28:58.320923 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318343 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:28:58.320923 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318363 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:28:58.320923 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318369 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318372 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318375 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318378 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318382 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318384 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318387 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318389 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318392 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318394 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318397 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318400 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318402 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318404 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318407 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318410 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318412 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318415 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318418 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318422 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:28:58.321280 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318426 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318430 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318433 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318435 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318438 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318441 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318444 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318446 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318449 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318452 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318454 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318457 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318459 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318462 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318464 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318467 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318469 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318472 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318474 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318477 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:28:58.321855 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318479 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318482 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318485 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318487 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318490 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318492 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318495 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318498 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318501 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318503 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318506 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318509 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318512 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318515 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318517 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318520 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318523 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318525 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318528 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:28:58.322425 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318531 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:28:58.322897 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318533 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:28:58.322897 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318536 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:28:58.322897 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318538 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:28:58.322897 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318541 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:28:58.322897 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318544 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:28:58.322897 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318546 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:28:58.322897 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318549 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:28:58.322897 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318552 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:28:58.322897 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318555 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:28:58.322897 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318557 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:28:58.322897 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318560 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:28:58.322897 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318563 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:28:58.322897 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:58.318566 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:28:58.322897 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.318571 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:28:58.322897 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.319223 2566 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:28:58.323270 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.321278 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:28:58.323270 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.322100 2566 server.go:1019] "Starting client certificate rotation"
Apr 24 21:28:58.323270 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.322195 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:28:58.323270 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.322928 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:28:58.344028 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.344008 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:28:58.346263 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.346243 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:28:58.361326 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.361305 2566 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:28:58.366281 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.366264 2566 log.go:25] "Validated CRI v1 image API"
Apr 24 21:28:58.368075 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.368061 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:28:58.371332 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.371315 2566 fs.go:135] Filesystem UUIDs: map[0cc3ada7-15be-42e0-ad90-594a0f62fed4:/dev/nvme0n1p4 6da42673-e3a6-4775-8d3e-19e05f766396:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 24 21:28:58.371407 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.371333 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:28:58.372448 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.372428 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:28:58.377849 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.377744 2566 manager.go:217] Machine: {Timestamp:2026-04-24 21:28:58.37608868 +0000 UTC m=+0.337853691 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3107786 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28a42eeb75a938723c4844a3a6a3ab SystemUUID:ec28a42e-eb75-a938-723c-4844a3a6a3ab BootID:25ec2b6a-c1bf-4e78-9c25-880987024a86 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a9:f5:44:8d:7d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a9:f5:44:8d:7d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b6:1d:bd:b0:eb:16 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:28:58.377849 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.377844 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:28:58.377999 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.377916 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:28:58.378296 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.378275 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:28:58.378454 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.378297 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-27.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 21:28:58.378497 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.378460 2566 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 21:28:58.378497 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.378468 2566 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 21:28:58.378497 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.378481 2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:28:58.379316 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.379297 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:28:58.380269 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.380260 2566 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:28:58.380391 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.380373 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 21:28:58.382539 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.382530 2566 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 21:28:58.382578 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.382547 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 21:28:58.382578 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.382558 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 21:28:58.382646 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.382593 2566 kubelet.go:397] "Adding apiserver pod source"
Apr 24 21:28:58.382646 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.382601 2566 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 21:28:58.385049 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.385017 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:28:58.385214 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.385199 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:28:58.388256 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.388240 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 21:28:58.389612 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.389599 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 21:28:58.391326 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.391309 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 21:28:58.391326 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.391328 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 21:28:58.391458 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.391334 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 21:28:58.391458 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.391340 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 21:28:58.391458 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.391361 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 21:28:58.391458 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.391368 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 21:28:58.391458 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.391374 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 21:28:58.391458 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.391379 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 21:28:58.391458 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.391387 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 21:28:58.391458 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.391392 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 21:28:58.391458 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.391401 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 21:28:58.391458
ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.391410 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:28:58.392121 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.392111 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:28:58.392121 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.392121 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:28:58.395591 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.395577 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:28:58.395664 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.395610 2566 server.go:1295] "Started kubelet" Apr 24 21:28:58.395716 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.395676 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:28:58.395785 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.395709 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:28:58.395844 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.395811 2566 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:28:58.396157 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.396131 2566 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-27.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:28:58.396287 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.396230 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-27.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:28:58.396426 ip-10-0-135-27 
kubenswrapper[2566]: E0424 21:28:58.396405 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:28:58.396529 ip-10-0-135-27 systemd[1]: Started Kubernetes Kubelet. Apr 24 21:28:58.397013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.396960 2566 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:28:58.397013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.396982 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:28:58.399185 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.399167 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zztsp" Apr 24 21:28:58.403165 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.403147 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:28:58.403590 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.403575 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:28:58.404328 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.404308 2566 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:28:58.404498 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.404486 2566 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:28:58.404609 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.403523 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-27.ec2.internal.18a9683cd171b5db default 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-27.ec2.internal,UID:ip-10-0-135-27.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-27.ec2.internal,},FirstTimestamp:2026-04-24 21:28:58.395588059 +0000 UTC m=+0.357353070,LastTimestamp:2026-04-24 21:28:58.395588059 +0000 UTC m=+0.357353070,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-27.ec2.internal,}" Apr 24 21:28:58.404609 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.404420 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-27.ec2.internal\" not found" Apr 24 21:28:58.404609 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.404591 2566 factory.go:55] Registering systemd factory Apr 24 21:28:58.404836 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.404625 2566 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:28:58.404836 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.404424 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:28:58.404836 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.404705 2566 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:28:58.404836 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.404714 2566 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:28:58.404836 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.404811 2566 factory.go:153] Registering CRI-O factory Apr 24 21:28:58.404836 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.404820 2566 factory.go:223] Registration of the crio container factory successfully Apr 24 21:28:58.405122 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.404861 2566 factory.go:221] Registration of the containerd container factory 
failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:28:58.405122 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.404880 2566 factory.go:103] Registering Raw factory Apr 24 21:28:58.405122 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.404890 2566 manager.go:1196] Started watching for new ooms in manager Apr 24 21:28:58.406020 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.405992 2566 manager.go:319] Starting recovery of all containers Apr 24 21:28:58.407836 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.407724 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zztsp" Apr 24 21:28:58.409481 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.409456 2566 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-27.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:28:58.409583 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.409564 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:28:58.417289 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.417182 2566 manager.go:324] Recovery completed Apr 24 21:28:58.421265 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.421253 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:58.423580 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.423564 2566 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:58.423637 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.423591 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:58.423637 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.423602 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:58.424022 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.424009 2566 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:28:58.424022 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.424019 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:28:58.424102 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.424032 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:28:58.425740 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.425681 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-27.ec2.internal.18a9683cd31cd19a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-27.ec2.internal,UID:ip-10-0-135-27.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-27.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-27.ec2.internal,},FirstTimestamp:2026-04-24 21:28:58.423579034 +0000 UTC m=+0.385344047,LastTimestamp:2026-04-24 21:28:58.423579034 +0000 UTC m=+0.385344047,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-27.ec2.internal,}" Apr 24 21:28:58.426537 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.426521 2566 policy_none.go:49] "None policy: Start" Apr 24 21:28:58.426537 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.426538 2566 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:28:58.426642 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.426549 2566 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:28:58.457942 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.457925 2566 manager.go:341] "Starting Device Plugin manager" Apr 24 21:28:58.458024 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.457975 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:28:58.458024 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.457988 2566 server.go:85] "Starting device plugin registration server" Apr 24 21:28:58.458215 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.458201 2566 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:28:58.458296 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.458217 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:28:58.458776 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.458415 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:28:58.458776 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.458504 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:28:58.458776 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.458512 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:28:58.459195 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.458930 2566 eviction_manager.go:267] "eviction manager: failed to check if 
we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:28:58.459195 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.458986 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-27.ec2.internal\" not found" Apr 24 21:28:58.532693 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.532669 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:28:58.534739 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.533859 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:28:58.534739 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.533885 2566 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:28:58.534739 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.533902 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 21:28:58.534739 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.533908 2566 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:28:58.534739 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.533945 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:28:58.536275 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.536258 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:28:58.559245 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.559204 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:58.560045 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.560030 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:58.560103 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.560057 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:58.560103 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.560069 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:58.560103 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.560089 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.568484 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.568469 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.568539 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.568489 2566 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-27.ec2.internal\": node \"ip-10-0-135-27.ec2.internal\" not found" Apr 24 21:28:58.594424 
ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.594403 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-27.ec2.internal\" not found" Apr 24 21:28:58.634194 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.634167 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-27.ec2.internal"] Apr 24 21:28:58.634268 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.634227 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:58.635599 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.635584 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:58.635676 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.635611 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:58.635676 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.635620 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:58.637750 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.637738 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:58.637882 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.637868 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.637931 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.637896 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:58.638392 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.638378 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:58.638460 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.638382 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:58.638460 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.638422 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:58.638460 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.638432 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:58.638560 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.638402 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:58.638560 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.638487 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:58.640525 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.640512 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.640569 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.640534 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:58.641077 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.641058 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:58.641157 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.641088 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:58.641157 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.641104 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:58.668853 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.668829 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-27.ec2.internal\" not found" node="ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.673034 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.673018 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-27.ec2.internal\" not found" node="ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.694572 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.694553 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-27.ec2.internal\" not found" Apr 24 21:28:58.706164 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.706146 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3cc646a1a5872f0951b32e3c6a73034c-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal\" (UID: \"3cc646a1a5872f0951b32e3c6a73034c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.706218 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.706169 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3cc646a1a5872f0951b32e3c6a73034c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal\" (UID: \"3cc646a1a5872f0951b32e3c6a73034c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.706218 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.706186 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/737b9e94dfd820d8429e803872b0624c-config\") pod \"kube-apiserver-proxy-ip-10-0-135-27.ec2.internal\" (UID: \"737b9e94dfd820d8429e803872b0624c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.794816 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.794795 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-27.ec2.internal\" not found" Apr 24 21:28:58.807220 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.807200 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/737b9e94dfd820d8429e803872b0624c-config\") pod \"kube-apiserver-proxy-ip-10-0-135-27.ec2.internal\" (UID: \"737b9e94dfd820d8429e803872b0624c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.807272 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.807231 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/737b9e94dfd820d8429e803872b0624c-config\") 
pod \"kube-apiserver-proxy-ip-10-0-135-27.ec2.internal\" (UID: \"737b9e94dfd820d8429e803872b0624c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.807272 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.807266 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3cc646a1a5872f0951b32e3c6a73034c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal\" (UID: \"3cc646a1a5872f0951b32e3c6a73034c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.807332 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.807285 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3cc646a1a5872f0951b32e3c6a73034c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal\" (UID: \"3cc646a1a5872f0951b32e3c6a73034c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.807332 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.807313 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3cc646a1a5872f0951b32e3c6a73034c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal\" (UID: \"3cc646a1a5872f0951b32e3c6a73034c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.807409 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.807376 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3cc646a1a5872f0951b32e3c6a73034c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal\" (UID: \"3cc646a1a5872f0951b32e3c6a73034c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal" Apr 24 
21:28:58.895609 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.895563 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-27.ec2.internal\" not found" Apr 24 21:28:58.971064 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.971033 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.975501 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:58.975485 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-27.ec2.internal" Apr 24 21:28:58.996138 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:58.996118 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-27.ec2.internal\" not found" Apr 24 21:28:59.096853 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:59.096827 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-27.ec2.internal\" not found" Apr 24 21:28:59.197448 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:59.197390 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-27.ec2.internal\" not found" Apr 24 21:28:59.297952 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:59.297927 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-27.ec2.internal\" not found" Apr 24 21:28:59.322418 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.322390 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 21:28:59.322899 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.322563 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: 
k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:28:59.398196 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:59.398174 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-27.ec2.internal\" not found" Apr 24 21:28:59.403248 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.403230 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:28:59.414027 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.414007 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:28:59.415744 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.415715 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:23:58 +0000 UTC" deadline="2027-12-27 08:53:28.996088755 +0000 UTC" Apr 24 21:28:59.415744 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.415743 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14675h24m29.580348525s" Apr 24 21:28:59.419721 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.419702 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:28:59.443476 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.443454 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-kr7r8" Apr 24 21:28:59.452021 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.451975 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-kr7r8" Apr 24 21:28:59.497831 ip-10-0-135-27 kubenswrapper[2566]: W0424 
21:28:59.497806 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cc646a1a5872f0951b32e3c6a73034c.slice/crio-df4e4ca7db183ef8264b59c32068bf551acb0a21b0317f1f990dc5cf8c8337ff WatchSource:0}: Error finding container df4e4ca7db183ef8264b59c32068bf551acb0a21b0317f1f990dc5cf8c8337ff: Status 404 returned error can't find the container with id df4e4ca7db183ef8264b59c32068bf551acb0a21b0317f1f990dc5cf8c8337ff Apr 24 21:28:59.498258 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:28:59.498235 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod737b9e94dfd820d8429e803872b0624c.slice/crio-1d73a3dc24de1c6cb78563b85c1ca6141c31c5386dce7f2d7c18010f04c5d4fb WatchSource:0}: Error finding container 1d73a3dc24de1c6cb78563b85c1ca6141c31c5386dce7f2d7c18010f04c5d4fb: Status 404 returned error can't find the container with id 1d73a3dc24de1c6cb78563b85c1ca6141c31c5386dce7f2d7c18010f04c5d4fb Apr 24 21:28:59.498258 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:59.498245 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-27.ec2.internal\" not found" Apr 24 21:28:59.503553 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.503533 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:28:59.536375 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.536324 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-27.ec2.internal" event={"ID":"737b9e94dfd820d8429e803872b0624c","Type":"ContainerStarted","Data":"1d73a3dc24de1c6cb78563b85c1ca6141c31c5386dce7f2d7c18010f04c5d4fb"} Apr 24 21:28:59.537284 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.537263 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal" event={"ID":"3cc646a1a5872f0951b32e3c6a73034c","Type":"ContainerStarted","Data":"df4e4ca7db183ef8264b59c32068bf551acb0a21b0317f1f990dc5cf8c8337ff"} Apr 24 21:28:59.598463 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:59.598442 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-27.ec2.internal\" not found" Apr 24 21:28:59.698979 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:28:59.698956 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-27.ec2.internal\" not found" Apr 24 21:28:59.703906 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.703866 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:28:59.803849 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.803829 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal" Apr 24 21:28:59.815691 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.815666 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:28:59.819088 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.819066 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-27.ec2.internal" Apr 24 21:28:59.827911 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.827890 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:28:59.905731 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:28:59.905705 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:29:00.384624 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.384598 2566 apiserver.go:52] "Watching apiserver" Apr 24 21:29:00.391843 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.391822 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:29:00.392914 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.392889 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-ltjjp","openshift-dns/node-resolver-88hpr","openshift-multus/multus-additional-cni-plugins-m5nxz","openshift-multus/network-metrics-daemon-q52j5","kube-system/kube-apiserver-proxy-ip-10-0-135-27.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g","openshift-image-registry/node-ca-4rz25","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal","openshift-multus/multus-n6qzm","openshift-network-diagnostics/network-check-target-tpck9","openshift-network-operator/iptables-alerter-p55z9","openshift-ovn-kubernetes/ovnkube-node-qccqq","kube-system/konnectivity-agent-2qsqx"] Apr 24 21:29:00.397848 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.397824 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4rz25" Apr 24 21:29:00.399990 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.399935 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-88hpr" Apr 24 21:29:00.400239 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.400203 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:29:00.400316 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.400239 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:29:00.400378 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.400325 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zsbhm\"" Apr 24 21:29:00.400464 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.400444 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:29:00.401946 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.401914 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pgp7d\"" Apr 24 21:29:00.402105 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.402084 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.402700 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.402674 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:29:00.402700 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.402700 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:29:00.404114 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.404096 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:29:00.404278 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.404256 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:29:00.404409 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.404392 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:29:00.404549 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.404531 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:00.404647 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:00.404612 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2" Apr 24 21:29:00.404647 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.404641 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.404967 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.404914 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:29:00.405176 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.405145 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:29:00.405255 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.405203 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zlgt8\"" Apr 24 21:29:00.407240 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.407179 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:29:00.407240 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.407191 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:29:00.407240 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.407178 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.407240 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.407227 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:29:00.407465 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.407182 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4nqjp\"" Apr 24 21:29:00.409471 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.409404 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:29:00.409471 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.409432 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-hf4gz\"" Apr 24 21:29:00.409627 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.409530 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:29:00.409675 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.409629 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.411492 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.411476 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-j6dh8\"" Apr 24 21:29:00.411793 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.411774 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:29:00.412029 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.412011 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9" Apr 24 21:29:00.412123 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:00.412070 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f" Apr 24 21:29:00.414221 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.414202 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-p55z9" Apr 24 21:29:00.416390 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.416370 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:29:00.416528 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.416372 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4ca03565-9d72-4813-9179-7636908c9bf5-hosts-file\") pod \"node-resolver-88hpr\" (UID: \"4ca03565-9d72-4813-9179-7636908c9bf5\") " pod="openshift-dns/node-resolver-88hpr" Apr 24 21:29:00.416528 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.416417 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-sysconfig\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.416528 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.416443 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-sys\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.416528 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.416503 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/918a384b-26d7-496e-b3c9-370b6e526ebd-host\") pod \"node-ca-4rz25\" (UID: \"918a384b-26d7-496e-b3c9-370b6e526ebd\") " pod="openshift-image-registry/node-ca-4rz25" Apr 24 21:29:00.416721 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.416535 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-sys-fs\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.416721 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.416559 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-modprobe-d\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.417059 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417025 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-tmp\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.417152 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417079 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hs9d\" (UniqueName: \"kubernetes.io/projected/e2991f6d-3be3-4ee8-afa1-f01aef64092a-kube-api-access-7hs9d\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.417243 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417121 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmgxn\" (UniqueName: \"kubernetes.io/projected/4ca03565-9d72-4813-9179-7636908c9bf5-kube-api-access-hmgxn\") pod \"node-resolver-88hpr\" (UID: \"4ca03565-9d72-4813-9179-7636908c9bf5\") " pod="openshift-dns/node-resolver-88hpr" Apr 24 21:29:00.417307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417260 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/126ae8c8-5c56-4044-b72b-57fc091713c4-cnibin\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.417307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417292 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.417307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417291 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:29:00.417497 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417082 2566 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-f4t5m\"" Apr 24 21:29:00.417497 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417323 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-registration-dir\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.417497 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417374 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-host\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.417497 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417380 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:29:00.417497 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417406 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/918a384b-26d7-496e-b3c9-370b6e526ebd-serviceca\") pod \"node-ca-4rz25\" (UID: \"918a384b-26d7-496e-b3c9-370b6e526ebd\") " pod="openshift-image-registry/node-ca-4rz25" Apr 24 21:29:00.417497 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417432 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/126ae8c8-5c56-4044-b72b-57fc091713c4-system-cni-dir\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " 
pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.417497 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417463 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-var-lib-kubelet\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.417838 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417556 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/126ae8c8-5c56-4044-b72b-57fc091713c4-os-release\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.417838 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417588 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/126ae8c8-5c56-4044-b72b-57fc091713c4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.417838 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417619 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-etc-selinux\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.417838 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417652 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.417838 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417653 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-sysctl-conf\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.417838 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417832 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-systemd\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.418165 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417870 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-run\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.418165 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417908 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq9cn\" (UniqueName: \"kubernetes.io/projected/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-kube-api-access-gq9cn\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.418165 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417943 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/4ca03565-9d72-4813-9179-7636908c9bf5-tmp-dir\") pod \"node-resolver-88hpr\" (UID: \"4ca03565-9d72-4813-9179-7636908c9bf5\") " pod="openshift-dns/node-resolver-88hpr" Apr 24 21:29:00.418165 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.417977 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/126ae8c8-5c56-4044-b72b-57fc091713c4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.418165 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.418036 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9hps\" (UniqueName: \"kubernetes.io/projected/126ae8c8-5c56-4044-b72b-57fc091713c4-kube-api-access-b9hps\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.418165 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.418074 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-device-dir\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.418470 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.418107 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5r9w\" (UniqueName: \"kubernetes.io/projected/918a384b-26d7-496e-b3c9-370b6e526ebd-kube-api-access-z5r9w\") pod \"node-ca-4rz25\" (UID: \"918a384b-26d7-496e-b3c9-370b6e526ebd\") " pod="openshift-image-registry/node-ca-4rz25" Apr 24 21:29:00.418470 
ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.418233 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/126ae8c8-5c56-4044-b72b-57fc091713c4-cni-binary-copy\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.418470 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.418266 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/126ae8c8-5c56-4044-b72b-57fc091713c4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.418470 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.418298 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-socket-dir\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.418643 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.418503 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-tuned\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.418643 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.418539 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs\") pod \"network-metrics-daemon-q52j5\" (UID: \"186d543e-d0f4-4e11-ac28-e8ebb35c72a2\") " pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:00.418643 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.418569 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfqxb\" (UniqueName: \"kubernetes.io/projected/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-kube-api-access-kfqxb\") pod \"network-metrics-daemon-q52j5\" (UID: \"186d543e-d0f4-4e11-ac28-e8ebb35c72a2\") " pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:00.418828 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.418671 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-kubernetes\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.418828 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.418705 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-sysctl-d\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.418828 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.418746 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-lib-modules\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.419189 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.419168 
2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:29:00.421034 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.420839 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:29:00.421034 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.420931 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-mk6gs\"" Apr 24 21:29:00.421034 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.420934 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2qsqx" Apr 24 21:29:00.421034 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.421012 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:29:00.421335 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.421317 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:29:00.421417 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.421367 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:29:00.421417 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.421380 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:29:00.422224 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.422202 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:29:00.423318 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.423286 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:29:00.423427 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.423375 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:29:00.423489 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.423377 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-v6jc2\"" Apr 24 21:29:00.452613 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.452586 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:23:59 +0000 UTC" deadline="2027-09-21 00:59:09.232224552 +0000 UTC" Apr 24 21:29:00.452613 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.452612 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12339h30m8.779615689s" Apr 24 21:29:00.505890 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.505868 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:29:00.519527 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519503 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-systemd\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.519638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519534 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-run\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.519638 ip-10-0-135-27 kubenswrapper[2566]: I0424 
21:29:00.519562 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-os-release\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.519638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519587 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-run-k8s-cni-cncf-io\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.519638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519612 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-slash\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.519638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519627 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-systemd\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.519638 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519634 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-node-log\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.519915 ip-10-0-135-27 
kubenswrapper[2566]: I0424 21:29:00.519660 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ca03565-9d72-4813-9179-7636908c9bf5-tmp-dir\") pod \"node-resolver-88hpr\" (UID: \"4ca03565-9d72-4813-9179-7636908c9bf5\") " pod="openshift-dns/node-resolver-88hpr" Apr 24 21:29:00.519915 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519703 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-run\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.519915 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519730 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-var-lib-kubelet\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.519915 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519746 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55ef112d-fe31-4b57-808d-d33898e3e457-multus-daemon-config\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.519915 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519768 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-etc-openvswitch\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.519915 ip-10-0-135-27 
kubenswrapper[2566]: I0424 21:29:00.519795 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/126ae8c8-5c56-4044-b72b-57fc091713c4-cni-binary-copy\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.519915 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519829 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/126ae8c8-5c56-4044-b72b-57fc091713c4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.519915 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519855 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-socket-dir\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.519915 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519887 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-tuned\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.519915 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519908 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-run-netns\") pod \"ovnkube-node-qccqq\" (UID: 
\"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.520383 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519931 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e9028851-049a-4814-809a-8ffbb08d8ce7-konnectivity-ca\") pod \"konnectivity-agent-2qsqx\" (UID: \"e9028851-049a-4814-809a-8ffbb08d8ce7\") " pod="kube-system/konnectivity-agent-2qsqx" Apr 24 21:29:00.520383 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519957 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfqxb\" (UniqueName: \"kubernetes.io/projected/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-kube-api-access-kfqxb\") pod \"network-metrics-daemon-q52j5\" (UID: \"186d543e-d0f4-4e11-ac28-e8ebb35c72a2\") " pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:00.520383 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519974 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ca03565-9d72-4813-9179-7636908c9bf5-tmp-dir\") pod \"node-resolver-88hpr\" (UID: \"4ca03565-9d72-4813-9179-7636908c9bf5\") " pod="openshift-dns/node-resolver-88hpr" Apr 24 21:29:00.520383 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.519990 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-kubernetes\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.520383 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520013 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-sysctl-d\") pod 
\"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.520383 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520057 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-lib-modules\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.520383 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520086 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-var-lib-cni-multus\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.520383 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520130 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-log-socket\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.520383 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520157 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-sysctl-d\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.520383 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520201 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4ca03565-9d72-4813-9179-7636908c9bf5-hosts-file\") 
pod \"node-resolver-88hpr\" (UID: \"4ca03565-9d72-4813-9179-7636908c9bf5\") " pod="openshift-dns/node-resolver-88hpr" Apr 24 21:29:00.520383 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520270 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-lib-modules\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.520879 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520435 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/126ae8c8-5c56-4044-b72b-57fc091713c4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.520879 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520536 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-socket-dir\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.520879 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520730 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/126ae8c8-5c56-4044-b72b-57fc091713c4-cni-binary-copy\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.520879 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520828 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:29:00.520879 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520870 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-kubernetes\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.521101 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520157 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4ca03565-9d72-4813-9179-7636908c9bf5-hosts-file\") pod \"node-resolver-88hpr\" (UID: \"4ca03565-9d72-4813-9179-7636908c9bf5\") " pod="openshift-dns/node-resolver-88hpr" Apr 24 21:29:00.521101 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520920 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-sysconfig\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.521101 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520946 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-sys\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.521101 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520975 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-systemd-units\") pod 
\"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.521101 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.520998 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5a001c67-cdac-4386-8012-1386fdcf8bbd-ovnkube-script-lib\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.521101 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521026 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/918a384b-26d7-496e-b3c9-370b6e526ebd-host\") pod \"node-ca-4rz25\" (UID: \"918a384b-26d7-496e-b3c9-370b6e526ebd\") " pod="openshift-image-registry/node-ca-4rz25" Apr 24 21:29:00.521101 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521045 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-sys\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.521101 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521049 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-sys-fs\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.521101 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521090 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-modprobe-d\") pod 
\"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.521101 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521096 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-sys-fs\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.521504 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521113 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-tmp\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.521504 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521139 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-multus-cni-dir\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.521504 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521152 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/918a384b-26d7-496e-b3c9-370b6e526ebd-host\") pod \"node-ca-4rz25\" (UID: \"918a384b-26d7-496e-b3c9-370b6e526ebd\") " pod="openshift-image-registry/node-ca-4rz25" Apr 24 21:29:00.521504 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521160 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55ef112d-fe31-4b57-808d-d33898e3e457-cni-binary-copy\") pod \"multus-n6qzm\" (UID: 
\"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.521504 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521183 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-run-multus-certs\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.521504 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521204 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-multus-conf-dir\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.521504 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521231 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-modprobe-d\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.521504 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521231 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hs9d\" (UniqueName: \"kubernetes.io/projected/e2991f6d-3be3-4ee8-afa1-f01aef64092a-kube-api-access-7hs9d\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.521504 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521449 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-sysconfig\") 
pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.521504 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521458 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmgxn\" (UniqueName: \"kubernetes.io/projected/4ca03565-9d72-4813-9179-7636908c9bf5-kube-api-access-hmgxn\") pod \"node-resolver-88hpr\" (UID: \"4ca03565-9d72-4813-9179-7636908c9bf5\") " pod="openshift-dns/node-resolver-88hpr" Apr 24 21:29:00.521504 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521502 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/126ae8c8-5c56-4044-b72b-57fc091713c4-cnibin\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521534 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521565 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-multus-socket-dir-parent\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521591 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/918a384b-26d7-496e-b3c9-370b6e526ebd-serviceca\") pod \"node-ca-4rz25\" (UID: \"918a384b-26d7-496e-b3c9-370b6e526ebd\") " pod="openshift-image-registry/node-ca-4rz25" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521613 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/126ae8c8-5c56-4044-b72b-57fc091713c4-system-cni-dir\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521615 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/126ae8c8-5c56-4044-b72b-57fc091713c4-cnibin\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521639 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-var-lib-kubelet\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521665 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-system-cni-dir\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521680 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521704 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82cb470a-4ce5-4007-a453-0ac73804ef24-host-slash\") pod \"iptables-alerter-p55z9\" (UID: \"82cb470a-4ce5-4007-a453-0ac73804ef24\") " pod="openshift-network-operator/iptables-alerter-p55z9" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521757 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-var-lib-kubelet\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521808 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/126ae8c8-5c56-4044-b72b-57fc091713c4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521820 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/126ae8c8-5c56-4044-b72b-57fc091713c4-system-cni-dir\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.522013 ip-10-0-135-27 
kubenswrapper[2566]: I0424 21:29:00.521840 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-sysctl-conf\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521866 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gq9cn\" (UniqueName: \"kubernetes.io/projected/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-kube-api-access-gq9cn\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521890 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-hostroot\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521912 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-kubelet\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.522013 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521944 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5a001c67-cdac-4386-8012-1386fdcf8bbd-ovn-node-metrics-cert\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521960 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-sysctl-conf\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.521971 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84zg7\" (UniqueName: \"kubernetes.io/projected/5a001c67-cdac-4386-8012-1386fdcf8bbd-kube-api-access-84zg7\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522000 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/126ae8c8-5c56-4044-b72b-57fc091713c4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522004 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/918a384b-26d7-496e-b3c9-370b6e526ebd-serviceca\") pod \"node-ca-4rz25\" (UID: \"918a384b-26d7-496e-b3c9-370b6e526ebd\") " pod="openshift-image-registry/node-ca-4rz25" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522027 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9hps\" (UniqueName: \"kubernetes.io/projected/126ae8c8-5c56-4044-b72b-57fc091713c4-kube-api-access-b9hps\") pod 
\"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522055 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-run-netns\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522073 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-run-ovn\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522089 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-cni-bin\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522103 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-cni-netd\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522118 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" 
(UniqueName: \"kubernetes.io/secret/e9028851-049a-4814-809a-8ffbb08d8ce7-agent-certs\") pod \"konnectivity-agent-2qsqx\" (UID: \"e9028851-049a-4814-809a-8ffbb08d8ce7\") " pod="kube-system/konnectivity-agent-2qsqx" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522135 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-device-dir\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522152 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5r9w\" (UniqueName: \"kubernetes.io/projected/918a384b-26d7-496e-b3c9-370b6e526ebd-kube-api-access-z5r9w\") pod \"node-ca-4rz25\" (UID: \"918a384b-26d7-496e-b3c9-370b6e526ebd\") " pod="openshift-image-registry/node-ca-4rz25" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522172 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs\") pod \"network-metrics-daemon-q52j5\" (UID: \"186d543e-d0f4-4e11-ac28-e8ebb35c72a2\") " pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522197 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b75lj\" (UniqueName: \"kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj\") pod \"network-check-target-tpck9\" (UID: \"9e66f6e5-929b-4807-810a-ad84e15bb98f\") " pod="openshift-network-diagnostics/network-check-target-tpck9" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522221 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-var-lib-cni-bin\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522245 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-etc-kubernetes\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.522777 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522299 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65qzk\" (UniqueName: \"kubernetes.io/projected/55ef112d-fe31-4b57-808d-d33898e3e457-kube-api-access-65qzk\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522326 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/126ae8c8-5c56-4044-b72b-57fc091713c4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522331 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5a001c67-cdac-4386-8012-1386fdcf8bbd-ovnkube-config\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522388 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-device-dir\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522397 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/82cb470a-4ce5-4007-a453-0ac73804ef24-iptables-alerter-script\") pod \"iptables-alerter-p55z9\" (UID: \"82cb470a-4ce5-4007-a453-0ac73804ef24\") " pod="openshift-network-operator/iptables-alerter-p55z9" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522423 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-run-systemd\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522452 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-registration-dir\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522475 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-host\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522502 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-cnibin\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522525 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9gw2\" (UniqueName: \"kubernetes.io/projected/82cb470a-4ce5-4007-a453-0ac73804ef24-kube-api-access-w9gw2\") pod \"iptables-alerter-p55z9\" (UID: \"82cb470a-4ce5-4007-a453-0ac73804ef24\") " pod="openshift-network-operator/iptables-alerter-p55z9" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522548 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-var-lib-openvswitch\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522584 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522606 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-run-openvswitch\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522624 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-host\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522628 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522631 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/126ae8c8-5c56-4044-b72b-57fc091713c4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522584 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-registration-dir\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.523532 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:00.522647 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:29:00.524254 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522676 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5a001c67-cdac-4386-8012-1386fdcf8bbd-env-overrides\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.524254 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522745 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/126ae8c8-5c56-4044-b72b-57fc091713c4-os-release\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.524254 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:00.522781 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs podName:186d543e-d0f4-4e11-ac28-e8ebb35c72a2 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:01.022737317 +0000 UTC m=+2.984502336 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs") pod "network-metrics-daemon-q52j5" (UID: "186d543e-d0f4-4e11-ac28-e8ebb35c72a2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:29:00.524254 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522802 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/126ae8c8-5c56-4044-b72b-57fc091713c4-os-release\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.524254 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522816 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-etc-selinux\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.524254 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.522947 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e2991f6d-3be3-4ee8-afa1-f01aef64092a-etc-selinux\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.524475 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.524457 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-tmp\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.524475 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.524470 
2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-etc-tuned\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.533171 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.533147 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfqxb\" (UniqueName: \"kubernetes.io/projected/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-kube-api-access-kfqxb\") pod \"network-metrics-daemon-q52j5\" (UID: \"186d543e-d0f4-4e11-ac28-e8ebb35c72a2\") " pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:00.534398 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.534373 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hs9d\" (UniqueName: \"kubernetes.io/projected/e2991f6d-3be3-4ee8-afa1-f01aef64092a-kube-api-access-7hs9d\") pod \"aws-ebs-csi-driver-node-z4m8g\" (UID: \"e2991f6d-3be3-4ee8-afa1-f01aef64092a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.534755 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.534732 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmgxn\" (UniqueName: \"kubernetes.io/projected/4ca03565-9d72-4813-9179-7636908c9bf5-kube-api-access-hmgxn\") pod \"node-resolver-88hpr\" (UID: \"4ca03565-9d72-4813-9179-7636908c9bf5\") " pod="openshift-dns/node-resolver-88hpr" Apr 24 21:29:00.534916 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.534890 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5r9w\" (UniqueName: \"kubernetes.io/projected/918a384b-26d7-496e-b3c9-370b6e526ebd-kube-api-access-z5r9w\") pod \"node-ca-4rz25\" (UID: \"918a384b-26d7-496e-b3c9-370b6e526ebd\") " pod="openshift-image-registry/node-ca-4rz25" Apr 24 21:29:00.535209 ip-10-0-135-27 
kubenswrapper[2566]: I0424 21:29:00.535182 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9hps\" (UniqueName: \"kubernetes.io/projected/126ae8c8-5c56-4044-b72b-57fc091713c4-kube-api-access-b9hps\") pod \"multus-additional-cni-plugins-m5nxz\" (UID: \"126ae8c8-5c56-4044-b72b-57fc091713c4\") " pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.535209 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.535201 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq9cn\" (UniqueName: \"kubernetes.io/projected/ce3cc5f8-62fb-41a2-9be9-2e22c637379e-kube-api-access-gq9cn\") pod \"tuned-ltjjp\" (UID: \"ce3cc5f8-62fb-41a2-9be9-2e22c637379e\") " pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.623432 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623401 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9gw2\" (UniqueName: \"kubernetes.io/projected/82cb470a-4ce5-4007-a453-0ac73804ef24-kube-api-access-w9gw2\") pod \"iptables-alerter-p55z9\" (UID: \"82cb470a-4ce5-4007-a453-0ac73804ef24\") " pod="openshift-network-operator/iptables-alerter-p55z9" Apr 24 21:29:00.623591 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623440 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-var-lib-openvswitch\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.623591 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623462 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-qccqq\" (UID: 
\"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.623591 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623537 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-var-lib-openvswitch\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.623591 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623575 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.623810 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623608 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-run-openvswitch\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.623810 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623636 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.623810 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623659 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/5a001c67-cdac-4386-8012-1386fdcf8bbd-env-overrides\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.623810 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623673 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-run-openvswitch\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.623810 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623684 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-os-release\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.623810 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623709 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-run-k8s-cni-cncf-io\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.623810 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623722 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.623810 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623734 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-slash\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.623810 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623756 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-node-log\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.623810 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623781 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-var-lib-kubelet\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.623810 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623801 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-os-release\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623812 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-run-k8s-cni-cncf-io\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623821 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-var-lib-kubelet\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623818 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55ef112d-fe31-4b57-808d-d33898e3e457-multus-daemon-config\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623845 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-node-log\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623873 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-slash\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623932 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-etc-openvswitch\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623965 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-run-netns\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.623991 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e9028851-049a-4814-809a-8ffbb08d8ce7-konnectivity-ca\") pod \"konnectivity-agent-2qsqx\" (UID: \"e9028851-049a-4814-809a-8ffbb08d8ce7\") " pod="kube-system/konnectivity-agent-2qsqx" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624023 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-var-lib-cni-multus\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624029 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-etc-openvswitch\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624048 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-log-socket\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624055 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-run-netns\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624076 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-systemd-units\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624112 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5a001c67-cdac-4386-8012-1386fdcf8bbd-ovnkube-script-lib\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624118 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-var-lib-cni-multus\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624123 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-log-socket\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624142 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-multus-cni-dir\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.624307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624161 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-systemd-units\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624165 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55ef112d-fe31-4b57-808d-d33898e3e457-cni-binary-copy\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624158 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5a001c67-cdac-4386-8012-1386fdcf8bbd-env-overrides\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624205 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-run-multus-certs\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624220 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-multus-cni-dir\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624234 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-multus-conf-dir\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624258 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-run-multus-certs\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624263 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-multus-socket-dir-parent\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624290 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-system-cni-dir\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624295 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-multus-conf-dir\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624315 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82cb470a-4ce5-4007-a453-0ac73804ef24-host-slash\") pod \"iptables-alerter-p55z9\" (UID: \"82cb470a-4ce5-4007-a453-0ac73804ef24\") " pod="openshift-network-operator/iptables-alerter-p55z9" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624340 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-multus-socket-dir-parent\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624344 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-hostroot\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624378 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55ef112d-fe31-4b57-808d-d33898e3e457-multus-daemon-config\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624384 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-system-cni-dir\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624397 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-kubelet\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624395 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82cb470a-4ce5-4007-a453-0ac73804ef24-host-slash\") pod \"iptables-alerter-p55z9\" (UID: \"82cb470a-4ce5-4007-a453-0ac73804ef24\") " pod="openshift-network-operator/iptables-alerter-p55z9" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624432 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5a001c67-cdac-4386-8012-1386fdcf8bbd-ovn-node-metrics-cert\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.625104 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624434 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-hostroot\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624437 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-kubelet\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624469 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84zg7\" (UniqueName: \"kubernetes.io/projected/5a001c67-cdac-4386-8012-1386fdcf8bbd-kube-api-access-84zg7\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624500 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-run-netns\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624526 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-run-ovn\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624535 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e9028851-049a-4814-809a-8ffbb08d8ce7-konnectivity-ca\") pod \"konnectivity-agent-2qsqx\" (UID: \"e9028851-049a-4814-809a-8ffbb08d8ce7\") " pod="kube-system/konnectivity-agent-2qsqx" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624551 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-cni-bin\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624588 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-run-netns\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624592 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-cni-bin\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624605 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-run-ovn\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624611 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-cni-netd\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624632 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/55ef112d-fe31-4b57-808d-d33898e3e457-cni-binary-copy\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624636 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e9028851-049a-4814-809a-8ffbb08d8ce7-agent-certs\") pod \"konnectivity-agent-2qsqx\" (UID: \"e9028851-049a-4814-809a-8ffbb08d8ce7\") " pod="kube-system/konnectivity-agent-2qsqx" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624690 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b75lj\" (UniqueName: \"kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj\") pod \"network-check-target-tpck9\" (UID: \"9e66f6e5-929b-4807-810a-ad84e15bb98f\") " pod="openshift-network-diagnostics/network-check-target-tpck9" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624707 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-host-cni-netd\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624719 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-var-lib-cni-bin\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624770 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-etc-kubernetes\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624797 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65qzk\" (UniqueName: \"kubernetes.io/projected/55ef112d-fe31-4b57-808d-d33898e3e457-kube-api-access-65qzk\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.625757 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624822 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5a001c67-cdac-4386-8012-1386fdcf8bbd-ovnkube-config\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.626431 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624845 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/82cb470a-4ce5-4007-a453-0ac73804ef24-iptables-alerter-script\") pod \"iptables-alerter-p55z9\" (UID: \"82cb470a-4ce5-4007-a453-0ac73804ef24\") " pod="openshift-network-operator/iptables-alerter-p55z9" Apr 24 21:29:00.626431 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624869 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-run-systemd\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.626431 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624895 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-cnibin\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.626431 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.624980 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-cnibin\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.626431 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.625098 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-host-var-lib-cni-bin\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.626431 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.625149 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55ef112d-fe31-4b57-808d-d33898e3e457-etc-kubernetes\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.626431 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.625172 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5a001c67-cdac-4386-8012-1386fdcf8bbd-ovnkube-script-lib\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.626431 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.625239 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5a001c67-cdac-4386-8012-1386fdcf8bbd-run-systemd\") pod \"ovnkube-node-qccqq\" (UID: 
\"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.626431 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.625466 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5a001c67-cdac-4386-8012-1386fdcf8bbd-ovnkube-config\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.626431 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.626144 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/82cb470a-4ce5-4007-a453-0ac73804ef24-iptables-alerter-script\") pod \"iptables-alerter-p55z9\" (UID: \"82cb470a-4ce5-4007-a453-0ac73804ef24\") " pod="openshift-network-operator/iptables-alerter-p55z9" Apr 24 21:29:00.626956 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.626937 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5a001c67-cdac-4386-8012-1386fdcf8bbd-ovn-node-metrics-cert\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.627077 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.627058 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e9028851-049a-4814-809a-8ffbb08d8ce7-agent-certs\") pod \"konnectivity-agent-2qsqx\" (UID: \"e9028851-049a-4814-809a-8ffbb08d8ce7\") " pod="kube-system/konnectivity-agent-2qsqx" Apr 24 21:29:00.630888 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:00.630866 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:29:00.630888 ip-10-0-135-27 
kubenswrapper[2566]: E0424 21:29:00.630888 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:29:00.631030 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:00.630902 2566 projected.go:194] Error preparing data for projected volume kube-api-access-b75lj for pod openshift-network-diagnostics/network-check-target-tpck9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:29:00.631030 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:00.630969 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj podName:9e66f6e5-929b-4807-810a-ad84e15bb98f nodeName:}" failed. No retries permitted until 2026-04-24 21:29:01.130952735 +0000 UTC m=+3.092717735 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-b75lj" (UniqueName: "kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj") pod "network-check-target-tpck9" (UID: "9e66f6e5-929b-4807-810a-ad84e15bb98f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:29:00.631633 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.631608 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9gw2\" (UniqueName: \"kubernetes.io/projected/82cb470a-4ce5-4007-a453-0ac73804ef24-kube-api-access-w9gw2\") pod \"iptables-alerter-p55z9\" (UID: \"82cb470a-4ce5-4007-a453-0ac73804ef24\") " pod="openshift-network-operator/iptables-alerter-p55z9" Apr 24 21:29:00.633005 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.632982 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84zg7\" (UniqueName: \"kubernetes.io/projected/5a001c67-cdac-4386-8012-1386fdcf8bbd-kube-api-access-84zg7\") pod \"ovnkube-node-qccqq\" (UID: \"5a001c67-cdac-4386-8012-1386fdcf8bbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.633210 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.633192 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65qzk\" (UniqueName: \"kubernetes.io/projected/55ef112d-fe31-4b57-808d-d33898e3e457-kube-api-access-65qzk\") pod \"multus-n6qzm\" (UID: \"55ef112d-fe31-4b57-808d-d33898e3e457\") " pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.709005 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.708932 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4rz25" Apr 24 21:29:00.716615 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.716597 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-88hpr" Apr 24 21:29:00.727266 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.727245 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m5nxz" Apr 24 21:29:00.732787 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.732768 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:00.740381 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.740362 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" Apr 24 21:29:00.746872 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.746856 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" Apr 24 21:29:00.753340 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.753326 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n6qzm" Apr 24 21:29:00.760824 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.760807 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-p55z9" Apr 24 21:29:00.765382 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:00.765362 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-2qsqx" Apr 24 21:29:01.027163 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.027125 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs\") pod \"network-metrics-daemon-q52j5\" (UID: \"186d543e-d0f4-4e11-ac28-e8ebb35c72a2\") " pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:01.027335 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:01.027305 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:29:01.027402 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:01.027376 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs podName:186d543e-d0f4-4e11-ac28-e8ebb35c72a2 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:02.027359802 +0000 UTC m=+3.989124813 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs") pod "network-metrics-daemon-q52j5" (UID: "186d543e-d0f4-4e11-ac28-e8ebb35c72a2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:29:01.123094 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:29:01.123066 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a001c67_cdac_4386_8012_1386fdcf8bbd.slice/crio-d2076f856b16488eaf8a328c8ab67317f5e771d43a6e6e3530514c5e0d437ad8 WatchSource:0}: Error finding container d2076f856b16488eaf8a328c8ab67317f5e771d43a6e6e3530514c5e0d437ad8: Status 404 returned error can't find the container with id d2076f856b16488eaf8a328c8ab67317f5e771d43a6e6e3530514c5e0d437ad8 Apr 24 21:29:01.123707 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:29:01.123674 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce3cc5f8_62fb_41a2_9be9_2e22c637379e.slice/crio-8df3844f91f6c0b2480027d8dff638a8ba1b39de7a268cdd45370a48b85f28ff WatchSource:0}: Error finding container 8df3844f91f6c0b2480027d8dff638a8ba1b39de7a268cdd45370a48b85f28ff: Status 404 returned error can't find the container with id 8df3844f91f6c0b2480027d8dff638a8ba1b39de7a268cdd45370a48b85f28ff Apr 24 21:29:01.125783 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:29:01.125747 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9028851_049a_4814_809a_8ffbb08d8ce7.slice/crio-48829adb7ec7dd641773b6c1654a450b71e78d4569175a0baa614df950c0057d WatchSource:0}: Error finding container 48829adb7ec7dd641773b6c1654a450b71e78d4569175a0baa614df950c0057d: Status 404 returned error can't find the container with id 48829adb7ec7dd641773b6c1654a450b71e78d4569175a0baa614df950c0057d Apr 24 21:29:01.128149 
ip-10-0-135-27 kubenswrapper[2566]: W0424 21:29:01.128113 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod918a384b_26d7_496e_b3c9_370b6e526ebd.slice/crio-0df6dcf8c0378ebbbf9748d9065c44877c2fe38e3efbd7b8cfb325557e0b12c8 WatchSource:0}: Error finding container 0df6dcf8c0378ebbbf9748d9065c44877c2fe38e3efbd7b8cfb325557e0b12c8: Status 404 returned error can't find the container with id 0df6dcf8c0378ebbbf9748d9065c44877c2fe38e3efbd7b8cfb325557e0b12c8 Apr 24 21:29:01.128949 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:29:01.128925 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ca03565_9d72_4813_9179_7636908c9bf5.slice/crio-93dc4cdf496792d353dc8520aa3464da69de158becc45f2261cd470cc1c68662 WatchSource:0}: Error finding container 93dc4cdf496792d353dc8520aa3464da69de158becc45f2261cd470cc1c68662: Status 404 returned error can't find the container with id 93dc4cdf496792d353dc8520aa3464da69de158becc45f2261cd470cc1c68662 Apr 24 21:29:01.130432 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:29:01.129723 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82cb470a_4ce5_4007_a453_0ac73804ef24.slice/crio-9c8224e4526c4b46414d0313261303070df5786b9fb69bb3571e9bec25853cd4 WatchSource:0}: Error finding container 9c8224e4526c4b46414d0313261303070df5786b9fb69bb3571e9bec25853cd4: Status 404 returned error can't find the container with id 9c8224e4526c4b46414d0313261303070df5786b9fb69bb3571e9bec25853cd4 Apr 24 21:29:01.130812 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:29:01.130623 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2991f6d_3be3_4ee8_afa1_f01aef64092a.slice/crio-3e89d43b8081e022109f22fca713dc945dfaef4f9604e4c246b4d30905249d23 WatchSource:0}: Error 
finding container 3e89d43b8081e022109f22fca713dc945dfaef4f9604e4c246b4d30905249d23: Status 404 returned error can't find the container with id 3e89d43b8081e022109f22fca713dc945dfaef4f9604e4c246b4d30905249d23 Apr 24 21:29:01.131643 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:29:01.131561 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55ef112d_fe31_4b57_808d_d33898e3e457.slice/crio-58ca5392fc1129bba3949b8be1bf1ce6bbc79194e14ff03e084001b1d4445e3f WatchSource:0}: Error finding container 58ca5392fc1129bba3949b8be1bf1ce6bbc79194e14ff03e084001b1d4445e3f: Status 404 returned error can't find the container with id 58ca5392fc1129bba3949b8be1bf1ce6bbc79194e14ff03e084001b1d4445e3f Apr 24 21:29:01.134870 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:29:01.134849 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod126ae8c8_5c56_4044_b72b_57fc091713c4.slice/crio-7f4414eaf8cdefc883c5c5f50a56e6c1b81a2819b923a590ad8661459127f607 WatchSource:0}: Error finding container 7f4414eaf8cdefc883c5c5f50a56e6c1b81a2819b923a590ad8661459127f607: Status 404 returned error can't find the container with id 7f4414eaf8cdefc883c5c5f50a56e6c1b81a2819b923a590ad8661459127f607 Apr 24 21:29:01.228295 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.228137 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b75lj\" (UniqueName: \"kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj\") pod \"network-check-target-tpck9\" (UID: \"9e66f6e5-929b-4807-810a-ad84e15bb98f\") " pod="openshift-network-diagnostics/network-check-target-tpck9" Apr 24 21:29:01.228404 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:01.228274 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Apr 24 21:29:01.228404 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:01.228384 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:29:01.228404 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:01.228396 2566 projected.go:194] Error preparing data for projected volume kube-api-access-b75lj for pod openshift-network-diagnostics/network-check-target-tpck9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:29:01.228627 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:01.228453 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj podName:9e66f6e5-929b-4807-810a-ad84e15bb98f nodeName:}" failed. No retries permitted until 2026-04-24 21:29:02.22842751 +0000 UTC m=+4.190192510 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-b75lj" (UniqueName: "kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj") pod "network-check-target-tpck9" (UID: "9e66f6e5-929b-4807-810a-ad84e15bb98f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:29:01.452841 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.452742 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:23:59 +0000 UTC" deadline="2028-02-01 04:18:11.943950128 +0000 UTC" Apr 24 21:29:01.452841 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.452778 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15534h49m10.491176132s" Apr 24 21:29:01.535462 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.534960 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:01.535462 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:01.535088 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2" Apr 24 21:29:01.559404 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.559368 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" event={"ID":"5a001c67-cdac-4386-8012-1386fdcf8bbd","Type":"ContainerStarted","Data":"d2076f856b16488eaf8a328c8ab67317f5e771d43a6e6e3530514c5e0d437ad8"} Apr 24 21:29:01.572563 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.572505 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n6qzm" event={"ID":"55ef112d-fe31-4b57-808d-d33898e3e457","Type":"ContainerStarted","Data":"58ca5392fc1129bba3949b8be1bf1ce6bbc79194e14ff03e084001b1d4445e3f"} Apr 24 21:29:01.575534 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.575471 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" event={"ID":"e2991f6d-3be3-4ee8-afa1-f01aef64092a","Type":"ContainerStarted","Data":"3e89d43b8081e022109f22fca713dc945dfaef4f9604e4c246b4d30905249d23"} Apr 24 21:29:01.580276 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.580219 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p55z9" event={"ID":"82cb470a-4ce5-4007-a453-0ac73804ef24","Type":"ContainerStarted","Data":"9c8224e4526c4b46414d0313261303070df5786b9fb69bb3571e9bec25853cd4"} Apr 24 21:29:01.584538 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.584474 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-88hpr" event={"ID":"4ca03565-9d72-4813-9179-7636908c9bf5","Type":"ContainerStarted","Data":"93dc4cdf496792d353dc8520aa3464da69de158becc45f2261cd470cc1c68662"} Apr 24 21:29:01.602220 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.602173 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-27.ec2.internal" 
event={"ID":"737b9e94dfd820d8429e803872b0624c","Type":"ContainerStarted","Data":"2a218c0e8bda2e171cb9b6e35709dd0dce351232bfe09f462845d5369f65f7c7"}
Apr 24 21:29:01.616843 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.616792 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m5nxz" event={"ID":"126ae8c8-5c56-4044-b72b-57fc091713c4","Type":"ContainerStarted","Data":"7f4414eaf8cdefc883c5c5f50a56e6c1b81a2819b923a590ad8661459127f607"}
Apr 24 21:29:01.619555 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.618465 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-27.ec2.internal" podStartSLOduration=2.618450399 podStartE2EDuration="2.618450399s" podCreationTimestamp="2026-04-24 21:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:01.618366192 +0000 UTC m=+3.580131205" watchObservedRunningTime="2026-04-24 21:29:01.618450399 +0000 UTC m=+3.580215402"
Apr 24 21:29:01.624616 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.624566 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4rz25" event={"ID":"918a384b-26d7-496e-b3c9-370b6e526ebd","Type":"ContainerStarted","Data":"0df6dcf8c0378ebbbf9748d9065c44877c2fe38e3efbd7b8cfb325557e0b12c8"}
Apr 24 21:29:01.628277 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.628097 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" event={"ID":"ce3cc5f8-62fb-41a2-9be9-2e22c637379e","Type":"ContainerStarted","Data":"8df3844f91f6c0b2480027d8dff638a8ba1b39de7a268cdd45370a48b85f28ff"}
Apr 24 21:29:01.633597 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:01.633556 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2qsqx" event={"ID":"e9028851-049a-4814-809a-8ffbb08d8ce7","Type":"ContainerStarted","Data":"48829adb7ec7dd641773b6c1654a450b71e78d4569175a0baa614df950c0057d"}
Apr 24 21:29:02.037172 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:02.036598 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs\") pod \"network-metrics-daemon-q52j5\" (UID: \"186d543e-d0f4-4e11-ac28-e8ebb35c72a2\") " pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:29:02.037172 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:02.036775 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:29:02.037172 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:02.036840 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs podName:186d543e-d0f4-4e11-ac28-e8ebb35c72a2 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:04.036821018 +0000 UTC m=+5.998586038 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs") pod "network-metrics-daemon-q52j5" (UID: "186d543e-d0f4-4e11-ac28-e8ebb35c72a2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:29:02.238132 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:02.238096 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b75lj\" (UniqueName: \"kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj\") pod \"network-check-target-tpck9\" (UID: \"9e66f6e5-929b-4807-810a-ad84e15bb98f\") " pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:29:02.238289 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:02.238273 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:29:02.238370 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:02.238297 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:29:02.238370 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:02.238310 2566 projected.go:194] Error preparing data for projected volume kube-api-access-b75lj for pod openshift-network-diagnostics/network-check-target-tpck9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:29:02.238482 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:02.238385 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj podName:9e66f6e5-929b-4807-810a-ad84e15bb98f nodeName:}" failed. No retries permitted until 2026-04-24 21:29:04.238366058 +0000 UTC m=+6.200131077 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-b75lj" (UniqueName: "kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj") pod "network-check-target-tpck9" (UID: "9e66f6e5-929b-4807-810a-ad84e15bb98f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:29:02.537121 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:02.537089 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:29:02.537558 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:02.537220 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f"
Apr 24 21:29:02.659663 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:02.659627 2566 generic.go:358] "Generic (PLEG): container finished" podID="3cc646a1a5872f0951b32e3c6a73034c" containerID="102532cb8a1eba04b81c4135b09da81999040737778fa87da7a8c30179992ffe" exitCode=0
Apr 24 21:29:02.659835 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:02.659794 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal" event={"ID":"3cc646a1a5872f0951b32e3c6a73034c","Type":"ContainerDied","Data":"102532cb8a1eba04b81c4135b09da81999040737778fa87da7a8c30179992ffe"}
Apr 24 21:29:03.535062 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:03.534576 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:29:03.535062 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:03.534706 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2"
Apr 24 21:29:03.668214 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:03.668181 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal" event={"ID":"3cc646a1a5872f0951b32e3c6a73034c","Type":"ContainerStarted","Data":"713fb2ab4934e60b27a83bc7947670a66aeefa2bc08bf7eb9f27d06d3c9264a2"}
Apr 24 21:29:04.054009 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:04.053972 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs\") pod \"network-metrics-daemon-q52j5\" (UID: \"186d543e-d0f4-4e11-ac28-e8ebb35c72a2\") " pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:29:04.054637 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:04.054175 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:29:04.054637 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:04.054241 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs podName:186d543e-d0f4-4e11-ac28-e8ebb35c72a2 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:08.05422431 +0000 UTC m=+10.015989314 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs") pod "network-metrics-daemon-q52j5" (UID: "186d543e-d0f4-4e11-ac28-e8ebb35c72a2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:29:04.256751 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:04.256655 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b75lj\" (UniqueName: \"kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj\") pod \"network-check-target-tpck9\" (UID: \"9e66f6e5-929b-4807-810a-ad84e15bb98f\") " pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:29:04.256893 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:04.256783 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:29:04.256893 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:04.256805 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:29:04.256893 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:04.256817 2566 projected.go:194] Error preparing data for projected volume kube-api-access-b75lj for pod openshift-network-diagnostics/network-check-target-tpck9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:29:04.256893 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:04.256868 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj podName:9e66f6e5-929b-4807-810a-ad84e15bb98f nodeName:}" failed.
No retries permitted until 2026-04-24 21:29:08.256850594 +0000 UTC m=+10.218615598 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-b75lj" (UniqueName: "kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj") pod "network-check-target-tpck9" (UID: "9e66f6e5-929b-4807-810a-ad84e15bb98f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:29:04.534978 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:04.534791 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:29:04.534978 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:04.534915 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f"
Apr 24 21:29:05.534518 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:05.534424 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:29:05.534950 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:05.534564 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2"
Apr 24 21:29:06.534689 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:06.534659 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:29:06.535094 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:06.534766 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f"
Apr 24 21:29:07.534154 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:07.534117 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:29:07.534319 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:07.534257 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2"
Apr 24 21:29:08.086159 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:08.086031 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs\") pod \"network-metrics-daemon-q52j5\" (UID: \"186d543e-d0f4-4e11-ac28-e8ebb35c72a2\") " pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:29:08.086654 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:08.086199 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:29:08.086654 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:08.086272 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs podName:186d543e-d0f4-4e11-ac28-e8ebb35c72a2 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:16.086252038 +0000 UTC m=+18.048017048 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs") pod "network-metrics-daemon-q52j5" (UID: "186d543e-d0f4-4e11-ac28-e8ebb35c72a2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:29:08.287604 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:08.287505 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b75lj\" (UniqueName: \"kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj\") pod \"network-check-target-tpck9\" (UID: \"9e66f6e5-929b-4807-810a-ad84e15bb98f\") " pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:29:08.287797 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:08.287668 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:29:08.287797 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:08.287688 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:29:08.287797 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:08.287699 2566 projected.go:194] Error preparing data for projected volume kube-api-access-b75lj for pod openshift-network-diagnostics/network-check-target-tpck9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:29:08.287797 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:08.287760 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj podName:9e66f6e5-929b-4807-810a-ad84e15bb98f nodeName:}" failed. No retries permitted until 2026-04-24 21:29:16.287741787 +0000 UTC m=+18.249506785 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-b75lj" (UniqueName: "kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj") pod "network-check-target-tpck9" (UID: "9e66f6e5-929b-4807-810a-ad84e15bb98f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:29:08.535061 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:08.535031 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:29:08.535233 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:08.535126 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f"
Apr 24 21:29:09.534115 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:09.534083 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:29:09.534612 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:09.534209 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2"
Apr 24 21:29:10.534338 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:10.534303 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:29:10.534782 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:10.534436 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f"
Apr 24 21:29:11.534284 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:11.534257 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:29:11.534464 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:11.534423 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2"
Apr 24 21:29:12.534184 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:12.534152 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:29:12.534373 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:12.534267 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f"
Apr 24 21:29:13.534815 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:13.534788 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:29:13.535197 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:13.534907 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2"
Apr 24 21:29:14.534203 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:14.534173 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:29:14.534384 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:14.534271 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f"
Apr 24 21:29:15.534732 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:15.534702 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:29:15.535108 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:15.534837 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2"
Apr 24 21:29:16.142362 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:16.142322 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs\") pod \"network-metrics-daemon-q52j5\" (UID: \"186d543e-d0f4-4e11-ac28-e8ebb35c72a2\") " pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:29:16.142532 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:16.142473 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:29:16.142581 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:16.142545 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs podName:186d543e-d0f4-4e11-ac28-e8ebb35c72a2 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:32.142524771 +0000 UTC m=+34.104289776 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs") pod "network-metrics-daemon-q52j5" (UID: "186d543e-d0f4-4e11-ac28-e8ebb35c72a2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:29:16.343832 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:16.343794 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b75lj\" (UniqueName: \"kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj\") pod \"network-check-target-tpck9\" (UID: \"9e66f6e5-929b-4807-810a-ad84e15bb98f\") " pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:29:16.343996 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:16.343908 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:29:16.343996 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:16.343922 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:29:16.343996 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:16.343931 2566 projected.go:194] Error preparing data for projected volume kube-api-access-b75lj for pod openshift-network-diagnostics/network-check-target-tpck9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:29:16.343996 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:16.343984 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj podName:9e66f6e5-929b-4807-810a-ad84e15bb98f nodeName:}" failed. No retries permitted until 2026-04-24 21:29:32.343963439 +0000 UTC m=+34.305728451 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-b75lj" (UniqueName: "kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj") pod "network-check-target-tpck9" (UID: "9e66f6e5-929b-4807-810a-ad84e15bb98f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:29:16.534827 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:16.534781 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:29:16.535271 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:16.534923 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f"
Apr 24 21:29:17.534215 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:17.534183 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:29:17.534381 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:17.534298 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2"
Apr 24 21:29:18.536279 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.535888 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:29:18.536876 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:18.536628 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f"
Apr 24 21:29:18.694240 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.694205 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-88hpr" event={"ID":"4ca03565-9d72-4813-9179-7636908c9bf5","Type":"ContainerStarted","Data":"14bff7d1646203a92ae72424fd5c990781aabba443edea252d9eb63459c7862a"}
Apr 24 21:29:18.695500 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.695476 2566 generic.go:358] "Generic (PLEG): container finished" podID="126ae8c8-5c56-4044-b72b-57fc091713c4" containerID="04d163886c17a5022a633bf6b36b580d5ba1a876c94b21b4bc87ee8c4c7233e7" exitCode=0
Apr 24 21:29:18.695606 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.695536 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m5nxz" event={"ID":"126ae8c8-5c56-4044-b72b-57fc091713c4","Type":"ContainerDied","Data":"04d163886c17a5022a633bf6b36b580d5ba1a876c94b21b4bc87ee8c4c7233e7"}
Apr 24 21:29:18.696824 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.696744 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4rz25" event={"ID":"918a384b-26d7-496e-b3c9-370b6e526ebd","Type":"ContainerStarted","Data":"cce5a3faed34f6580a0f83a9b62f8b5b2f00414b69221e7928c7f6f18de982df"}
Apr 24 21:29:18.697953 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.697930 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" event={"ID":"ce3cc5f8-62fb-41a2-9be9-2e22c637379e","Type":"ContainerStarted","Data":"141be3943f0af8c8d167f5ecb62f76fe07f86a6ad059d0965d4f7f007268f93c"}
Apr 24 21:29:18.699327 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.699305 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2qsqx" event={"ID":"e9028851-049a-4814-809a-8ffbb08d8ce7","Type":"ContainerStarted","Data":"865e147ab9bf3c9c8fb423a2da98d791f87ec2faaa14eba6344315b877e00a9e"}
Apr 24 21:29:18.702194 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.702175 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/0.log"
Apr 24 21:29:18.702505 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.702489 2566 generic.go:358] "Generic (PLEG): container finished" podID="5a001c67-cdac-4386-8012-1386fdcf8bbd" containerID="62a32ae495a310c897b01204ca4bc82442d8bf376252c5b306b6b92f93794ed3" exitCode=1
Apr 24 21:29:18.702569 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.702547 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" event={"ID":"5a001c67-cdac-4386-8012-1386fdcf8bbd","Type":"ContainerStarted","Data":"43a3de999428cebafa7e5c1122c7d6f0a9c9264ee06a153b8b953890e474b3bb"}
Apr 24 21:29:18.702615 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.702568 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" event={"ID":"5a001c67-cdac-4386-8012-1386fdcf8bbd","Type":"ContainerStarted","Data":"92021c54e7afc0e01e9a191ac0cce22bacf9990f0e74b820ae7a5f1dd8933ea0"}
Apr 24 21:29:18.702615 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.702581 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" event={"ID":"5a001c67-cdac-4386-8012-1386fdcf8bbd","Type":"ContainerStarted","Data":"d992cd9ec0ed06b40b5b9aba6c5381357419fae9a52515754daff39155a5999b"}
Apr 24 21:29:18.702615 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.702593 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" event={"ID":"5a001c67-cdac-4386-8012-1386fdcf8bbd","Type":"ContainerStarted","Data":"883fd2f4ef36d5a8bf2e3336118d89d48e840c2d88baee1ab4452b0ded8e9ce2"}
Apr 24 21:29:18.702615 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.702606 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" event={"ID":"5a001c67-cdac-4386-8012-1386fdcf8bbd","Type":"ContainerDied","Data":"62a32ae495a310c897b01204ca4bc82442d8bf376252c5b306b6b92f93794ed3"}
Apr 24 21:29:18.702771 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.702620 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" event={"ID":"5a001c67-cdac-4386-8012-1386fdcf8bbd","Type":"ContainerStarted","Data":"aeef141fe4b273005c6fbc897f4e7f4e995eaa685d6f5aff9e8d05a53d895380"}
Apr 24 21:29:18.703709 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.703690 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n6qzm" event={"ID":"55ef112d-fe31-4b57-808d-d33898e3e457","Type":"ContainerStarted","Data":"5d3271247056b5653c835d677b9a6e5b748e880093ae41bdaf44ef308c7e434d"}
Apr 24 21:29:18.704922 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.704903 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" event={"ID":"e2991f6d-3be3-4ee8-afa1-f01aef64092a","Type":"ContainerStarted","Data":"df11fe1930388c3a0db806e386d5bdcde2cb5ca4389fc6b15e13982caa2275ed"}
Apr 24 21:29:18.710740 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.710702 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-27.ec2.internal" podStartSLOduration=19.710691867 podStartE2EDuration="19.710691867s" podCreationTimestamp="2026-04-24 21:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:03.686038906 +0000 UTC m=+5.647803927" watchObservedRunningTime="2026-04-24 21:29:18.710691867 +0000 UTC m=+20.672456887"
Apr 24 21:29:18.711077 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.711054 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-88hpr" podStartSLOduration=4.002517324 podStartE2EDuration="20.711049731s" podCreationTimestamp="2026-04-24 21:28:58 +0000 UTC" firstStartedPulling="2026-04-24 21:29:01.130717314 +0000 UTC m=+3.092482316" lastFinishedPulling="2026-04-24 21:29:17.839249712 +0000 UTC m=+19.801014723" observedRunningTime="2026-04-24 21:29:18.710691135 +0000 UTC m=+20.672456155" watchObservedRunningTime="2026-04-24 21:29:18.711049731 +0000 UTC m=+20.672814751"
Apr 24 21:29:18.728637 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.728598 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2qsqx" podStartSLOduration=11.771145933 podStartE2EDuration="20.728585406s" podCreationTimestamp="2026-04-24 21:28:58 +0000 UTC" firstStartedPulling="2026-04-24 21:29:01.127515124 +0000 UTC m=+3.089280128" lastFinishedPulling="2026-04-24 21:29:10.084954588 +0000 UTC m=+12.046719601" observedRunningTime="2026-04-24 21:29:18.728170657 +0000 UTC m=+20.689935689" watchObservedRunningTime="2026-04-24 21:29:18.728585406 +0000 UTC m=+20.690350426"
Apr 24 21:29:18.744260 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.744215 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4rz25" podStartSLOduration=4.029423121 podStartE2EDuration="20.744201099s" podCreationTimestamp="2026-04-24 21:28:58 +0000 UTC" firstStartedPulling="2026-04-24 21:29:01.129842658 +0000 UTC m=+3.091607661" lastFinishedPulling="2026-04-24 21:29:17.844620624 +0000 UTC m=+19.806385639" observedRunningTime="2026-04-24 21:29:18.743918672 +0000 UTC m=+20.705683702" watchObservedRunningTime="2026-04-24 21:29:18.744201099 +0000 UTC m=+20.705966121"
Apr 24 21:29:18.769740 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.769687 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ltjjp" podStartSLOduration=4.047871398 podStartE2EDuration="20.769670819s" podCreationTimestamp="2026-04-24 21:28:58 +0000 UTC" firstStartedPulling="2026-04-24 21:29:01.125688418 +0000 UTC m=+3.087453415" lastFinishedPulling="2026-04-24 21:29:17.847487823 +0000 UTC m=+19.809252836" observedRunningTime="2026-04-24 21:29:18.768832663 +0000 UTC m=+20.730597682" watchObservedRunningTime="2026-04-24 21:29:18.769670819 +0000 UTC m=+20.731435840"
Apr 24 21:29:18.789090 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:18.789043 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-n6qzm" podStartSLOduration=4.030466896 podStartE2EDuration="20.789029548s" podCreationTimestamp="2026-04-24 21:28:58 +0000 UTC" firstStartedPulling="2026-04-24 21:29:01.133849824 +0000 UTC m=+3.095614829" lastFinishedPulling="2026-04-24 21:29:17.89241247 +0000 UTC m=+19.854177481" observedRunningTime="2026-04-24 21:29:18.788539744 +0000 UTC m=+20.750304763" watchObservedRunningTime="2026-04-24 21:29:18.789029548 +0000 UTC m=+20.750794569"
Apr 24 21:29:19.270053 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:19.270024 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2qsqx"
Apr 24 21:29:19.270651 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:19.270629 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2qsqx"
Apr 24 21:29:19.494590 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:19.494567 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 21:29:19.534423 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:19.534398 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:29:19.534545 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:19.534526 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2" Apr 24 21:29:19.708580 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:19.708507 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" event={"ID":"e2991f6d-3be3-4ee8-afa1-f01aef64092a","Type":"ContainerStarted","Data":"59c8e02dfbe43f134353efbf8e9f32dd65c100fc61bbb7cc43424597cf2cce85"} Apr 24 21:29:19.710028 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:19.709987 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p55z9" event={"ID":"82cb470a-4ce5-4007-a453-0ac73804ef24","Type":"ContainerStarted","Data":"3984660e917db0798094993ed3937bbcf40cc3181cd640c8b87096b2332b8d8f"} Apr 24 21:29:19.729749 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:19.729706 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-p55z9" podStartSLOduration=5.026359322 podStartE2EDuration="21.729694298s" podCreationTimestamp="2026-04-24 21:28:58 +0000 UTC" firstStartedPulling="2026-04-24 21:29:01.131881534 +0000 UTC m=+3.093646531" lastFinishedPulling="2026-04-24 21:29:17.835216502 +0000 UTC m=+19.796981507" observedRunningTime="2026-04-24 21:29:19.729413073 +0000 UTC m=+21.691178094" watchObservedRunningTime="2026-04-24 21:29:19.729694298 +0000 UTC m=+21.691459317" Apr 24 21:29:20.474411 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:20.474284 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:29:19.494585986Z","UUID":"08a9785b-abd7-4b5f-b6c7-b897b3aad072","Handler":null,"Name":"","Endpoint":""} Apr 24 21:29:20.477673 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:20.477651 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com 
endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:29:20.477673 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:20.477680 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:29:20.534652 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:20.534625 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9" Apr 24 21:29:20.534797 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:20.534739 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f" Apr 24 21:29:20.711581 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:20.711556 2566 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 21:29:21.534300 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:21.534093 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:21.534466 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:21.534426 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2" Apr 24 21:29:21.716977 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:21.716947 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/0.log" Apr 24 21:29:21.717412 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:21.717382 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" event={"ID":"5a001c67-cdac-4386-8012-1386fdcf8bbd","Type":"ContainerStarted","Data":"8ebee2d5d860677352f065489e4f15188a8e5be272dae1d3e916d30e3651b632"} Apr 24 21:29:21.719393 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:21.719367 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" event={"ID":"e2991f6d-3be3-4ee8-afa1-f01aef64092a","Type":"ContainerStarted","Data":"92e501fb26e089f208ea70abb65eec68674ba4dbd854a6e5307d2a242cafbdb8"} Apr 24 21:29:21.747441 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:21.747388 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-z4m8g" podStartSLOduration=4.256884046 podStartE2EDuration="23.747374947s" podCreationTimestamp="2026-04-24 21:28:58 +0000 UTC" firstStartedPulling="2026-04-24 21:29:01.133526518 +0000 UTC m=+3.095291532" lastFinishedPulling="2026-04-24 21:29:20.624017421 +0000 UTC m=+22.585782433" observedRunningTime="2026-04-24 21:29:21.746957575 +0000 UTC m=+23.708722594" watchObservedRunningTime="2026-04-24 21:29:21.747374947 +0000 UTC m=+23.709140005" Apr 24 21:29:22.534656 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:22.534628 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9" Apr 24 21:29:22.534823 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:22.534762 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f" Apr 24 21:29:23.534678 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:23.534520 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:23.535275 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:23.534759 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2" Apr 24 21:29:23.727778 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:23.727755 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/0.log" Apr 24 21:29:23.728135 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:23.728105 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" event={"ID":"5a001c67-cdac-4386-8012-1386fdcf8bbd","Type":"ContainerStarted","Data":"a72b7d1a2c3ea15b101ae21d924174c50572db9f843e88a61b53f4fe8a51ef4f"} Apr 24 21:29:23.728423 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:23.728395 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:23.728523 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:23.728435 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:23.728609 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:23.728588 2566 scope.go:117] "RemoveContainer" containerID="62a32ae495a310c897b01204ca4bc82442d8bf376252c5b306b6b92f93794ed3" Apr 24 21:29:23.729954 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:23.729935 2566 generic.go:358] "Generic (PLEG): container finished" podID="126ae8c8-5c56-4044-b72b-57fc091713c4" containerID="8f338118ed927f1fa2708bde71d782d248c6513f5289399826b05b4ad34f53a5" exitCode=0 Apr 24 21:29:23.730025 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:23.729972 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m5nxz" event={"ID":"126ae8c8-5c56-4044-b72b-57fc091713c4","Type":"ContainerDied","Data":"8f338118ed927f1fa2708bde71d782d248c6513f5289399826b05b4ad34f53a5"} Apr 24 21:29:23.742847 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:23.742826 
2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:23.934909 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:23.934882 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2qsqx" Apr 24 21:29:23.935021 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:23.934992 2566 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 21:29:23.935402 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:23.935386 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2qsqx" Apr 24 21:29:24.534501 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:24.534468 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9" Apr 24 21:29:24.534644 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:24.534591 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f" Apr 24 21:29:24.738906 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:24.738879 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/0.log" Apr 24 21:29:24.739411 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:24.739383 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" event={"ID":"5a001c67-cdac-4386-8012-1386fdcf8bbd","Type":"ContainerStarted","Data":"2c8c49836cc4953a973c92ded9eccc51e8af895b79bafad2ebd9d72bfd126a74"} Apr 24 21:29:24.740064 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:24.739983 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:24.754660 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:24.754639 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" Apr 24 21:29:24.773273 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:24.773232 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq" podStartSLOduration=9.977970308 podStartE2EDuration="26.773220415s" podCreationTimestamp="2026-04-24 21:28:58 +0000 UTC" firstStartedPulling="2026-04-24 21:29:01.124943055 +0000 UTC m=+3.086708066" lastFinishedPulling="2026-04-24 21:29:17.920193161 +0000 UTC m=+19.881958173" observedRunningTime="2026-04-24 21:29:24.771798633 +0000 UTC m=+26.733563650" watchObservedRunningTime="2026-04-24 21:29:24.773220415 +0000 UTC m=+26.734985464" Apr 24 21:29:24.900262 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:24.900185 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tpck9"] Apr 24 21:29:24.900404 ip-10-0-135-27 
kubenswrapper[2566]: I0424 21:29:24.900295 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9" Apr 24 21:29:24.900404 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:24.900384 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f" Apr 24 21:29:24.903024 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:24.903000 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q52j5"] Apr 24 21:29:24.903126 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:24.903119 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:24.903271 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:24.903237 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2" Apr 24 21:29:25.742614 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:25.742585 2566 generic.go:358] "Generic (PLEG): container finished" podID="126ae8c8-5c56-4044-b72b-57fc091713c4" containerID="d9b68289b79ec8d9a62237ffe74362099a2d1a15d1c18cdebfc024c8a6ace116" exitCode=0 Apr 24 21:29:25.743010 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:25.742673 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m5nxz" event={"ID":"126ae8c8-5c56-4044-b72b-57fc091713c4","Type":"ContainerDied","Data":"d9b68289b79ec8d9a62237ffe74362099a2d1a15d1c18cdebfc024c8a6ace116"} Apr 24 21:29:26.534448 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:26.534418 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:26.534607 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:26.534421 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9" Apr 24 21:29:26.534607 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:26.534519 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2" Apr 24 21:29:26.534607 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:26.534581 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f" Apr 24 21:29:27.748881 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:27.748681 2566 generic.go:358] "Generic (PLEG): container finished" podID="126ae8c8-5c56-4044-b72b-57fc091713c4" containerID="26fc22772075e5a5928f44e4c0b70330629c194dfb424b105146d1d38ed34c2a" exitCode=0 Apr 24 21:29:27.749289 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:27.748760 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m5nxz" event={"ID":"126ae8c8-5c56-4044-b72b-57fc091713c4","Type":"ContainerDied","Data":"26fc22772075e5a5928f44e4c0b70330629c194dfb424b105146d1d38ed34c2a"} Apr 24 21:29:28.535152 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:28.535124 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9" Apr 24 21:29:28.535333 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:28.535233 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f" Apr 24 21:29:28.535333 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:28.535321 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:28.535481 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:28.535463 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2" Apr 24 21:29:30.534274 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.534247 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9" Apr 24 21:29:30.534677 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:30.534388 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tpck9" podUID="9e66f6e5-929b-4807-810a-ad84e15bb98f" Apr 24 21:29:30.534677 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.534443 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:30.534677 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:30.534546 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2" Apr 24 21:29:30.849245 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.849171 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-27.ec2.internal" event="NodeReady" Apr 24 21:29:30.849414 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.849310 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:29:30.893529 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.893498 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6jxzc"] Apr 24 21:29:30.909786 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.909759 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-z2pdw"] Apr 24 21:29:30.921449 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.921415 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6jxzc"] Apr 24 21:29:30.921582 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.921536 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z2pdw" Apr 24 21:29:30.921582 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.921540 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:30.921980 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.921946 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z2pdw"] Apr 24 21:29:30.923652 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.923631 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:29:30.923831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.923711 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kxf55\"" Apr 24 21:29:30.924365 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.924089 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:29:30.924365 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.924102 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:29:30.924365 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.924172 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:29:30.924365 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.924268 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-sbjvw\"" Apr 24 21:29:30.924605 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:30.924406 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:29:31.050496 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.050470 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:31.050496 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.050500 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-tmp-dir\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:31.050685 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.050525 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr7mf\" (UniqueName: \"kubernetes.io/projected/cc70ee65-8f01-4bad-adc7-98b5e7037c77-kube-api-access-wr7mf\") pod \"ingress-canary-z2pdw\" (UID: \"cc70ee65-8f01-4bad-adc7-98b5e7037c77\") " pod="openshift-ingress-canary/ingress-canary-z2pdw" Apr 24 21:29:31.050685 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.050582 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-config-volume\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:31.050685 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.050624 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert\") pod \"ingress-canary-z2pdw\" (UID: \"cc70ee65-8f01-4bad-adc7-98b5e7037c77\") " pod="openshift-ingress-canary/ingress-canary-z2pdw" Apr 24 21:29:31.050685 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.050658 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-gfchk\" (UniqueName: \"kubernetes.io/projected/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-kube-api-access-gfchk\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:31.151710 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.151629 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert\") pod \"ingress-canary-z2pdw\" (UID: \"cc70ee65-8f01-4bad-adc7-98b5e7037c77\") " pod="openshift-ingress-canary/ingress-canary-z2pdw" Apr 24 21:29:31.151710 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.151684 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfchk\" (UniqueName: \"kubernetes.io/projected/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-kube-api-access-gfchk\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:31.151923 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.151717 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:31.151923 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.151745 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-tmp-dir\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:31.151923 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.151778 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wr7mf\" (UniqueName: 
\"kubernetes.io/projected/cc70ee65-8f01-4bad-adc7-98b5e7037c77-kube-api-access-wr7mf\") pod \"ingress-canary-z2pdw\" (UID: \"cc70ee65-8f01-4bad-adc7-98b5e7037c77\") " pod="openshift-ingress-canary/ingress-canary-z2pdw" Apr 24 21:29:31.151923 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:31.151793 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:31.151923 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.151845 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-config-volume\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:31.151923 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:31.151853 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert podName:cc70ee65-8f01-4bad-adc7-98b5e7037c77 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:31.651834227 +0000 UTC m=+33.613599226 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert") pod "ingress-canary-z2pdw" (UID: "cc70ee65-8f01-4bad-adc7-98b5e7037c77") : secret "canary-serving-cert" not found Apr 24 21:29:31.151923 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:31.151861 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:31.151923 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:31.151916 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls podName:e77a2c8d-6381-4990-b43d-cfa87c8c3fc4 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:29:31.651899324 +0000 UTC m=+33.613664325 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls") pod "dns-default-6jxzc" (UID: "e77a2c8d-6381-4990-b43d-cfa87c8c3fc4") : secret "dns-default-metrics-tls" not found Apr 24 21:29:31.152345 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.152241 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-tmp-dir\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:31.152469 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.152451 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-config-volume\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:31.162258 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.162237 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfchk\" (UniqueName: \"kubernetes.io/projected/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-kube-api-access-gfchk\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:31.162389 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.162337 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr7mf\" (UniqueName: \"kubernetes.io/projected/cc70ee65-8f01-4bad-adc7-98b5e7037c77-kube-api-access-wr7mf\") pod \"ingress-canary-z2pdw\" (UID: \"cc70ee65-8f01-4bad-adc7-98b5e7037c77\") " pod="openshift-ingress-canary/ingress-canary-z2pdw" Apr 24 21:29:31.656220 ip-10-0-135-27 kubenswrapper[2566]: I0424 
21:29:31.656169 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:31.656876 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:31.656261 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert\") pod \"ingress-canary-z2pdw\" (UID: \"cc70ee65-8f01-4bad-adc7-98b5e7037c77\") " pod="openshift-ingress-canary/ingress-canary-z2pdw" Apr 24 21:29:31.656876 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:31.656377 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:31.656876 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:31.656455 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls podName:e77a2c8d-6381-4990-b43d-cfa87c8c3fc4 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:32.656434944 +0000 UTC m=+34.618199960 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls") pod "dns-default-6jxzc" (UID: "e77a2c8d-6381-4990-b43d-cfa87c8c3fc4") : secret "dns-default-metrics-tls" not found Apr 24 21:29:31.656876 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:31.656380 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:31.656876 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:31.656528 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert podName:cc70ee65-8f01-4bad-adc7-98b5e7037c77 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:32.656513238 +0000 UTC m=+34.618278240 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert") pod "ingress-canary-z2pdw" (UID: "cc70ee65-8f01-4bad-adc7-98b5e7037c77") : secret "canary-serving-cert" not found Apr 24 21:29:32.160422 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:32.160369 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs\") pod \"network-metrics-daemon-q52j5\" (UID: \"186d543e-d0f4-4e11-ac28-e8ebb35c72a2\") " pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:32.160686 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:32.160546 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:29:32.160686 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:32.160627 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs podName:186d543e-d0f4-4e11-ac28-e8ebb35c72a2 nodeName:}" 
failed. No retries permitted until 2026-04-24 21:30:04.160606513 +0000 UTC m=+66.122371517 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs") pod "network-metrics-daemon-q52j5" (UID: "186d543e-d0f4-4e11-ac28-e8ebb35c72a2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:29:32.361343 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:32.361308 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b75lj\" (UniqueName: \"kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj\") pod \"network-check-target-tpck9\" (UID: \"9e66f6e5-929b-4807-810a-ad84e15bb98f\") " pod="openshift-network-diagnostics/network-check-target-tpck9" Apr 24 21:29:32.361544 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:32.361455 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:29:32.361544 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:32.361476 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:29:32.361544 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:32.361487 2566 projected.go:194] Error preparing data for projected volume kube-api-access-b75lj for pod openshift-network-diagnostics/network-check-target-tpck9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:29:32.361544 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:32.361543 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj 
podName:9e66f6e5-929b-4807-810a-ad84e15bb98f nodeName:}" failed. No retries permitted until 2026-04-24 21:30:04.361528756 +0000 UTC m=+66.323293754 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-b75lj" (UniqueName: "kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj") pod "network-check-target-tpck9" (UID: "9e66f6e5-929b-4807-810a-ad84e15bb98f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:29:32.534873 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:32.534842 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:29:32.535059 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:32.534843 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9" Apr 24 21:29:32.537631 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:32.537609 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:29:32.538662 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:32.538636 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p6sxw\"" Apr 24 21:29:32.538779 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:32.538678 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-flqk8\"" Apr 24 21:29:32.538779 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:32.538678 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:29:32.538779 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:32.538680 2566 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:29:32.664078 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:32.664045 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert\") pod \"ingress-canary-z2pdw\" (UID: \"cc70ee65-8f01-4bad-adc7-98b5e7037c77\") " pod="openshift-ingress-canary/ingress-canary-z2pdw" Apr 24 21:29:32.664549 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:32.664116 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:32.664549 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:32.664211 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:32.664659 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:32.664636 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:32.664709 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:32.664658 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert podName:cc70ee65-8f01-4bad-adc7-98b5e7037c77 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:34.664512797 +0000 UTC m=+36.626277803 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert") pod "ingress-canary-z2pdw" (UID: "cc70ee65-8f01-4bad-adc7-98b5e7037c77") : secret "canary-serving-cert" not found Apr 24 21:29:32.666978 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:32.664918 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls podName:e77a2c8d-6381-4990-b43d-cfa87c8c3fc4 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:34.664900529 +0000 UTC m=+36.626665544 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls") pod "dns-default-6jxzc" (UID: "e77a2c8d-6381-4990-b43d-cfa87c8c3fc4") : secret "dns-default-metrics-tls" not found Apr 24 21:29:34.680252 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:34.680220 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert\") pod \"ingress-canary-z2pdw\" (UID: \"cc70ee65-8f01-4bad-adc7-98b5e7037c77\") " pod="openshift-ingress-canary/ingress-canary-z2pdw" Apr 24 21:29:34.680717 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:34.680268 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:34.680717 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:34.680379 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:34.680717 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:34.680431 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls podName:e77a2c8d-6381-4990-b43d-cfa87c8c3fc4 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:38.680417781 +0000 UTC m=+40.642182779 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls") pod "dns-default-6jxzc" (UID: "e77a2c8d-6381-4990-b43d-cfa87c8c3fc4") : secret "dns-default-metrics-tls" not found Apr 24 21:29:34.680717 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:34.680379 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:34.680717 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:34.680500 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert podName:cc70ee65-8f01-4bad-adc7-98b5e7037c77 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:38.680488033 +0000 UTC m=+40.642253031 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert") pod "ingress-canary-z2pdw" (UID: "cc70ee65-8f01-4bad-adc7-98b5e7037c77") : secret "canary-serving-cert" not found Apr 24 21:29:34.765857 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:34.765831 2566 generic.go:358] "Generic (PLEG): container finished" podID="126ae8c8-5c56-4044-b72b-57fc091713c4" containerID="d4babc7b0094c550b3b714e737cdba51f4d12e4826a875b70432c4df40a3a3c6" exitCode=0 Apr 24 21:29:34.766015 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:34.765890 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m5nxz" event={"ID":"126ae8c8-5c56-4044-b72b-57fc091713c4","Type":"ContainerDied","Data":"d4babc7b0094c550b3b714e737cdba51f4d12e4826a875b70432c4df40a3a3c6"} Apr 24 21:29:35.770432 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:35.770399 2566 generic.go:358] "Generic (PLEG): container finished" podID="126ae8c8-5c56-4044-b72b-57fc091713c4" containerID="00747f0c5b639fc441f84cd05006160e6fa7910d255bae341804e92d35ef53ab" exitCode=0 Apr 24 21:29:35.770778 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:35.770456 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m5nxz" event={"ID":"126ae8c8-5c56-4044-b72b-57fc091713c4","Type":"ContainerDied","Data":"00747f0c5b639fc441f84cd05006160e6fa7910d255bae341804e92d35ef53ab"} Apr 24 21:29:36.777396 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:36.777344 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m5nxz" event={"ID":"126ae8c8-5c56-4044-b72b-57fc091713c4","Type":"ContainerStarted","Data":"28e339f4a1834229fe12e9412c98ba72e5ab231c6dc50029031043ab398fc9c4"} Apr 24 21:29:36.803752 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:36.803625 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-m5nxz" podStartSLOduration=6.313971357 podStartE2EDuration="38.80360892s" podCreationTimestamp="2026-04-24 21:28:58 +0000 UTC" firstStartedPulling="2026-04-24 21:29:01.136591 +0000 UTC m=+3.098356002" lastFinishedPulling="2026-04-24 21:29:33.626228568 +0000 UTC m=+35.587993565" observedRunningTime="2026-04-24 21:29:36.802944868 +0000 UTC m=+38.764709889" watchObservedRunningTime="2026-04-24 21:29:36.80360892 +0000 UTC m=+38.765373941" Apr 24 21:29:38.706925 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:38.706894 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert\") pod \"ingress-canary-z2pdw\" (UID: \"cc70ee65-8f01-4bad-adc7-98b5e7037c77\") " pod="openshift-ingress-canary/ingress-canary-z2pdw" Apr 24 21:29:38.707316 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:38.706939 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:38.707316 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:38.707031 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:38.707316 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:38.707034 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:38.707316 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:38.707078 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls podName:e77a2c8d-6381-4990-b43d-cfa87c8c3fc4 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:29:46.707065042 +0000 UTC m=+48.668830040 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls") pod "dns-default-6jxzc" (UID: "e77a2c8d-6381-4990-b43d-cfa87c8c3fc4") : secret "dns-default-metrics-tls" not found Apr 24 21:29:38.707316 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:38.707091 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert podName:cc70ee65-8f01-4bad-adc7-98b5e7037c77 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:46.70708478 +0000 UTC m=+48.668849778 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert") pod "ingress-canary-z2pdw" (UID: "cc70ee65-8f01-4bad-adc7-98b5e7037c77") : secret "canary-serving-cert" not found Apr 24 21:29:46.759491 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:46.759458 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert\") pod \"ingress-canary-z2pdw\" (UID: \"cc70ee65-8f01-4bad-adc7-98b5e7037c77\") " pod="openshift-ingress-canary/ingress-canary-z2pdw" Apr 24 21:29:46.759946 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:46.759518 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:29:46.759946 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:46.759617 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:46.759946 ip-10-0-135-27 
kubenswrapper[2566]: E0424 21:29:46.759681 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert podName:cc70ee65-8f01-4bad-adc7-98b5e7037c77 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:02.75966686 +0000 UTC m=+64.721431859 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert") pod "ingress-canary-z2pdw" (UID: "cc70ee65-8f01-4bad-adc7-98b5e7037c77") : secret "canary-serving-cert" not found Apr 24 21:29:46.759946 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:46.759617 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:46.759946 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:29:46.759749 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls podName:e77a2c8d-6381-4990-b43d-cfa87c8c3fc4 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:02.75973803 +0000 UTC m=+64.721503033 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls") pod "dns-default-6jxzc" (UID: "e77a2c8d-6381-4990-b43d-cfa87c8c3fc4") : secret "dns-default-metrics-tls" not found Apr 24 21:29:56.353040 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.353006 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl"] Apr 24 21:29:56.396572 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.396544 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl"] Apr 24 21:29:56.396572 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.396570 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c456c5c55-czvgq"] Apr 24 21:29:56.396752 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.396683 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl" Apr 24 21:29:56.399670 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.399643 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 21:29:56.399930 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.399906 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 21:29:56.400941 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.400917 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 21:29:56.401179 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.401150 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 24 21:29:56.411489 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.411469 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c456c5c55-czvgq"] Apr 24 21:29:56.411586 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.411562 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c456c5c55-czvgq" Apr 24 21:29:56.414030 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.414012 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-gfffx\"" Apr 24 21:29:56.414149 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.414066 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 21:29:56.521894 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.521869 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/63ae173b-e91b-4aa4-9d26-c6f559d0dffd-klusterlet-config\") pod \"klusterlet-addon-workmgr-6cbcc68589-kxztl\" (UID: \"63ae173b-e91b-4aa4-9d26-c6f559d0dffd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl" Apr 24 21:29:56.521894 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.521897 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96cgt\" (UniqueName: \"kubernetes.io/projected/81a79417-8450-430b-abf3-c34740b1e420-kube-api-access-96cgt\") pod \"managed-serviceaccount-addon-agent-c456c5c55-czvgq\" (UID: \"81a79417-8450-430b-abf3-c34740b1e420\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c456c5c55-czvgq" Apr 24 21:29:56.522044 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.521929 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqc9z\" (UniqueName: \"kubernetes.io/projected/63ae173b-e91b-4aa4-9d26-c6f559d0dffd-kube-api-access-hqc9z\") pod \"klusterlet-addon-workmgr-6cbcc68589-kxztl\" (UID: \"63ae173b-e91b-4aa4-9d26-c6f559d0dffd\") 
" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl" Apr 24 21:29:56.522044 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.522010 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/81a79417-8450-430b-abf3-c34740b1e420-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-c456c5c55-czvgq\" (UID: \"81a79417-8450-430b-abf3-c34740b1e420\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c456c5c55-czvgq" Apr 24 21:29:56.522044 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.522042 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63ae173b-e91b-4aa4-9d26-c6f559d0dffd-tmp\") pod \"klusterlet-addon-workmgr-6cbcc68589-kxztl\" (UID: \"63ae173b-e91b-4aa4-9d26-c6f559d0dffd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl" Apr 24 21:29:56.622867 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.622810 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/81a79417-8450-430b-abf3-c34740b1e420-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-c456c5c55-czvgq\" (UID: \"81a79417-8450-430b-abf3-c34740b1e420\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c456c5c55-czvgq" Apr 24 21:29:56.622867 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.622841 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63ae173b-e91b-4aa4-9d26-c6f559d0dffd-tmp\") pod \"klusterlet-addon-workmgr-6cbcc68589-kxztl\" (UID: \"63ae173b-e91b-4aa4-9d26-c6f559d0dffd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl" Apr 24 21:29:56.622998 ip-10-0-135-27 
kubenswrapper[2566]: I0424 21:29:56.622869 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/63ae173b-e91b-4aa4-9d26-c6f559d0dffd-klusterlet-config\") pod \"klusterlet-addon-workmgr-6cbcc68589-kxztl\" (UID: \"63ae173b-e91b-4aa4-9d26-c6f559d0dffd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl" Apr 24 21:29:56.622998 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.622887 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96cgt\" (UniqueName: \"kubernetes.io/projected/81a79417-8450-430b-abf3-c34740b1e420-kube-api-access-96cgt\") pod \"managed-serviceaccount-addon-agent-c456c5c55-czvgq\" (UID: \"81a79417-8450-430b-abf3-c34740b1e420\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c456c5c55-czvgq" Apr 24 21:29:56.622998 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.622923 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqc9z\" (UniqueName: \"kubernetes.io/projected/63ae173b-e91b-4aa4-9d26-c6f559d0dffd-kube-api-access-hqc9z\") pod \"klusterlet-addon-workmgr-6cbcc68589-kxztl\" (UID: \"63ae173b-e91b-4aa4-9d26-c6f559d0dffd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl" Apr 24 21:29:56.623270 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.623246 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63ae173b-e91b-4aa4-9d26-c6f559d0dffd-tmp\") pod \"klusterlet-addon-workmgr-6cbcc68589-kxztl\" (UID: \"63ae173b-e91b-4aa4-9d26-c6f559d0dffd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl" Apr 24 21:29:56.626226 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.626197 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" 
(UniqueName: \"kubernetes.io/secret/63ae173b-e91b-4aa4-9d26-c6f559d0dffd-klusterlet-config\") pod \"klusterlet-addon-workmgr-6cbcc68589-kxztl\" (UID: \"63ae173b-e91b-4aa4-9d26-c6f559d0dffd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl" Apr 24 21:29:56.626226 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.626206 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/81a79417-8450-430b-abf3-c34740b1e420-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-c456c5c55-czvgq\" (UID: \"81a79417-8450-430b-abf3-c34740b1e420\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c456c5c55-czvgq" Apr 24 21:29:56.634706 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.634682 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqc9z\" (UniqueName: \"kubernetes.io/projected/63ae173b-e91b-4aa4-9d26-c6f559d0dffd-kube-api-access-hqc9z\") pod \"klusterlet-addon-workmgr-6cbcc68589-kxztl\" (UID: \"63ae173b-e91b-4aa4-9d26-c6f559d0dffd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl" Apr 24 21:29:56.635196 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.635179 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96cgt\" (UniqueName: \"kubernetes.io/projected/81a79417-8450-430b-abf3-c34740b1e420-kube-api-access-96cgt\") pod \"managed-serviceaccount-addon-agent-c456c5c55-czvgq\" (UID: \"81a79417-8450-430b-abf3-c34740b1e420\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c456c5c55-czvgq" Apr 24 21:29:56.707869 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.707848 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl"
Apr 24 21:29:56.730604 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.730577 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c456c5c55-czvgq"
Apr 24 21:29:56.760047 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.760025 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qccqq"
Apr 24 21:29:56.907738 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.907671 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c456c5c55-czvgq"]
Apr 24 21:29:56.908402 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:56.908377 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl"]
Apr 24 21:29:56.911341 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:29:56.911311 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63ae173b_e91b_4aa4_9d26_c6f559d0dffd.slice/crio-41780600d2dd2b392040bcb0c7c82580ade4208e79b3c438acdc8a6968e3d822 WatchSource:0}: Error finding container 41780600d2dd2b392040bcb0c7c82580ade4208e79b3c438acdc8a6968e3d822: Status 404 returned error can't find the container with id 41780600d2dd2b392040bcb0c7c82580ade4208e79b3c438acdc8a6968e3d822
Apr 24 21:29:56.911819 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:29:56.911798 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81a79417_8450_430b_abf3_c34740b1e420.slice/crio-88e2ba0a71f14035b3d22a27b1cda15e215a2f464033370853d3aa6dcdac92b3 WatchSource:0}: Error finding container 88e2ba0a71f14035b3d22a27b1cda15e215a2f464033370853d3aa6dcdac92b3: Status 404 returned error can't find the container with id 88e2ba0a71f14035b3d22a27b1cda15e215a2f464033370853d3aa6dcdac92b3
Apr 24 21:29:57.829268 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:57.829231 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c456c5c55-czvgq" event={"ID":"81a79417-8450-430b-abf3-c34740b1e420","Type":"ContainerStarted","Data":"88e2ba0a71f14035b3d22a27b1cda15e215a2f464033370853d3aa6dcdac92b3"}
Apr 24 21:29:57.832059 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:29:57.831988 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl" event={"ID":"63ae173b-e91b-4aa4-9d26-c6f559d0dffd","Type":"ContainerStarted","Data":"41780600d2dd2b392040bcb0c7c82580ade4208e79b3c438acdc8a6968e3d822"}
Apr 24 21:30:01.841475 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:01.841430 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c456c5c55-czvgq" event={"ID":"81a79417-8450-430b-abf3-c34740b1e420","Type":"ContainerStarted","Data":"3a9b38d70f3f9eb1242c3be736351f3fe04d82a2f2981d07d42efd1fbc0cd13f"}
Apr 24 21:30:01.842623 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:01.842601 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl" event={"ID":"63ae173b-e91b-4aa4-9d26-c6f559d0dffd","Type":"ContainerStarted","Data":"652c4c4138eb24958ce5bc5b28558ad2334d635c4e08f1b54190992c077a6bb9"}
Apr 24 21:30:01.842825 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:01.842808 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl"
Apr 24 21:30:01.844447 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:01.844428 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl"
Apr 24 21:30:01.858873 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:01.858832 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c456c5c55-czvgq" podStartSLOduration=3.055745254 podStartE2EDuration="5.85882128s" podCreationTimestamp="2026-04-24 21:29:56 +0000 UTC" firstStartedPulling="2026-04-24 21:29:56.91357183 +0000 UTC m=+58.875336828" lastFinishedPulling="2026-04-24 21:29:59.716647852 +0000 UTC m=+61.678412854" observedRunningTime="2026-04-24 21:30:01.857233494 +0000 UTC m=+63.818998538" watchObservedRunningTime="2026-04-24 21:30:01.85882128 +0000 UTC m=+63.820586320"
Apr 24 21:30:01.873826 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:01.873793 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cbcc68589-kxztl" podStartSLOduration=1.860221053 podStartE2EDuration="5.8737814s" podCreationTimestamp="2026-04-24 21:29:56 +0000 UTC" firstStartedPulling="2026-04-24 21:29:56.913055322 +0000 UTC m=+58.874820323" lastFinishedPulling="2026-04-24 21:30:00.926615671 +0000 UTC m=+62.888380670" observedRunningTime="2026-04-24 21:30:01.873097267 +0000 UTC m=+63.834862309" watchObservedRunningTime="2026-04-24 21:30:01.8737814 +0000 UTC m=+63.835546421"
Apr 24 21:30:02.772490 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:02.772451 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert\") pod \"ingress-canary-z2pdw\" (UID: \"cc70ee65-8f01-4bad-adc7-98b5e7037c77\") " pod="openshift-ingress-canary/ingress-canary-z2pdw"
Apr 24 21:30:02.772661 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:02.772505 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc"
Apr 24 21:30:02.772661 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:30:02.772595 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:30:02.772661 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:30:02.772598 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:30:02.772661 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:30:02.772643 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls podName:e77a2c8d-6381-4990-b43d-cfa87c8c3fc4 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:34.772631248 +0000 UTC m=+96.734396246 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls") pod "dns-default-6jxzc" (UID: "e77a2c8d-6381-4990-b43d-cfa87c8c3fc4") : secret "dns-default-metrics-tls" not found
Apr 24 21:30:02.772661 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:30:02.772661 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert podName:cc70ee65-8f01-4bad-adc7-98b5e7037c77 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:34.772648488 +0000 UTC m=+96.734413486 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert") pod "ingress-canary-z2pdw" (UID: "cc70ee65-8f01-4bad-adc7-98b5e7037c77") : secret "canary-serving-cert" not found
Apr 24 21:30:04.180867 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:04.180827 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs\") pod \"network-metrics-daemon-q52j5\" (UID: \"186d543e-d0f4-4e11-ac28-e8ebb35c72a2\") " pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:30:04.183488 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:04.183470 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:30:04.191714 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:30:04.191697 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:30:04.191768 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:30:04.191750 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs podName:186d543e-d0f4-4e11-ac28-e8ebb35c72a2 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:08.191736166 +0000 UTC m=+130.153501163 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs") pod "network-metrics-daemon-q52j5" (UID: "186d543e-d0f4-4e11-ac28-e8ebb35c72a2") : secret "metrics-daemon-secret" not found
Apr 24 21:30:04.383108 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:04.383079 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b75lj\" (UniqueName: \"kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj\") pod \"network-check-target-tpck9\" (UID: \"9e66f6e5-929b-4807-810a-ad84e15bb98f\") " pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:30:04.385710 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:04.385692 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:30:04.397003 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:04.396984 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:30:04.406984 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:04.406963 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b75lj\" (UniqueName: \"kubernetes.io/projected/9e66f6e5-929b-4807-810a-ad84e15bb98f-kube-api-access-b75lj\") pod \"network-check-target-tpck9\" (UID: \"9e66f6e5-929b-4807-810a-ad84e15bb98f\") " pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:30:04.654939 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:04.654917 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p6sxw\""
Apr 24 21:30:04.663112 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:04.663095 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:30:04.796344 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:04.796317 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tpck9"]
Apr 24 21:30:04.798999 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:30:04.798971 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e66f6e5_929b_4807_810a_ad84e15bb98f.slice/crio-9b276dac7a169b13debad989c7d0d9a637812fbef1c87b3642421836f64619af WatchSource:0}: Error finding container 9b276dac7a169b13debad989c7d0d9a637812fbef1c87b3642421836f64619af: Status 404 returned error can't find the container with id 9b276dac7a169b13debad989c7d0d9a637812fbef1c87b3642421836f64619af
Apr 24 21:30:04.848141 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:04.848113 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tpck9" event={"ID":"9e66f6e5-929b-4807-810a-ad84e15bb98f","Type":"ContainerStarted","Data":"9b276dac7a169b13debad989c7d0d9a637812fbef1c87b3642421836f64619af"}
Apr 24 21:30:07.854783 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:07.854752 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tpck9" event={"ID":"9e66f6e5-929b-4807-810a-ad84e15bb98f","Type":"ContainerStarted","Data":"faad5281d029852176cef915377e19ff82fe1dcf4f8d327019746a40e945bd17"}
Apr 24 21:30:07.855180 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:07.854900 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:30:07.875227 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:07.875189 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-tpck9" podStartSLOduration=67.226631054 podStartE2EDuration="1m9.87517684s" podCreationTimestamp="2026-04-24 21:28:58 +0000 UTC" firstStartedPulling="2026-04-24 21:30:04.80094177 +0000 UTC m=+66.762706775" lastFinishedPulling="2026-04-24 21:30:07.449487549 +0000 UTC m=+69.411252561" observedRunningTime="2026-04-24 21:30:07.874714919 +0000 UTC m=+69.836479940" watchObservedRunningTime="2026-04-24 21:30:07.87517684 +0000 UTC m=+69.836941861"
Apr 24 21:30:34.869625 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:34.869597 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc"
Apr 24 21:30:34.870024 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:34.869649 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert\") pod \"ingress-canary-z2pdw\" (UID: \"cc70ee65-8f01-4bad-adc7-98b5e7037c77\") " pod="openshift-ingress-canary/ingress-canary-z2pdw"
Apr 24 21:30:34.870024 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:30:34.869744 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:30:34.870024 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:30:34.869809 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls podName:e77a2c8d-6381-4990-b43d-cfa87c8c3fc4 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:38.869794922 +0000 UTC m=+160.831559921 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls") pod "dns-default-6jxzc" (UID: "e77a2c8d-6381-4990-b43d-cfa87c8c3fc4") : secret "dns-default-metrics-tls" not found
Apr 24 21:30:34.870024 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:30:34.869753 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:30:34.870024 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:30:34.869880 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert podName:cc70ee65-8f01-4bad-adc7-98b5e7037c77 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:38.869868669 +0000 UTC m=+160.831633667 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert") pod "ingress-canary-z2pdw" (UID: "cc70ee65-8f01-4bad-adc7-98b5e7037c77") : secret "canary-serving-cert" not found
Apr 24 21:30:38.859207 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:30:38.859174 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-tpck9"
Apr 24 21:31:08.283640 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:08.283596 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs\") pod \"network-metrics-daemon-q52j5\" (UID: \"186d543e-d0f4-4e11-ac28-e8ebb35c72a2\") " pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:31:08.284085 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:08.283716 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:31:08.284085 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:08.283781 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs podName:186d543e-d0f4-4e11-ac28-e8ebb35c72a2 nodeName:}" failed. No retries permitted until 2026-04-24 21:33:10.283764084 +0000 UTC m=+252.245529101 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs") pod "network-metrics-daemon-q52j5" (UID: "186d543e-d0f4-4e11-ac28-e8ebb35c72a2") : secret "metrics-daemon-secret" not found
Apr 24 21:31:13.658992 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.658954 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-rwc2z"]
Apr 24 21:31:13.661676 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.661661 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g"]
Apr 24 21:31:13.661814 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.661796 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
Apr 24 21:31:13.664246 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.664224 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj"]
Apr 24 21:31:13.664365 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.664335 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g"
Apr 24 21:31:13.664544 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.664521 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:31:13.664669 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.664620 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 21:31:13.664923 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.664906 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-nf5l4\""
Apr 24 21:31:13.665078 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.665065 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 21:31:13.665458 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.665441 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 21:31:13.666961 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.666944 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 24 21:31:13.667145 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.667128 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:31:13.667308 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.667282 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 24 21:31:13.667308 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.667197 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 24 21:31:13.667573 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.667554 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj"
Apr 24 21:31:13.668086 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.668067 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-kf247\""
Apr 24 21:31:13.670244 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.670212 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 24 21:31:13.670375 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.670248 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 24 21:31:13.670375 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.670299 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-2fg8f\""
Apr 24 21:31:13.670375 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.670333 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 21:31:13.670375 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.670218 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:31:13.673312 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.673293 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g"]
Apr 24 21:31:13.674480 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.674460 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-rwc2z"]
Apr 24 21:31:13.675170 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.675152 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj"]
Apr 24 21:31:13.717069 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.717044 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5da9727b-1328-4d04-8ab2-cda650280a23-config\") pod \"console-operator-9d4b6777b-rwc2z\" (UID: \"5da9727b-1328-4d04-8ab2-cda650280a23\") " pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
Apr 24 21:31:13.717069 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.717069 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/993d1bcd-a92a-4fbb-b948-a5b5eeca31e8-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9jd9g\" (UID: \"993d1bcd-a92a-4fbb-b948-a5b5eeca31e8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g"
Apr 24 21:31:13.717202 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.717109 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5da9727b-1328-4d04-8ab2-cda650280a23-trusted-ca\") pod \"console-operator-9d4b6777b-rwc2z\" (UID: \"5da9727b-1328-4d04-8ab2-cda650280a23\") " pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
Apr 24 21:31:13.717202 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.717128 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2cwt\" (UniqueName: \"kubernetes.io/projected/993d1bcd-a92a-4fbb-b948-a5b5eeca31e8-kube-api-access-t2cwt\") pod \"service-ca-operator-d6fc45fc5-9jd9g\" (UID: \"993d1bcd-a92a-4fbb-b948-a5b5eeca31e8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g"
Apr 24 21:31:13.717202 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.717148 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4ljq\" (UniqueName: \"kubernetes.io/projected/5da9727b-1328-4d04-8ab2-cda650280a23-kube-api-access-r4ljq\") pod \"console-operator-9d4b6777b-rwc2z\" (UID: \"5da9727b-1328-4d04-8ab2-cda650280a23\") " pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
Apr 24 21:31:13.717202 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.717179 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q2vx\" (UniqueName: \"kubernetes.io/projected/acc9bae7-d9ba-4805-b6a8-4148ae510868-kube-api-access-9q2vx\") pod \"cluster-samples-operator-6dc5bdb6b4-gwpbj\" (UID: \"acc9bae7-d9ba-4805-b6a8-4148ae510868\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj"
Apr 24 21:31:13.717324 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.717233 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5da9727b-1328-4d04-8ab2-cda650280a23-serving-cert\") pod \"console-operator-9d4b6777b-rwc2z\" (UID: \"5da9727b-1328-4d04-8ab2-cda650280a23\") " pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
Apr 24 21:31:13.717324 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.717254 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gwpbj\" (UID: \"acc9bae7-d9ba-4805-b6a8-4148ae510868\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj"
Apr 24 21:31:13.717324 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.717272 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/993d1bcd-a92a-4fbb-b948-a5b5eeca31e8-config\") pod \"service-ca-operator-d6fc45fc5-9jd9g\" (UID: \"993d1bcd-a92a-4fbb-b948-a5b5eeca31e8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g"
Apr 24 21:31:13.818339 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.818311 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5da9727b-1328-4d04-8ab2-cda650280a23-config\") pod \"console-operator-9d4b6777b-rwc2z\" (UID: \"5da9727b-1328-4d04-8ab2-cda650280a23\") " pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
Apr 24 21:31:13.818339 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.818342 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/993d1bcd-a92a-4fbb-b948-a5b5eeca31e8-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9jd9g\" (UID: \"993d1bcd-a92a-4fbb-b948-a5b5eeca31e8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g"
Apr 24 21:31:13.818513 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.818392 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5da9727b-1328-4d04-8ab2-cda650280a23-trusted-ca\") pod \"console-operator-9d4b6777b-rwc2z\" (UID: \"5da9727b-1328-4d04-8ab2-cda650280a23\") " pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
Apr 24 21:31:13.818513 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.818409 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2cwt\" (UniqueName: \"kubernetes.io/projected/993d1bcd-a92a-4fbb-b948-a5b5eeca31e8-kube-api-access-t2cwt\") pod \"service-ca-operator-d6fc45fc5-9jd9g\" (UID: \"993d1bcd-a92a-4fbb-b948-a5b5eeca31e8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g"
Apr 24 21:31:13.818513 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.818428 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4ljq\" (UniqueName: \"kubernetes.io/projected/5da9727b-1328-4d04-8ab2-cda650280a23-kube-api-access-r4ljq\") pod \"console-operator-9d4b6777b-rwc2z\" (UID: \"5da9727b-1328-4d04-8ab2-cda650280a23\") " pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
Apr 24 21:31:13.818513 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.818445 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9q2vx\" (UniqueName: \"kubernetes.io/projected/acc9bae7-d9ba-4805-b6a8-4148ae510868-kube-api-access-9q2vx\") pod \"cluster-samples-operator-6dc5bdb6b4-gwpbj\" (UID: \"acc9bae7-d9ba-4805-b6a8-4148ae510868\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj"
Apr 24 21:31:13.818707 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.818683 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5da9727b-1328-4d04-8ab2-cda650280a23-serving-cert\") pod \"console-operator-9d4b6777b-rwc2z\" (UID: \"5da9727b-1328-4d04-8ab2-cda650280a23\") " pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
Apr 24 21:31:13.818767 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.818738 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gwpbj\" (UID: \"acc9bae7-d9ba-4805-b6a8-4148ae510868\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj"
Apr 24 21:31:13.818819 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.818777 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/993d1bcd-a92a-4fbb-b948-a5b5eeca31e8-config\") pod \"service-ca-operator-d6fc45fc5-9jd9g\" (UID: \"993d1bcd-a92a-4fbb-b948-a5b5eeca31e8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g"
Apr 24 21:31:13.818867 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:13.818853 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:31:13.818939 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:13.818925 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls podName:acc9bae7-d9ba-4805-b6a8-4148ae510868 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:14.318903777 +0000 UTC m=+136.280668792 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gwpbj" (UID: "acc9bae7-d9ba-4805-b6a8-4148ae510868") : secret "samples-operator-tls" not found
Apr 24 21:31:13.819091 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.819072 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5da9727b-1328-4d04-8ab2-cda650280a23-config\") pod \"console-operator-9d4b6777b-rwc2z\" (UID: \"5da9727b-1328-4d04-8ab2-cda650280a23\") " pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
Apr 24 21:31:13.819236 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.819211 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5da9727b-1328-4d04-8ab2-cda650280a23-trusted-ca\") pod \"console-operator-9d4b6777b-rwc2z\" (UID: \"5da9727b-1328-4d04-8ab2-cda650280a23\") " pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
Apr 24 21:31:13.819330 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.819311 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/993d1bcd-a92a-4fbb-b948-a5b5eeca31e8-config\") pod \"service-ca-operator-d6fc45fc5-9jd9g\" (UID: \"993d1bcd-a92a-4fbb-b948-a5b5eeca31e8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g"
Apr 24 21:31:13.820721 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.820702 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/993d1bcd-a92a-4fbb-b948-a5b5eeca31e8-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9jd9g\" (UID: \"993d1bcd-a92a-4fbb-b948-a5b5eeca31e8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g"
Apr 24 21:31:13.820875 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.820857 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5da9727b-1328-4d04-8ab2-cda650280a23-serving-cert\") pod \"console-operator-9d4b6777b-rwc2z\" (UID: \"5da9727b-1328-4d04-8ab2-cda650280a23\") " pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
Apr 24 21:31:13.829261 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.829241 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4ljq\" (UniqueName: \"kubernetes.io/projected/5da9727b-1328-4d04-8ab2-cda650280a23-kube-api-access-r4ljq\") pod \"console-operator-9d4b6777b-rwc2z\" (UID: \"5da9727b-1328-4d04-8ab2-cda650280a23\") " pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
Apr 24 21:31:13.829436 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.829419 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2cwt\" (UniqueName: \"kubernetes.io/projected/993d1bcd-a92a-4fbb-b948-a5b5eeca31e8-kube-api-access-t2cwt\") pod \"service-ca-operator-d6fc45fc5-9jd9g\" (UID: \"993d1bcd-a92a-4fbb-b948-a5b5eeca31e8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g"
Apr 24 21:31:13.829688 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.829667 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q2vx\" (UniqueName: \"kubernetes.io/projected/acc9bae7-d9ba-4805-b6a8-4148ae510868-kube-api-access-9q2vx\") pod \"cluster-samples-operator-6dc5bdb6b4-gwpbj\" (UID: \"acc9bae7-d9ba-4805-b6a8-4148ae510868\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj"
Apr 24 21:31:13.972181 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.972125 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
Apr 24 21:31:13.978643 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:13.978622 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g"
Apr 24 21:31:14.101654 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:14.101624 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-rwc2z"]
Apr 24 21:31:14.104490 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:31:14.104462 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5da9727b_1328_4d04_8ab2_cda650280a23.slice/crio-0464fd697e6efde199097e31a38a1b2f1ccd43e4ba959d887157ff4ce982927f WatchSource:0}: Error finding container 0464fd697e6efde199097e31a38a1b2f1ccd43e4ba959d887157ff4ce982927f: Status 404 returned error can't find the container with id 0464fd697e6efde199097e31a38a1b2f1ccd43e4ba959d887157ff4ce982927f
Apr 24 21:31:14.119918 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:14.119882 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g"]
Apr 24 21:31:14.122815 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:31:14.122796 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod993d1bcd_a92a_4fbb_b948_a5b5eeca31e8.slice/crio-e0134f5d3c856c7d6ca69f1ddd3639e2e275bfcf72cb3cf1fa80ed0990fc9dc1 WatchSource:0}: Error finding container e0134f5d3c856c7d6ca69f1ddd3639e2e275bfcf72cb3cf1fa80ed0990fc9dc1: Status 404 returned error can't find the container with id e0134f5d3c856c7d6ca69f1ddd3639e2e275bfcf72cb3cf1fa80ed0990fc9dc1
Apr 24 21:31:14.323571 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:14.323544 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gwpbj\" (UID: \"acc9bae7-d9ba-4805-b6a8-4148ae510868\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj"
Apr 24 21:31:14.323699 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:14.323664 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:31:14.323761 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:14.323727 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls podName:acc9bae7-d9ba-4805-b6a8-4148ae510868 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:15.323709487 +0000 UTC m=+137.285474511 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gwpbj" (UID: "acc9bae7-d9ba-4805-b6a8-4148ae510868") : secret "samples-operator-tls" not found
Apr 24 21:31:14.979332 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:14.979279 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" event={"ID":"5da9727b-1328-4d04-8ab2-cda650280a23","Type":"ContainerStarted","Data":"0464fd697e6efde199097e31a38a1b2f1ccd43e4ba959d887157ff4ce982927f"}
Apr 24 21:31:14.980306 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:14.980277 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g" event={"ID":"993d1bcd-a92a-4fbb-b948-a5b5eeca31e8","Type":"ContainerStarted","Data":"e0134f5d3c856c7d6ca69f1ddd3639e2e275bfcf72cb3cf1fa80ed0990fc9dc1"}
Apr 24 21:31:15.332409 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:15.332373 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gwpbj\" (UID: \"acc9bae7-d9ba-4805-b6a8-4148ae510868\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj"
Apr 24 21:31:15.332581 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:15.332544 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:31:15.332632 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:15.332624 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls podName:acc9bae7-d9ba-4805-b6a8-4148ae510868 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:17.332603347 +0000 UTC m=+139.294368351 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gwpbj" (UID: "acc9bae7-d9ba-4805-b6a8-4148ae510868") : secret "samples-operator-tls" not found Apr 24 21:31:16.986888 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:16.986847 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g" event={"ID":"993d1bcd-a92a-4fbb-b948-a5b5eeca31e8","Type":"ContainerStarted","Data":"e479648573196c699d3c33e7b45f3510df44cad6ef25339cb0c15b36de34cd88"} Apr 24 21:31:16.988296 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:16.988277 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/0.log" Apr 24 21:31:16.988436 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:16.988318 2566 generic.go:358] "Generic (PLEG): container finished" podID="5da9727b-1328-4d04-8ab2-cda650280a23" containerID="b472f78ff5a5394d7d718081adbc8c3c21b55713fde795c719a26cca4b45f7f3" exitCode=255 Apr 24 21:31:16.988436 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:16.988365 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" event={"ID":"5da9727b-1328-4d04-8ab2-cda650280a23","Type":"ContainerDied","Data":"b472f78ff5a5394d7d718081adbc8c3c21b55713fde795c719a26cca4b45f7f3"} Apr 24 21:31:16.988601 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:16.988585 2566 scope.go:117] "RemoveContainer" containerID="b472f78ff5a5394d7d718081adbc8c3c21b55713fde795c719a26cca4b45f7f3" Apr 24 21:31:17.004279 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.004243 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g" 
podStartSLOduration=1.714789688 podStartE2EDuration="4.004230652s" podCreationTimestamp="2026-04-24 21:31:13 +0000 UTC" firstStartedPulling="2026-04-24 21:31:14.124263344 +0000 UTC m=+136.086028343" lastFinishedPulling="2026-04-24 21:31:16.413704306 +0000 UTC m=+138.375469307" observedRunningTime="2026-04-24 21:31:17.003445832 +0000 UTC m=+138.965210853" watchObservedRunningTime="2026-04-24 21:31:17.004230652 +0000 UTC m=+138.965995663" Apr 24 21:31:17.213434 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.213404 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-m789j"] Apr 24 21:31:17.216166 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.216151 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-m789j" Apr 24 21:31:17.218569 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.218549 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-7n4gd\"" Apr 24 21:31:17.225806 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.225786 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-m789j"] Apr 24 21:31:17.351955 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.351933 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gwpbj\" (UID: \"acc9bae7-d9ba-4805-b6a8-4148ae510868\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj" Apr 24 21:31:17.352091 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.351992 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzbhk\" 
(UniqueName: \"kubernetes.io/projected/1bf8df67-f18f-4c8d-816d-cd4a03327ba3-kube-api-access-wzbhk\") pod \"network-check-source-8894fc9bd-m789j\" (UID: \"1bf8df67-f18f-4c8d-816d-cd4a03327ba3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-m789j" Apr 24 21:31:17.352091 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:17.352069 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:31:17.352170 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:17.352154 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls podName:acc9bae7-d9ba-4805-b6a8-4148ae510868 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:21.352140211 +0000 UTC m=+143.313905209 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gwpbj" (UID: "acc9bae7-d9ba-4805-b6a8-4148ae510868") : secret "samples-operator-tls" not found Apr 24 21:31:17.452534 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.452509 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzbhk\" (UniqueName: \"kubernetes.io/projected/1bf8df67-f18f-4c8d-816d-cd4a03327ba3-kube-api-access-wzbhk\") pod \"network-check-source-8894fc9bd-m789j\" (UID: \"1bf8df67-f18f-4c8d-816d-cd4a03327ba3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-m789j" Apr 24 21:31:17.461190 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.461167 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzbhk\" (UniqueName: \"kubernetes.io/projected/1bf8df67-f18f-4c8d-816d-cd4a03327ba3-kube-api-access-wzbhk\") pod \"network-check-source-8894fc9bd-m789j\" (UID: 
\"1bf8df67-f18f-4c8d-816d-cd4a03327ba3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-m789j" Apr 24 21:31:17.524094 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.524066 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-m789j" Apr 24 21:31:17.654682 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.654661 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-m789j"] Apr 24 21:31:17.657567 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:31:17.657531 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bf8df67_f18f_4c8d_816d_cd4a03327ba3.slice/crio-5691bbfa118c60fdac3ec299714485a1e450eb768bd9fb483a33788345123e1b WatchSource:0}: Error finding container 5691bbfa118c60fdac3ec299714485a1e450eb768bd9fb483a33788345123e1b: Status 404 returned error can't find the container with id 5691bbfa118c60fdac3ec299714485a1e450eb768bd9fb483a33788345123e1b Apr 24 21:31:17.992285 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.992209 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/1.log" Apr 24 21:31:17.992725 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.992588 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/0.log" Apr 24 21:31:17.992725 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.992623 2566 generic.go:358] "Generic (PLEG): container finished" podID="5da9727b-1328-4d04-8ab2-cda650280a23" containerID="4dd00b0e552988ac97412cf36c25f74d4f37b2fe22a2d67f4d882d08680e569a" exitCode=255 Apr 24 21:31:17.992725 ip-10-0-135-27 kubenswrapper[2566]: I0424 
21:31:17.992656 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" event={"ID":"5da9727b-1328-4d04-8ab2-cda650280a23","Type":"ContainerDied","Data":"4dd00b0e552988ac97412cf36c25f74d4f37b2fe22a2d67f4d882d08680e569a"} Apr 24 21:31:17.992725 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.992700 2566 scope.go:117] "RemoveContainer" containerID="b472f78ff5a5394d7d718081adbc8c3c21b55713fde795c719a26cca4b45f7f3" Apr 24 21:31:17.992967 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.992949 2566 scope.go:117] "RemoveContainer" containerID="4dd00b0e552988ac97412cf36c25f74d4f37b2fe22a2d67f4d882d08680e569a" Apr 24 21:31:17.993151 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:17.993124 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-rwc2z_openshift-console-operator(5da9727b-1328-4d04-8ab2-cda650280a23)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" podUID="5da9727b-1328-4d04-8ab2-cda650280a23" Apr 24 21:31:17.994253 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.994231 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-m789j" event={"ID":"1bf8df67-f18f-4c8d-816d-cd4a03327ba3","Type":"ContainerStarted","Data":"89aff9affe31562ec4b374f764a33033c47d052ad5ecd573e73110f33b98153e"} Apr 24 21:31:17.994325 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:17.994260 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-m789j" event={"ID":"1bf8df67-f18f-4c8d-816d-cd4a03327ba3","Type":"ContainerStarted","Data":"5691bbfa118c60fdac3ec299714485a1e450eb768bd9fb483a33788345123e1b"} Apr 24 21:31:18.026501 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:18.026460 2566 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-m789j" podStartSLOduration=1.026448119 podStartE2EDuration="1.026448119s" podCreationTimestamp="2026-04-24 21:31:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:18.025544884 +0000 UTC m=+139.987309993" watchObservedRunningTime="2026-04-24 21:31:18.026448119 +0000 UTC m=+139.988213180" Apr 24 21:31:18.998496 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:18.998466 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/1.log" Apr 24 21:31:18.998976 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:18.998959 2566 scope.go:117] "RemoveContainer" containerID="4dd00b0e552988ac97412cf36c25f74d4f37b2fe22a2d67f4d882d08680e569a" Apr 24 21:31:18.999169 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:18.999149 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-rwc2z_openshift-console-operator(5da9727b-1328-4d04-8ab2-cda650280a23)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" podUID="5da9727b-1328-4d04-8ab2-cda650280a23" Apr 24 21:31:19.989187 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:19.989156 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-hsr9s"] Apr 24 21:31:19.991992 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:19.991976 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-hsr9s" Apr 24 21:31:19.994504 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:19.994479 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 21:31:19.994617 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:19.994482 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 21:31:19.995498 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:19.995474 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 21:31:19.995611 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:19.995554 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-lmzkj\"" Apr 24 21:31:19.996045 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:19.996030 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 21:31:20.002925 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:20.002899 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-hsr9s"] Apr 24 21:31:20.073927 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:20.073904 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e9d4a53f-2edd-474a-bf31-a1d84186b457-signing-key\") pod \"service-ca-865cb79987-hsr9s\" (UID: \"e9d4a53f-2edd-474a-bf31-a1d84186b457\") " pod="openshift-service-ca/service-ca-865cb79987-hsr9s" Apr 24 21:31:20.074046 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:20.073943 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfn7h\" (UniqueName: 
\"kubernetes.io/projected/e9d4a53f-2edd-474a-bf31-a1d84186b457-kube-api-access-wfn7h\") pod \"service-ca-865cb79987-hsr9s\" (UID: \"e9d4a53f-2edd-474a-bf31-a1d84186b457\") " pod="openshift-service-ca/service-ca-865cb79987-hsr9s" Apr 24 21:31:20.074046 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:20.073966 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e9d4a53f-2edd-474a-bf31-a1d84186b457-signing-cabundle\") pod \"service-ca-865cb79987-hsr9s\" (UID: \"e9d4a53f-2edd-474a-bf31-a1d84186b457\") " pod="openshift-service-ca/service-ca-865cb79987-hsr9s" Apr 24 21:31:20.175117 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:20.175086 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e9d4a53f-2edd-474a-bf31-a1d84186b457-signing-key\") pod \"service-ca-865cb79987-hsr9s\" (UID: \"e9d4a53f-2edd-474a-bf31-a1d84186b457\") " pod="openshift-service-ca/service-ca-865cb79987-hsr9s" Apr 24 21:31:20.175251 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:20.175131 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfn7h\" (UniqueName: \"kubernetes.io/projected/e9d4a53f-2edd-474a-bf31-a1d84186b457-kube-api-access-wfn7h\") pod \"service-ca-865cb79987-hsr9s\" (UID: \"e9d4a53f-2edd-474a-bf31-a1d84186b457\") " pod="openshift-service-ca/service-ca-865cb79987-hsr9s" Apr 24 21:31:20.175251 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:20.175167 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e9d4a53f-2edd-474a-bf31-a1d84186b457-signing-cabundle\") pod \"service-ca-865cb79987-hsr9s\" (UID: \"e9d4a53f-2edd-474a-bf31-a1d84186b457\") " pod="openshift-service-ca/service-ca-865cb79987-hsr9s" Apr 24 21:31:20.175797 ip-10-0-135-27 kubenswrapper[2566]: I0424 
21:31:20.175776 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e9d4a53f-2edd-474a-bf31-a1d84186b457-signing-cabundle\") pod \"service-ca-865cb79987-hsr9s\" (UID: \"e9d4a53f-2edd-474a-bf31-a1d84186b457\") " pod="openshift-service-ca/service-ca-865cb79987-hsr9s" Apr 24 21:31:20.177422 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:20.177405 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e9d4a53f-2edd-474a-bf31-a1d84186b457-signing-key\") pod \"service-ca-865cb79987-hsr9s\" (UID: \"e9d4a53f-2edd-474a-bf31-a1d84186b457\") " pod="openshift-service-ca/service-ca-865cb79987-hsr9s" Apr 24 21:31:20.184040 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:20.184018 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfn7h\" (UniqueName: \"kubernetes.io/projected/e9d4a53f-2edd-474a-bf31-a1d84186b457-kube-api-access-wfn7h\") pod \"service-ca-865cb79987-hsr9s\" (UID: \"e9d4a53f-2edd-474a-bf31-a1d84186b457\") " pod="openshift-service-ca/service-ca-865cb79987-hsr9s" Apr 24 21:31:20.301223 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:20.301200 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-hsr9s" Apr 24 21:31:20.416172 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:20.416143 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-hsr9s"] Apr 24 21:31:20.419473 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:31:20.419445 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d4a53f_2edd_474a_bf31_a1d84186b457.slice/crio-fdf37be7058f08a5a15d0863d08ac2a3483cede9c12624ede90271e195d482db WatchSource:0}: Error finding container fdf37be7058f08a5a15d0863d08ac2a3483cede9c12624ede90271e195d482db: Status 404 returned error can't find the container with id fdf37be7058f08a5a15d0863d08ac2a3483cede9c12624ede90271e195d482db Apr 24 21:31:20.839517 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:20.839487 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-88hpr_4ca03565-9d72-4813-9179-7636908c9bf5/dns-node-resolver/0.log" Apr 24 21:31:21.005119 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:21.005087 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-hsr9s" event={"ID":"e9d4a53f-2edd-474a-bf31-a1d84186b457","Type":"ContainerStarted","Data":"289adc7eda244dd1f1936b7a3f0a65c292b9e5236502d7a439f9156d31a78d13"} Apr 24 21:31:21.005460 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:21.005125 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-hsr9s" event={"ID":"e9d4a53f-2edd-474a-bf31-a1d84186b457","Type":"ContainerStarted","Data":"fdf37be7058f08a5a15d0863d08ac2a3483cede9c12624ede90271e195d482db"} Apr 24 21:31:21.025242 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:21.025202 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-hsr9s" podStartSLOduration=2.025186332 
podStartE2EDuration="2.025186332s" podCreationTimestamp="2026-04-24 21:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:21.024202142 +0000 UTC m=+142.985967217" watchObservedRunningTime="2026-04-24 21:31:21.025186332 +0000 UTC m=+142.986951352" Apr 24 21:31:21.382971 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:21.382928 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gwpbj\" (UID: \"acc9bae7-d9ba-4805-b6a8-4148ae510868\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj" Apr 24 21:31:21.383146 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:21.383092 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:31:21.383193 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:21.383182 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls podName:acc9bae7-d9ba-4805-b6a8-4148ae510868 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:29.383163382 +0000 UTC m=+151.344928395 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gwpbj" (UID: "acc9bae7-d9ba-4805-b6a8-4148ae510868") : secret "samples-operator-tls" not found Apr 24 21:31:21.839481 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:21.839438 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4rz25_918a384b-26d7-496e-b3c9-370b6e526ebd/node-ca/0.log" Apr 24 21:31:23.972407 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:23.972370 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" Apr 24 21:31:23.972407 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:23.972410 2566 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" Apr 24 21:31:23.972886 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:23.972730 2566 scope.go:117] "RemoveContainer" containerID="4dd00b0e552988ac97412cf36c25f74d4f37b2fe22a2d67f4d882d08680e569a" Apr 24 21:31:23.972951 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:23.972893 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-rwc2z_openshift-console-operator(5da9727b-1328-4d04-8ab2-cda650280a23)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" podUID="5da9727b-1328-4d04-8ab2-cda650280a23" Apr 24 21:31:29.445398 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:29.445335 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls\") pod 
\"cluster-samples-operator-6dc5bdb6b4-gwpbj\" (UID: \"acc9bae7-d9ba-4805-b6a8-4148ae510868\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj" Apr 24 21:31:29.447686 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:29.447664 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/acc9bae7-d9ba-4805-b6a8-4148ae510868-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gwpbj\" (UID: \"acc9bae7-d9ba-4805-b6a8-4148ae510868\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj" Apr 24 21:31:29.583831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:29.583801 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj" Apr 24 21:31:29.704327 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:29.704153 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj"] Apr 24 21:31:30.026616 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:30.026575 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj" event={"ID":"acc9bae7-d9ba-4805-b6a8-4148ae510868","Type":"ContainerStarted","Data":"d3379dd1b5c918be6a4e29578846a2da068f435c4a14613a4568d3c7d111dfe8"} Apr 24 21:31:32.032980 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:32.032940 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj" event={"ID":"acc9bae7-d9ba-4805-b6a8-4148ae510868","Type":"ContainerStarted","Data":"e520ead76578a6bd121c58c062eca40fd728fa7fdaceac983a8cec194b417453"} Apr 24 21:31:32.032980 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:32.032984 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj" event={"ID":"acc9bae7-d9ba-4805-b6a8-4148ae510868","Type":"ContainerStarted","Data":"c23cca4a05f70ad37f70167be0f572153758a54371b950d1606d875b204c4c98"} Apr 24 21:31:32.055308 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:32.055253 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gwpbj" podStartSLOduration=17.472569803 podStartE2EDuration="19.055239868s" podCreationTimestamp="2026-04-24 21:31:13 +0000 UTC" firstStartedPulling="2026-04-24 21:31:29.744908177 +0000 UTC m=+151.706673179" lastFinishedPulling="2026-04-24 21:31:31.327578231 +0000 UTC m=+153.289343244" observedRunningTime="2026-04-24 21:31:32.054435901 +0000 UTC m=+154.016200922" watchObservedRunningTime="2026-04-24 21:31:32.055239868 +0000 UTC m=+154.017004886" Apr 24 21:31:33.933896 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:33.933862 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-z2pdw" podUID="cc70ee65-8f01-4bad-adc7-98b5e7037c77" Apr 24 21:31:33.940045 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:33.940013 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6jxzc" podUID="e77a2c8d-6381-4990-b43d-cfa87c8c3fc4" Apr 24 21:31:34.037215 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:34.037185 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z2pdw" Apr 24 21:31:34.037341 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:34.037188 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6jxzc" Apr 24 21:31:35.535244 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:35.535214 2566 scope.go:117] "RemoveContainer" containerID="4dd00b0e552988ac97412cf36c25f74d4f37b2fe22a2d67f4d882d08680e569a" Apr 24 21:31:35.546641 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:35.546616 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-q52j5" podUID="186d543e-d0f4-4e11-ac28-e8ebb35c72a2" Apr 24 21:31:36.042325 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:36.042300 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/2.log" Apr 24 21:31:36.042716 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:36.042700 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/1.log" Apr 24 21:31:36.042788 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:36.042731 2566 generic.go:358] "Generic (PLEG): container finished" podID="5da9727b-1328-4d04-8ab2-cda650280a23" containerID="e464c319d12d85b619dcd6e9d243ded927ffb8063be9258632a3bd1678c96f7d" exitCode=255 Apr 24 21:31:36.042827 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:36.042791 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" event={"ID":"5da9727b-1328-4d04-8ab2-cda650280a23","Type":"ContainerDied","Data":"e464c319d12d85b619dcd6e9d243ded927ffb8063be9258632a3bd1678c96f7d"} Apr 24 21:31:36.042827 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:36.042821 2566 scope.go:117] "RemoveContainer" containerID="4dd00b0e552988ac97412cf36c25f74d4f37b2fe22a2d67f4d882d08680e569a" Apr 24 21:31:36.043106 
ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:36.043085 2566 scope.go:117] "RemoveContainer" containerID="e464c319d12d85b619dcd6e9d243ded927ffb8063be9258632a3bd1678c96f7d" Apr 24 21:31:36.043282 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:36.043260 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-rwc2z_openshift-console-operator(5da9727b-1328-4d04-8ab2-cda650280a23)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" podUID="5da9727b-1328-4d04-8ab2-cda650280a23" Apr 24 21:31:37.045892 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:37.045864 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/2.log" Apr 24 21:31:38.911912 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:38.911831 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls\") pod \"dns-default-6jxzc\" (UID: \"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:31:38.911912 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:38.911881 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert\") pod \"ingress-canary-z2pdw\" (UID: \"cc70ee65-8f01-4bad-adc7-98b5e7037c77\") " pod="openshift-ingress-canary/ingress-canary-z2pdw" Apr 24 21:31:38.914096 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:38.914074 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e77a2c8d-6381-4990-b43d-cfa87c8c3fc4-metrics-tls\") pod \"dns-default-6jxzc\" (UID: 
\"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4\") " pod="openshift-dns/dns-default-6jxzc" Apr 24 21:31:38.914198 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:38.914181 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc70ee65-8f01-4bad-adc7-98b5e7037c77-cert\") pod \"ingress-canary-z2pdw\" (UID: \"cc70ee65-8f01-4bad-adc7-98b5e7037c77\") " pod="openshift-ingress-canary/ingress-canary-z2pdw" Apr 24 21:31:39.140645 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:39.140619 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-sbjvw\"" Apr 24 21:31:39.141521 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:39.141500 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kxf55\"" Apr 24 21:31:39.148720 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:39.148703 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z2pdw" Apr 24 21:31:39.148818 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:39.148723 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6jxzc" Apr 24 21:31:39.272554 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:39.272523 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z2pdw"] Apr 24 21:31:39.276240 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:31:39.276210 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc70ee65_8f01_4bad_adc7_98b5e7037c77.slice/crio-a17a94ef6946d1df8068ad72dc59c271353d2ced55522114d317fbad04be4894 WatchSource:0}: Error finding container a17a94ef6946d1df8068ad72dc59c271353d2ced55522114d317fbad04be4894: Status 404 returned error can't find the container with id a17a94ef6946d1df8068ad72dc59c271353d2ced55522114d317fbad04be4894 Apr 24 21:31:39.286747 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:39.286725 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6jxzc"] Apr 24 21:31:39.289228 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:31:39.289204 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode77a2c8d_6381_4990_b43d_cfa87c8c3fc4.slice/crio-fa8fd31920a6b995966acd57a6640cfd692980ae9192b9dcc5d377b8bad4fdf5 WatchSource:0}: Error finding container fa8fd31920a6b995966acd57a6640cfd692980ae9192b9dcc5d377b8bad4fdf5: Status 404 returned error can't find the container with id fa8fd31920a6b995966acd57a6640cfd692980ae9192b9dcc5d377b8bad4fdf5 Apr 24 21:31:40.054416 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:40.054373 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z2pdw" event={"ID":"cc70ee65-8f01-4bad-adc7-98b5e7037c77","Type":"ContainerStarted","Data":"a17a94ef6946d1df8068ad72dc59c271353d2ced55522114d317fbad04be4894"} Apr 24 21:31:40.055943 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:40.055901 2566 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns/dns-default-6jxzc" event={"ID":"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4","Type":"ContainerStarted","Data":"fa8fd31920a6b995966acd57a6640cfd692980ae9192b9dcc5d377b8bad4fdf5"} Apr 24 21:31:42.062832 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:42.062800 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6jxzc" event={"ID":"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4","Type":"ContainerStarted","Data":"4e2bd84e2569d52364ea4bdcd3a85cdb4ca6ed822a56cd38c413b92c674b331c"} Apr 24 21:31:42.062832 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:42.062838 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6jxzc" event={"ID":"e77a2c8d-6381-4990-b43d-cfa87c8c3fc4","Type":"ContainerStarted","Data":"4f404cf1f4612f0fc89dd35722701d5220b41c88598661261528db2a99dee6b6"} Apr 24 21:31:42.063342 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:42.062883 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6jxzc" Apr 24 21:31:42.064024 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:42.064004 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z2pdw" event={"ID":"cc70ee65-8f01-4bad-adc7-98b5e7037c77","Type":"ContainerStarted","Data":"6359c931e43def6d1cdd6ffb234d9f58ff57c4bb73d19b87acebb7c3b8ca51ce"} Apr 24 21:31:42.083338 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:42.083298 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6jxzc" podStartSLOduration=130.277246205 podStartE2EDuration="2m12.083287995s" podCreationTimestamp="2026-04-24 21:29:30 +0000 UTC" firstStartedPulling="2026-04-24 21:31:39.290915214 +0000 UTC m=+161.252680212" lastFinishedPulling="2026-04-24 21:31:41.096956994 +0000 UTC m=+163.058722002" observedRunningTime="2026-04-24 21:31:42.083136817 +0000 UTC m=+164.044901838" watchObservedRunningTime="2026-04-24 
21:31:42.083287995 +0000 UTC m=+164.045053014" Apr 24 21:31:42.106786 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:42.106737 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-z2pdw" podStartSLOduration=130.284359555 podStartE2EDuration="2m12.106726876s" podCreationTimestamp="2026-04-24 21:29:30 +0000 UTC" firstStartedPulling="2026-04-24 21:31:39.27828535 +0000 UTC m=+161.240050362" lastFinishedPulling="2026-04-24 21:31:41.100652669 +0000 UTC m=+163.062417683" observedRunningTime="2026-04-24 21:31:42.106255043 +0000 UTC m=+164.068020064" watchObservedRunningTime="2026-04-24 21:31:42.106726876 +0000 UTC m=+164.068491895" Apr 24 21:31:43.972386 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:43.972346 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" Apr 24 21:31:43.972734 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:43.972430 2566 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" Apr 24 21:31:43.972734 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:43.972649 2566 scope.go:117] "RemoveContainer" containerID="e464c319d12d85b619dcd6e9d243ded927ffb8063be9258632a3bd1678c96f7d" Apr 24 21:31:43.972804 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:43.972788 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-rwc2z_openshift-console-operator(5da9727b-1328-4d04-8ab2-cda650280a23)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" podUID="5da9727b-1328-4d04-8ab2-cda650280a23" Apr 24 21:31:44.069563 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:44.069538 2566 scope.go:117] "RemoveContainer" 
containerID="e464c319d12d85b619dcd6e9d243ded927ffb8063be9258632a3bd1678c96f7d" Apr 24 21:31:44.069706 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:44.069689 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-rwc2z_openshift-console-operator(5da9727b-1328-4d04-8ab2-cda650280a23)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" podUID="5da9727b-1328-4d04-8ab2-cda650280a23" Apr 24 21:31:46.476616 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.476583 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2bz6f"] Apr 24 21:31:46.479927 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.479912 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.483555 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.483529 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-r5h69\"" Apr 24 21:31:46.483679 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.483538 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:31:46.483679 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.483582 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:31:46.483679 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.483538 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:31:46.483679 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.483607 2566 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:31:46.497667 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.497647 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2bz6f"] Apr 24 21:31:46.567499 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.567466 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4b511ab9-78fa-467d-88fa-1910c1b4f6bd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2bz6f\" (UID: \"4b511ab9-78fa-467d-88fa-1910c1b4f6bd\") " pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.567499 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.567497 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bn4b\" (UniqueName: \"kubernetes.io/projected/4b511ab9-78fa-467d-88fa-1910c1b4f6bd-kube-api-access-7bn4b\") pod \"insights-runtime-extractor-2bz6f\" (UID: \"4b511ab9-78fa-467d-88fa-1910c1b4f6bd\") " pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.567686 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.567526 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4b511ab9-78fa-467d-88fa-1910c1b4f6bd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2bz6f\" (UID: \"4b511ab9-78fa-467d-88fa-1910c1b4f6bd\") " pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.567686 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.567604 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4b511ab9-78fa-467d-88fa-1910c1b4f6bd-crio-socket\") pod \"insights-runtime-extractor-2bz6f\" (UID: 
\"4b511ab9-78fa-467d-88fa-1910c1b4f6bd\") " pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.567686 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.567662 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4b511ab9-78fa-467d-88fa-1910c1b4f6bd-data-volume\") pod \"insights-runtime-extractor-2bz6f\" (UID: \"4b511ab9-78fa-467d-88fa-1910c1b4f6bd\") " pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.575273 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.575243 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-77758d8cd8-5h489"] Apr 24 21:31:46.578208 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.578188 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.582692 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.582664 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 21:31:46.583200 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.583174 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 21:31:46.584086 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.584064 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ptjx6\"" Apr 24 21:31:46.584737 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.584714 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 21:31:46.591086 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.591062 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 21:31:46.591883 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.591861 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-77758d8cd8-5h489"] Apr 24 21:31:46.668831 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.668799 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bn4b\" (UniqueName: \"kubernetes.io/projected/4b511ab9-78fa-467d-88fa-1910c1b4f6bd-kube-api-access-7bn4b\") pod \"insights-runtime-extractor-2bz6f\" (UID: \"4b511ab9-78fa-467d-88fa-1910c1b4f6bd\") " pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.668992 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.668838 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-bound-sa-token\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.668992 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.668860 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-installation-pull-secrets\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.668992 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.668905 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmpsk\" (UniqueName: \"kubernetes.io/projected/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-kube-api-access-lmpsk\") pod \"image-registry-77758d8cd8-5h489\" (UID: 
\"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.669134 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.669110 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4b511ab9-78fa-467d-88fa-1910c1b4f6bd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2bz6f\" (UID: \"4b511ab9-78fa-467d-88fa-1910c1b4f6bd\") " pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.669193 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.669176 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-registry-certificates\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.669259 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.669246 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4b511ab9-78fa-467d-88fa-1910c1b4f6bd-data-volume\") pod \"insights-runtime-extractor-2bz6f\" (UID: \"4b511ab9-78fa-467d-88fa-1910c1b4f6bd\") " pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.669312 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.669274 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-image-registry-private-configuration\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.669394 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.669313 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4b511ab9-78fa-467d-88fa-1910c1b4f6bd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2bz6f\" (UID: \"4b511ab9-78fa-467d-88fa-1910c1b4f6bd\") " pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.669394 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.669339 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-ca-trust-extracted\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.669508 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.669394 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-trusted-ca\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.669508 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.669463 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4b511ab9-78fa-467d-88fa-1910c1b4f6bd-crio-socket\") pod \"insights-runtime-extractor-2bz6f\" (UID: \"4b511ab9-78fa-467d-88fa-1910c1b4f6bd\") " pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.669508 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.669496 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-registry-tls\") pod 
\"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.669648 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.669558 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4b511ab9-78fa-467d-88fa-1910c1b4f6bd-crio-socket\") pod \"insights-runtime-extractor-2bz6f\" (UID: \"4b511ab9-78fa-467d-88fa-1910c1b4f6bd\") " pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.669723 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.669682 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4b511ab9-78fa-467d-88fa-1910c1b4f6bd-data-volume\") pod \"insights-runtime-extractor-2bz6f\" (UID: \"4b511ab9-78fa-467d-88fa-1910c1b4f6bd\") " pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.669857 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.669837 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4b511ab9-78fa-467d-88fa-1910c1b4f6bd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2bz6f\" (UID: \"4b511ab9-78fa-467d-88fa-1910c1b4f6bd\") " pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.671761 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.671741 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4b511ab9-78fa-467d-88fa-1910c1b4f6bd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2bz6f\" (UID: \"4b511ab9-78fa-467d-88fa-1910c1b4f6bd\") " pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.696368 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.695410 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7bn4b\" (UniqueName: \"kubernetes.io/projected/4b511ab9-78fa-467d-88fa-1910c1b4f6bd-kube-api-access-7bn4b\") pod \"insights-runtime-extractor-2bz6f\" (UID: \"4b511ab9-78fa-467d-88fa-1910c1b4f6bd\") " pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.770826 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.770793 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-registry-certificates\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.771001 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.770844 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-image-registry-private-configuration\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.771001 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.770884 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-ca-trust-extracted\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.771001 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.770914 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-trusted-ca\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " 
pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.771001 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.770942 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-registry-tls\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.771001 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.770987 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-bound-sa-token\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.771247 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.771017 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-installation-pull-secrets\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.771247 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.771043 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmpsk\" (UniqueName: \"kubernetes.io/projected/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-kube-api-access-lmpsk\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.771408 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.771383 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-ca-trust-extracted\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.771743 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.771721 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-registry-certificates\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.771850 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.771803 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-trusted-ca\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.773557 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.773534 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-installation-pull-secrets\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.773653 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.773563 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-image-registry-private-configuration\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 
21:31:46.774269 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.774249 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-registry-tls\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.780622 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.780593 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-bound-sa-token\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.780999 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.780981 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmpsk\" (UniqueName: \"kubernetes.io/projected/ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b-kube-api-access-lmpsk\") pod \"image-registry-77758d8cd8-5h489\" (UID: \"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b\") " pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.788770 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.788750 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2bz6f" Apr 24 21:31:46.889233 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.889198 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:31:46.909167 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:46.909070 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2bz6f"] Apr 24 21:31:46.911619 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:31:46.911587 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b511ab9_78fa_467d_88fa_1910c1b4f6bd.slice/crio-cf9b6556c560ea51bdecff347aafe5a3185f895dd5222907ff93e6e491c8d9e2 WatchSource:0}: Error finding container cf9b6556c560ea51bdecff347aafe5a3185f895dd5222907ff93e6e491c8d9e2: Status 404 returned error can't find the container with id cf9b6556c560ea51bdecff347aafe5a3185f895dd5222907ff93e6e491c8d9e2 Apr 24 21:31:47.016968 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:47.016939 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-77758d8cd8-5h489"] Apr 24 21:31:47.020160 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:31:47.020133 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee2da17b_d0d2_42ab_9cbc_b5b16c6c971b.slice/crio-b5006b8bc7f47041f9739a3bd6304fca8db992bb8f4eec7fa7d8722d0d0ca7d2 WatchSource:0}: Error finding container b5006b8bc7f47041f9739a3bd6304fca8db992bb8f4eec7fa7d8722d0d0ca7d2: Status 404 returned error can't find the container with id b5006b8bc7f47041f9739a3bd6304fca8db992bb8f4eec7fa7d8722d0d0ca7d2 Apr 24 21:31:47.083269 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:47.083243 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77758d8cd8-5h489" event={"ID":"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b","Type":"ContainerStarted","Data":"b5006b8bc7f47041f9739a3bd6304fca8db992bb8f4eec7fa7d8722d0d0ca7d2"} Apr 24 21:31:47.084470 ip-10-0-135-27 
kubenswrapper[2566]: I0424 21:31:47.084448 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2bz6f" event={"ID":"4b511ab9-78fa-467d-88fa-1910c1b4f6bd","Type":"ContainerStarted","Data":"8cf807466e0519492c48784176dfaa8eb0927727bd323edd3a17757e78d4a7d1"}
Apr 24 21:31:47.084564 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:47.084479 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2bz6f" event={"ID":"4b511ab9-78fa-467d-88fa-1910c1b4f6bd","Type":"ContainerStarted","Data":"cf9b6556c560ea51bdecff347aafe5a3185f895dd5222907ff93e6e491c8d9e2"}
Apr 24 21:31:48.090490 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:48.090456 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77758d8cd8-5h489" event={"ID":"ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b","Type":"ContainerStarted","Data":"7ddba08f47603d52e36944cc43dda6edc7228cd4d09c5c3b7b9c19e502accb9f"}
Apr 24 21:31:48.090956 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:48.090544 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-77758d8cd8-5h489"
Apr 24 21:31:48.092072 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:48.092049 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2bz6f" event={"ID":"4b511ab9-78fa-467d-88fa-1910c1b4f6bd","Type":"ContainerStarted","Data":"1e38f10fad2ced88cc72eb808df0faed8e1c76061c292ba42d33d52f08f952e1"}
Apr 24 21:31:48.112574 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:48.112537 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-77758d8cd8-5h489" podStartSLOduration=2.112523789 podStartE2EDuration="2.112523789s" podCreationTimestamp="2026-04-24 21:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:48.112521975 +0000 UTC m=+170.074287005" watchObservedRunningTime="2026-04-24 21:31:48.112523789 +0000 UTC m=+170.074288812"
Apr 24 21:31:50.099473 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:50.099437 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2bz6f" event={"ID":"4b511ab9-78fa-467d-88fa-1910c1b4f6bd","Type":"ContainerStarted","Data":"ae3190264054def6efa6f993a2693003194165afa5a5f370ad4977da0d758a08"}
Apr 24 21:31:50.122997 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:50.122950 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2bz6f" podStartSLOduration=2.06075193 podStartE2EDuration="4.12293832s" podCreationTimestamp="2026-04-24 21:31:46 +0000 UTC" firstStartedPulling="2026-04-24 21:31:46.988467723 +0000 UTC m=+168.950232728" lastFinishedPulling="2026-04-24 21:31:49.05065412 +0000 UTC m=+171.012419118" observedRunningTime="2026-04-24 21:31:50.122196453 +0000 UTC m=+172.083961485" watchObservedRunningTime="2026-04-24 21:31:50.12293832 +0000 UTC m=+172.084703323"
Apr 24 21:31:50.535112 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:50.535080 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5"
Apr 24 21:31:52.068960 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:52.068932 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6jxzc"
Apr 24 21:31:55.296927 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.296892 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-vhvw5"]
Apr 24 21:31:55.302182 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.302162 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.305366 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.305331 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 21:31:55.305666 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.305647 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 21:31:55.306312 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.306292 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 21:31:55.306456 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.306432 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 21:31:55.306564 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.306299 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-zmm79\""
Apr 24 21:31:55.306564 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.306330 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 21:31:55.306720 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.306325 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 21:31:55.434673 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.434651 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7403fa15-7ed1-496e-83fd-fd72a2f75042-node-exporter-tls\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.434673 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.434676 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7403fa15-7ed1-496e-83fd-fd72a2f75042-node-exporter-wtmp\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.434854 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.434692 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7403fa15-7ed1-496e-83fd-fd72a2f75042-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.434854 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.434710 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7403fa15-7ed1-496e-83fd-fd72a2f75042-sys\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.434928 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.434843 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7403fa15-7ed1-496e-83fd-fd72a2f75042-root\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.434928 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.434884 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7403fa15-7ed1-496e-83fd-fd72a2f75042-node-exporter-accelerators-collector-config\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.434928 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.434915 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqsnv\" (UniqueName: \"kubernetes.io/projected/7403fa15-7ed1-496e-83fd-fd72a2f75042-kube-api-access-bqsnv\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.435031 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.434958 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7403fa15-7ed1-496e-83fd-fd72a2f75042-metrics-client-ca\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.435031 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.435012 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7403fa15-7ed1-496e-83fd-fd72a2f75042-node-exporter-textfile\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.535275 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.535251 2566 scope.go:117] "RemoveContainer" containerID="e464c319d12d85b619dcd6e9d243ded927ffb8063be9258632a3bd1678c96f7d"
Apr 24 21:31:55.535419 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.535359 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName:
\"kubernetes.io/configmap/7403fa15-7ed1-496e-83fd-fd72a2f75042-metrics-client-ca\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.535419 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.535411 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7403fa15-7ed1-496e-83fd-fd72a2f75042-node-exporter-textfile\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.535532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.535442 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7403fa15-7ed1-496e-83fd-fd72a2f75042-node-exporter-tls\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.535532 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:31:55.535452 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-rwc2z_openshift-console-operator(5da9727b-1328-4d04-8ab2-cda650280a23)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" podUID="5da9727b-1328-4d04-8ab2-cda650280a23"
Apr 24 21:31:55.535532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.535466 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7403fa15-7ed1-496e-83fd-fd72a2f75042-node-exporter-wtmp\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.535532 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.535493 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7403fa15-7ed1-496e-83fd-fd72a2f75042-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.535718 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.535631 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7403fa15-7ed1-496e-83fd-fd72a2f75042-node-exporter-wtmp\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.535718 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.535678 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7403fa15-7ed1-496e-83fd-fd72a2f75042-sys\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.535815 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.535740 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7403fa15-7ed1-496e-83fd-fd72a2f75042-node-exporter-textfile\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.535815 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.535748 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7403fa15-7ed1-496e-83fd-fd72a2f75042-root\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.535815 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.535775 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7403fa15-7ed1-496e-83fd-fd72a2f75042-node-exporter-accelerators-collector-config\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.535815 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.535809 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqsnv\" (UniqueName: \"kubernetes.io/projected/7403fa15-7ed1-496e-83fd-fd72a2f75042-kube-api-access-bqsnv\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.536001 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.535828 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7403fa15-7ed1-496e-83fd-fd72a2f75042-root\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.536001 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.535810 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7403fa15-7ed1-496e-83fd-fd72a2f75042-sys\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.536001 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.535934 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7403fa15-7ed1-496e-83fd-fd72a2f75042-metrics-client-ca\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.536284 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.536262 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7403fa15-7ed1-496e-83fd-fd72a2f75042-node-exporter-accelerators-collector-config\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.538102 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.538084 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7403fa15-7ed1-496e-83fd-fd72a2f75042-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.538575 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.538554 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7403fa15-7ed1-496e-83fd-fd72a2f75042-node-exporter-tls\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.544059 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.544040 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqsnv\" (UniqueName: \"kubernetes.io/projected/7403fa15-7ed1-496e-83fd-fd72a2f75042-kube-api-access-bqsnv\") pod \"node-exporter-vhvw5\" (UID: \"7403fa15-7ed1-496e-83fd-fd72a2f75042\") " pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.613302 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:55.613234 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-vhvw5"
Apr 24 21:31:55.622669 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:31:55.622642 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7403fa15_7ed1_496e_83fd_fd72a2f75042.slice/crio-37ee9f8d3f16ddabf28b0e896a7505ec95da7df490df60b14339a310648f4141 WatchSource:0}: Error finding container 37ee9f8d3f16ddabf28b0e896a7505ec95da7df490df60b14339a310648f4141: Status 404 returned error can't find the container with id 37ee9f8d3f16ddabf28b0e896a7505ec95da7df490df60b14339a310648f4141
Apr 24 21:31:56.114981 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:56.114940 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vhvw5" event={"ID":"7403fa15-7ed1-496e-83fd-fd72a2f75042","Type":"ContainerStarted","Data":"37ee9f8d3f16ddabf28b0e896a7505ec95da7df490df60b14339a310648f4141"}
Apr 24 21:31:57.118486 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.118448 2566 generic.go:358] "Generic (PLEG): container finished" podID="7403fa15-7ed1-496e-83fd-fd72a2f75042" containerID="43189ae7f38a52e4c407d745a6e2a856dd9997c59e31a1f257ff6979637727aa" exitCode=0
Apr 24 21:31:57.118917 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.118496 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vhvw5" event={"ID":"7403fa15-7ed1-496e-83fd-fd72a2f75042","Type":"ContainerDied","Data":"43189ae7f38a52e4c407d745a6e2a856dd9997c59e31a1f257ff6979637727aa"}
Apr 24 21:31:57.389648 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.389575 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-fb4645959-rscd8"]
Apr 24 21:31:57.392992 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.392977 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.395524 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.395498 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 24 21:31:57.395654 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.395540 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 24 21:31:57.395654 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.395515 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 24 21:31:57.395654 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.395590 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-b5u6eb6acrt2j\""
Apr 24 21:31:57.395654 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.395506 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 24 21:31:57.395866 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.395662 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 24 21:31:57.395954 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.395934 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-r2g2d\""
Apr 24 21:31:57.409397 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.409377 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-fb4645959-rscd8"]
Apr 24 21:31:57.552891 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.552854 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.553036 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.552898 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-thanos-querier-tls\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.553036 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.552929 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.553036 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.552979 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czs9t\" (UniqueName: \"kubernetes.io/projected/b62f681b-275b-4e16-b606-abc9771b4539-kube-api-access-czs9t\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.553036 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.553033 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.553168 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.553064 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-grpc-tls\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.553168 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.553125 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.553168 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.553159 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b62f681b-275b-4e16-b606-abc9771b4539-metrics-client-ca\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.654362 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.654284 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-grpc-tls\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") "
pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.654362 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.654330 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.654538 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.654378 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b62f681b-275b-4e16-b606-abc9771b4539-metrics-client-ca\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.654538 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.654398 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.654538 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.654416 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-thanos-querier-tls\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.654538 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.654435 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.654538 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.654453 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czs9t\" (UniqueName: \"kubernetes.io/projected/b62f681b-275b-4e16-b606-abc9771b4539-kube-api-access-czs9t\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.654767 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.654567 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.655411 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.655388 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b62f681b-275b-4e16-b606-abc9771b4539-metrics-client-ca\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.656954 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.656926 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-thanos-querier-tls\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.657105 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.657083 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-grpc-tls\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.657715 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.657689 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.657817 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.657757 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.657817 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.657779 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.657912 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.657889 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b62f681b-275b-4e16-b606-abc9771b4539-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.667062 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.667038 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czs9t\" (UniqueName: \"kubernetes.io/projected/b62f681b-275b-4e16-b606-abc9771b4539-kube-api-access-czs9t\") pod \"thanos-querier-fb4645959-rscd8\" (UID: \"b62f681b-275b-4e16-b606-abc9771b4539\") " pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.701781 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.701755 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:31:57.821924 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:57.821892 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-fb4645959-rscd8"]
Apr 24 21:31:57.825575 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:31:57.825547 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb62f681b_275b_4e16_b606_abc9771b4539.slice/crio-f45e1cd12f7abfdd0ffac36aa71eeb1291347ce8bcce85ee167553dd2d384a33 WatchSource:0}: Error finding container f45e1cd12f7abfdd0ffac36aa71eeb1291347ce8bcce85ee167553dd2d384a33: Status 404 returned error can't find the container with id f45e1cd12f7abfdd0ffac36aa71eeb1291347ce8bcce85ee167553dd2d384a33
Apr 24 21:31:58.121966 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:58.121925 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-monitoring/thanos-querier-fb4645959-rscd8" event={"ID":"b62f681b-275b-4e16-b606-abc9771b4539","Type":"ContainerStarted","Data":"f45e1cd12f7abfdd0ffac36aa71eeb1291347ce8bcce85ee167553dd2d384a33"}
Apr 24 21:31:58.123670 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:58.123647 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vhvw5" event={"ID":"7403fa15-7ed1-496e-83fd-fd72a2f75042","Type":"ContainerStarted","Data":"694a1bc357193f54b52bdb970d94feeeb8077fa805bc8fe5519335a06265532e"}
Apr 24 21:31:58.123775 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:58.123675 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vhvw5" event={"ID":"7403fa15-7ed1-496e-83fd-fd72a2f75042","Type":"ContainerStarted","Data":"5f4c4b05f430d6905043b47a3f0de73c5162a4cc3f1b3a19a014d9fd21329029"}
Apr 24 21:31:58.145005 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:31:58.144959 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-vhvw5" podStartSLOduration=2.348843409 podStartE2EDuration="3.144945227s" podCreationTimestamp="2026-04-24 21:31:55 +0000 UTC" firstStartedPulling="2026-04-24 21:31:55.624979808 +0000 UTC m=+177.586744806" lastFinishedPulling="2026-04-24 21:31:56.421081609 +0000 UTC m=+178.382846624" observedRunningTime="2026-04-24 21:31:58.144752043 +0000 UTC m=+180.106517064" watchObservedRunningTime="2026-04-24 21:31:58.144945227 +0000 UTC m=+180.106710248"
Apr 24 21:32:00.130862 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:00.130830 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-fb4645959-rscd8" event={"ID":"b62f681b-275b-4e16-b606-abc9771b4539","Type":"ContainerStarted","Data":"c614255f4942301d7bc3f766bcb618b30f225f607fc50462750684992b4bbef5"}
Apr 24 21:32:00.130862 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:00.130864 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-fb4645959-rscd8" event={"ID":"b62f681b-275b-4e16-b606-abc9771b4539","Type":"ContainerStarted","Data":"15ef8e1e388695ce54173e39d2e1ff72b9ba124d10f18b5c4b0977c720de7c0e"}
Apr 24 21:32:00.131248 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:00.130874 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-fb4645959-rscd8" event={"ID":"b62f681b-275b-4e16-b606-abc9771b4539","Type":"ContainerStarted","Data":"9a49cc0470f112f8b3c12f158336798a3854d46162adb4c1a4cd1fe3582032b3"}
Apr 24 21:32:01.135782 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:01.135747 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-fb4645959-rscd8" event={"ID":"b62f681b-275b-4e16-b606-abc9771b4539","Type":"ContainerStarted","Data":"1ce92be2d64cc57221bf60dd807f4b61312eff440ad506777621118e0a9a0b1f"}
Apr 24 21:32:01.136154 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:01.135791 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-fb4645959-rscd8" event={"ID":"b62f681b-275b-4e16-b606-abc9771b4539","Type":"ContainerStarted","Data":"2e3981ad9d2e27df0c4a2ea0299ac798ea48ed60d09495c5c1235b4df6b55655"}
Apr 24 21:32:01.136154 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:01.135804 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-fb4645959-rscd8" event={"ID":"b62f681b-275b-4e16-b606-abc9771b4539","Type":"ContainerStarted","Data":"96c85383c09721e5764af94c30fe718b195a5204206b8300ab7771900d2a09fd"}
Apr 24 21:32:01.136154 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:01.135959 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:32:01.164146 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:01.164101 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-fb4645959-rscd8" podStartSLOduration=1.432624406 podStartE2EDuration="4.164087875s" podCreationTimestamp="2026-04-24 21:31:57 +0000 UTC" firstStartedPulling="2026-04-24 21:31:57.827510187 +0000 UTC m=+179.789275185" lastFinishedPulling="2026-04-24 21:32:00.558973646 +0000 UTC m=+182.520738654" observedRunningTime="2026-04-24 21:32:01.162701664 +0000 UTC m=+183.124466681" watchObservedRunningTime="2026-04-24 21:32:01.164087875 +0000 UTC m=+183.125852895"
Apr 24 21:32:07.143713 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:07.143683 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-fb4645959-rscd8"
Apr 24 21:32:07.534725 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:07.534699 2566 scope.go:117] "RemoveContainer" containerID="e464c319d12d85b619dcd6e9d243ded927ffb8063be9258632a3bd1678c96f7d"
Apr 24 21:32:08.155440 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:08.155413 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/2.log"
Apr 24 21:32:08.155911 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:08.155477 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" event={"ID":"5da9727b-1328-4d04-8ab2-cda650280a23","Type":"ContainerStarted","Data":"b4b765866146b8aa2ced0830aa91c4d4d1bb0b89174b9f1bfaa27f294e59ba9d"}
Apr 24 21:32:08.155911 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:08.155750 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
Apr 24 21:32:08.174784 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:08.174742 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z"
podStartSLOduration=52.865874831 podStartE2EDuration="55.174729444s" podCreationTimestamp="2026-04-24 21:31:13 +0000 UTC" firstStartedPulling="2026-04-24 21:31:14.1062024 +0000 UTC m=+136.067967398" lastFinishedPulling="2026-04-24 21:31:16.415057009 +0000 UTC m=+138.376822011" observedRunningTime="2026-04-24 21:32:08.174725603 +0000 UTC m=+190.136490622" watchObservedRunningTime="2026-04-24 21:32:08.174729444 +0000 UTC m=+190.136494488" Apr 24 21:32:08.216074 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:08.216045 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-rwc2z" Apr 24 21:32:09.099521 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:09.099494 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-77758d8cd8-5h489" Apr 24 21:32:47.254489 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:47.254452 2566 generic.go:358] "Generic (PLEG): container finished" podID="993d1bcd-a92a-4fbb-b948-a5b5eeca31e8" containerID="e479648573196c699d3c33e7b45f3510df44cad6ef25339cb0c15b36de34cd88" exitCode=0 Apr 24 21:32:47.254869 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:47.254503 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g" event={"ID":"993d1bcd-a92a-4fbb-b948-a5b5eeca31e8","Type":"ContainerDied","Data":"e479648573196c699d3c33e7b45f3510df44cad6ef25339cb0c15b36de34cd88"} Apr 24 21:32:47.254869 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:47.254801 2566 scope.go:117] "RemoveContainer" containerID="e479648573196c699d3c33e7b45f3510df44cad6ef25339cb0c15b36de34cd88" Apr 24 21:32:48.258586 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:32:48.258549 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jd9g" 
event={"ID":"993d1bcd-a92a-4fbb-b948-a5b5eeca31e8","Type":"ContainerStarted","Data":"abad082a2971c6e1bde2c346f52969ab86158dbd47fc42e8a55feded6d3c3bc0"} Apr 24 21:33:10.287650 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:33:10.287572 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs\") pod \"network-metrics-daemon-q52j5\" (UID: \"186d543e-d0f4-4e11-ac28-e8ebb35c72a2\") " pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:33:10.289797 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:33:10.289775 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/186d543e-d0f4-4e11-ac28-e8ebb35c72a2-metrics-certs\") pod \"network-metrics-daemon-q52j5\" (UID: \"186d543e-d0f4-4e11-ac28-e8ebb35c72a2\") " pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:33:10.338483 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:33:10.338461 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-flqk8\"" Apr 24 21:33:10.347143 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:33:10.347120 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q52j5" Apr 24 21:33:10.467591 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:33:10.467555 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q52j5"] Apr 24 21:33:10.470558 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:33:10.470533 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod186d543e_d0f4_4e11_ac28_e8ebb35c72a2.slice/crio-1cf7f08881814dc7f19fdd1b4f9014d6ba352ac57250117e0a40d27f493738f2 WatchSource:0}: Error finding container 1cf7f08881814dc7f19fdd1b4f9014d6ba352ac57250117e0a40d27f493738f2: Status 404 returned error can't find the container with id 1cf7f08881814dc7f19fdd1b4f9014d6ba352ac57250117e0a40d27f493738f2 Apr 24 21:33:11.318841 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:33:11.318795 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q52j5" event={"ID":"186d543e-d0f4-4e11-ac28-e8ebb35c72a2","Type":"ContainerStarted","Data":"1cf7f08881814dc7f19fdd1b4f9014d6ba352ac57250117e0a40d27f493738f2"} Apr 24 21:33:12.323245 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:33:12.323205 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q52j5" event={"ID":"186d543e-d0f4-4e11-ac28-e8ebb35c72a2","Type":"ContainerStarted","Data":"2cad210872d4559c0e72e86f9fbafbd93ce5ba99fa3bbc26cd0cf5bd2c0b094e"} Apr 24 21:33:12.323245 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:33:12.323242 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q52j5" event={"ID":"186d543e-d0f4-4e11-ac28-e8ebb35c72a2","Type":"ContainerStarted","Data":"a2c8c24db8ddf73cc8f12020a1b0e9f1e0707c5d537476aa94743dbd723d8300"} Apr 24 21:33:12.345120 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:33:12.345065 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-q52j5" podStartSLOduration=253.429057406 podStartE2EDuration="4m14.345049833s" podCreationTimestamp="2026-04-24 21:28:58 +0000 UTC" firstStartedPulling="2026-04-24 21:33:10.472332199 +0000 UTC m=+252.434097198" lastFinishedPulling="2026-04-24 21:33:11.388324617 +0000 UTC m=+253.350089625" observedRunningTime="2026-04-24 21:33:12.343991258 +0000 UTC m=+254.305756278" watchObservedRunningTime="2026-04-24 21:33:12.345049833 +0000 UTC m=+254.306814856" Apr 24 21:33:58.422316 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:33:58.422281 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/2.log" Apr 24 21:33:58.422878 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:33:58.422527 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/2.log" Apr 24 21:33:58.432044 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:33:58.431957 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/0.log" Apr 24 21:33:58.432044 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:33:58.432037 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/0.log" Apr 24 21:35:32.854616 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:32.854587 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gtf2w"] Apr 24 21:35:32.857607 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:32.857582 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtf2w" Apr 24 21:35:32.860249 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:32.860227 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:35:32.864307 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:32.864287 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gtf2w"] Apr 24 21:35:32.967324 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:32.967286 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/44ef511c-205f-488b-a596-eb35096f1fd7-kubelet-config\") pod \"global-pull-secret-syncer-gtf2w\" (UID: \"44ef511c-205f-488b-a596-eb35096f1fd7\") " pod="kube-system/global-pull-secret-syncer-gtf2w" Apr 24 21:35:32.967324 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:32.967331 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/44ef511c-205f-488b-a596-eb35096f1fd7-original-pull-secret\") pod \"global-pull-secret-syncer-gtf2w\" (UID: \"44ef511c-205f-488b-a596-eb35096f1fd7\") " pod="kube-system/global-pull-secret-syncer-gtf2w" Apr 24 21:35:32.967557 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:32.967438 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/44ef511c-205f-488b-a596-eb35096f1fd7-dbus\") pod \"global-pull-secret-syncer-gtf2w\" (UID: \"44ef511c-205f-488b-a596-eb35096f1fd7\") " pod="kube-system/global-pull-secret-syncer-gtf2w" Apr 24 21:35:33.068655 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:33.068614 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/44ef511c-205f-488b-a596-eb35096f1fd7-kubelet-config\") pod \"global-pull-secret-syncer-gtf2w\" (UID: \"44ef511c-205f-488b-a596-eb35096f1fd7\") " pod="kube-system/global-pull-secret-syncer-gtf2w" Apr 24 21:35:33.068809 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:33.068663 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/44ef511c-205f-488b-a596-eb35096f1fd7-original-pull-secret\") pod \"global-pull-secret-syncer-gtf2w\" (UID: \"44ef511c-205f-488b-a596-eb35096f1fd7\") " pod="kube-system/global-pull-secret-syncer-gtf2w" Apr 24 21:35:33.068809 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:33.068719 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/44ef511c-205f-488b-a596-eb35096f1fd7-dbus\") pod \"global-pull-secret-syncer-gtf2w\" (UID: \"44ef511c-205f-488b-a596-eb35096f1fd7\") " pod="kube-system/global-pull-secret-syncer-gtf2w" Apr 24 21:35:33.068809 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:33.068739 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/44ef511c-205f-488b-a596-eb35096f1fd7-kubelet-config\") pod \"global-pull-secret-syncer-gtf2w\" (UID: \"44ef511c-205f-488b-a596-eb35096f1fd7\") " pod="kube-system/global-pull-secret-syncer-gtf2w" Apr 24 21:35:33.068962 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:33.068910 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/44ef511c-205f-488b-a596-eb35096f1fd7-dbus\") pod \"global-pull-secret-syncer-gtf2w\" (UID: \"44ef511c-205f-488b-a596-eb35096f1fd7\") " pod="kube-system/global-pull-secret-syncer-gtf2w" Apr 24 21:35:33.070967 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:33.070949 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/44ef511c-205f-488b-a596-eb35096f1fd7-original-pull-secret\") pod \"global-pull-secret-syncer-gtf2w\" (UID: \"44ef511c-205f-488b-a596-eb35096f1fd7\") " pod="kube-system/global-pull-secret-syncer-gtf2w" Apr 24 21:35:33.166859 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:33.166782 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtf2w" Apr 24 21:35:33.280468 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:33.280439 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gtf2w"] Apr 24 21:35:33.282925 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:35:33.282897 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44ef511c_205f_488b_a596_eb35096f1fd7.slice/crio-71cc5bc4a12de3837fdf6611631af50e9444074ca5d4a55ac926848ff63bc6c2 WatchSource:0}: Error finding container 71cc5bc4a12de3837fdf6611631af50e9444074ca5d4a55ac926848ff63bc6c2: Status 404 returned error can't find the container with id 71cc5bc4a12de3837fdf6611631af50e9444074ca5d4a55ac926848ff63bc6c2 Apr 24 21:35:33.284556 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:33.284536 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:35:33.696968 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:33.696941 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gtf2w" event={"ID":"44ef511c-205f-488b-a596-eb35096f1fd7","Type":"ContainerStarted","Data":"71cc5bc4a12de3837fdf6611631af50e9444074ca5d4a55ac926848ff63bc6c2"} Apr 24 21:35:37.708580 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:37.708549 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gtf2w" 
event={"ID":"44ef511c-205f-488b-a596-eb35096f1fd7","Type":"ContainerStarted","Data":"d2f2f82b367766cf772e7bcc229ff5d54b2578c113057a482b204972f52dbfbc"} Apr 24 21:35:37.725321 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:35:37.725279 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gtf2w" podStartSLOduration=2.050620395 podStartE2EDuration="5.72526633s" podCreationTimestamp="2026-04-24 21:35:32 +0000 UTC" firstStartedPulling="2026-04-24 21:35:33.284662989 +0000 UTC m=+395.246427986" lastFinishedPulling="2026-04-24 21:35:36.959308923 +0000 UTC m=+398.921073921" observedRunningTime="2026-04-24 21:35:37.723654727 +0000 UTC m=+399.685419759" watchObservedRunningTime="2026-04-24 21:35:37.72526633 +0000 UTC m=+399.687031373" Apr 24 21:38:17.660843 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.660813 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-qqbbw"] Apr 24 21:38:17.663851 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.663827 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw" Apr 24 21:38:17.664573 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.664553 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-t67f6"] Apr 24 21:38:17.666531 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.666511 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-fjjbc\"" Apr 24 21:38:17.667466 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.667443 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-t67f6" Apr 24 21:38:17.667571 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.667546 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:38:17.667571 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.667556 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 21:38:17.667675 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.667649 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:38:17.670295 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.670267 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 21:38:17.670421 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.670298 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-dzxw9\"" Apr 24 21:38:17.676723 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.676702 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-qqbbw"] Apr 24 21:38:17.678601 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.678578 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-t67f6"] Apr 24 21:38:17.689971 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.689950 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab67daa-d81e-4186-8df4-327c70f44ca4-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-t67f6\" (UID: \"aab67daa-d81e-4186-8df4-327c70f44ca4\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-t67f6" Apr 24 21:38:17.690076 ip-10-0-135-27 kubenswrapper[2566]: I0424 
21:38:17.689981 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72a145d0-dd4f-4e10-a6d3-75929b3abc57-cert\") pod \"kserve-controller-manager-67f77cd7d7-qqbbw\" (UID: \"72a145d0-dd4f-4e10-a6d3-75929b3abc57\") " pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw" Apr 24 21:38:17.690076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.690002 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26kcq\" (UniqueName: \"kubernetes.io/projected/aab67daa-d81e-4186-8df4-327c70f44ca4-kube-api-access-26kcq\") pod \"llmisvc-controller-manager-68cc5db7c4-t67f6\" (UID: \"aab67daa-d81e-4186-8df4-327c70f44ca4\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-t67f6" Apr 24 21:38:17.690076 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.690030 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5nv6\" (UniqueName: \"kubernetes.io/projected/72a145d0-dd4f-4e10-a6d3-75929b3abc57-kube-api-access-m5nv6\") pod \"kserve-controller-manager-67f77cd7d7-qqbbw\" (UID: \"72a145d0-dd4f-4e10-a6d3-75929b3abc57\") " pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw" Apr 24 21:38:17.790451 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.790416 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5nv6\" (UniqueName: \"kubernetes.io/projected/72a145d0-dd4f-4e10-a6d3-75929b3abc57-kube-api-access-m5nv6\") pod \"kserve-controller-manager-67f77cd7d7-qqbbw\" (UID: \"72a145d0-dd4f-4e10-a6d3-75929b3abc57\") " pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw" Apr 24 21:38:17.790667 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.790482 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab67daa-d81e-4186-8df4-327c70f44ca4-cert\") pod 
\"llmisvc-controller-manager-68cc5db7c4-t67f6\" (UID: \"aab67daa-d81e-4186-8df4-327c70f44ca4\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-t67f6" Apr 24 21:38:17.790667 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.790518 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72a145d0-dd4f-4e10-a6d3-75929b3abc57-cert\") pod \"kserve-controller-manager-67f77cd7d7-qqbbw\" (UID: \"72a145d0-dd4f-4e10-a6d3-75929b3abc57\") " pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw" Apr 24 21:38:17.790667 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.790547 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26kcq\" (UniqueName: \"kubernetes.io/projected/aab67daa-d81e-4186-8df4-327c70f44ca4-kube-api-access-26kcq\") pod \"llmisvc-controller-manager-68cc5db7c4-t67f6\" (UID: \"aab67daa-d81e-4186-8df4-327c70f44ca4\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-t67f6" Apr 24 21:38:17.790845 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:38:17.790668 2566 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 24 21:38:17.790845 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:38:17.790734 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72a145d0-dd4f-4e10-a6d3-75929b3abc57-cert podName:72a145d0-dd4f-4e10-a6d3-75929b3abc57 nodeName:}" failed. No retries permitted until 2026-04-24 21:38:18.290715641 +0000 UTC m=+560.252480647 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/72a145d0-dd4f-4e10-a6d3-75929b3abc57-cert") pod "kserve-controller-manager-67f77cd7d7-qqbbw" (UID: "72a145d0-dd4f-4e10-a6d3-75929b3abc57") : secret "kserve-webhook-server-cert" not found Apr 24 21:38:17.793299 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.793275 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab67daa-d81e-4186-8df4-327c70f44ca4-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-t67f6\" (UID: \"aab67daa-d81e-4186-8df4-327c70f44ca4\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-t67f6" Apr 24 21:38:17.804086 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.804060 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5nv6\" (UniqueName: \"kubernetes.io/projected/72a145d0-dd4f-4e10-a6d3-75929b3abc57-kube-api-access-m5nv6\") pod \"kserve-controller-manager-67f77cd7d7-qqbbw\" (UID: \"72a145d0-dd4f-4e10-a6d3-75929b3abc57\") " pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw" Apr 24 21:38:17.805329 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.805308 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26kcq\" (UniqueName: \"kubernetes.io/projected/aab67daa-d81e-4186-8df4-327c70f44ca4-kube-api-access-26kcq\") pod \"llmisvc-controller-manager-68cc5db7c4-t67f6\" (UID: \"aab67daa-d81e-4186-8df4-327c70f44ca4\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-t67f6" Apr 24 21:38:17.984074 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:17.983990 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-t67f6" Apr 24 21:38:18.111243 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:18.111219 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-t67f6"] Apr 24 21:38:18.113995 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:38:18.113970 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaab67daa_d81e_4186_8df4_327c70f44ca4.slice/crio-89d54e457bf392717b8ed8e84ca585d9bc69d1bdbdd6ba955bbf33d25f7d09e9 WatchSource:0}: Error finding container 89d54e457bf392717b8ed8e84ca585d9bc69d1bdbdd6ba955bbf33d25f7d09e9: Status 404 returned error can't find the container with id 89d54e457bf392717b8ed8e84ca585d9bc69d1bdbdd6ba955bbf33d25f7d09e9 Apr 24 21:38:18.138092 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:18.138064 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-t67f6" event={"ID":"aab67daa-d81e-4186-8df4-327c70f44ca4","Type":"ContainerStarted","Data":"89d54e457bf392717b8ed8e84ca585d9bc69d1bdbdd6ba955bbf33d25f7d09e9"} Apr 24 21:38:18.294161 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:18.294131 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72a145d0-dd4f-4e10-a6d3-75929b3abc57-cert\") pod \"kserve-controller-manager-67f77cd7d7-qqbbw\" (UID: \"72a145d0-dd4f-4e10-a6d3-75929b3abc57\") " pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw" Apr 24 21:38:18.296550 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:18.296531 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72a145d0-dd4f-4e10-a6d3-75929b3abc57-cert\") pod \"kserve-controller-manager-67f77cd7d7-qqbbw\" (UID: \"72a145d0-dd4f-4e10-a6d3-75929b3abc57\") " pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw" Apr 24 21:38:18.577801 ip-10-0-135-27 
kubenswrapper[2566]: I0424 21:38:18.577724 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw" Apr 24 21:38:19.188904 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:19.188642 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-qqbbw"] Apr 24 21:38:19.262416 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:38:19.262381 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72a145d0_dd4f_4e10_a6d3_75929b3abc57.slice/crio-ab6a363be3ef81b4bd69ee3e2eb28beae6606f2e7455117850abc06767fac8a9 WatchSource:0}: Error finding container ab6a363be3ef81b4bd69ee3e2eb28beae6606f2e7455117850abc06767fac8a9: Status 404 returned error can't find the container with id ab6a363be3ef81b4bd69ee3e2eb28beae6606f2e7455117850abc06767fac8a9 Apr 24 21:38:20.146144 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:20.146061 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-t67f6" event={"ID":"aab67daa-d81e-4186-8df4-327c70f44ca4","Type":"ContainerStarted","Data":"1234c55cb61eab6f72aecec3ac343b24e1a92970823f318af94997102738decc"} Apr 24 21:38:20.146298 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:20.146246 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-t67f6" Apr 24 21:38:20.147437 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:20.147400 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw" event={"ID":"72a145d0-dd4f-4e10-a6d3-75929b3abc57","Type":"ContainerStarted","Data":"ab6a363be3ef81b4bd69ee3e2eb28beae6606f2e7455117850abc06767fac8a9"} Apr 24 21:38:20.164842 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:20.164800 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/llmisvc-controller-manager-68cc5db7c4-t67f6" podStartSLOduration=1.44420676 podStartE2EDuration="3.164787332s" podCreationTimestamp="2026-04-24 21:38:17 +0000 UTC" firstStartedPulling="2026-04-24 21:38:18.115217881 +0000 UTC m=+560.076982893" lastFinishedPulling="2026-04-24 21:38:19.835798452 +0000 UTC m=+561.797563465" observedRunningTime="2026-04-24 21:38:20.164410014 +0000 UTC m=+562.126175031" watchObservedRunningTime="2026-04-24 21:38:20.164787332 +0000 UTC m=+562.126552364"
Apr 24 21:38:22.154054 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:22.153971 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw" event={"ID":"72a145d0-dd4f-4e10-a6d3-75929b3abc57","Type":"ContainerStarted","Data":"3ae90f2ba61c432a7fa0e7ed753f31821d2488f80a109a6c46f0d9ce79fbabe2"}
Apr 24 21:38:22.154404 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:22.154083 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw"
Apr 24 21:38:22.174392 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:22.174329 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw" podStartSLOduration=2.536070794 podStartE2EDuration="5.174318512s" podCreationTimestamp="2026-04-24 21:38:17 +0000 UTC" firstStartedPulling="2026-04-24 21:38:19.2639786 +0000 UTC m=+561.225743599" lastFinishedPulling="2026-04-24 21:38:21.902226319 +0000 UTC m=+563.863991317" observedRunningTime="2026-04-24 21:38:22.1726574 +0000 UTC m=+564.134422425" watchObservedRunningTime="2026-04-24 21:38:22.174318512 +0000 UTC m=+564.136083532"
Apr 24 21:38:51.152616 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:51.152585 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-t67f6"
Apr 24 21:38:52.459397 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.459362 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-qqbbw"]
Apr 24 21:38:52.459793 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.459611 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw" podUID="72a145d0-dd4f-4e10-a6d3-75929b3abc57" containerName="manager" containerID="cri-o://3ae90f2ba61c432a7fa0e7ed753f31821d2488f80a109a6c46f0d9ce79fbabe2" gracePeriod=10
Apr 24 21:38:52.464515 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.464487 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw"
Apr 24 21:38:52.487891 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.487870 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-csgss"]
Apr 24 21:38:52.490933 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.490916 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-csgss"
Apr 24 21:38:52.501177 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.501152 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-csgss"]
Apr 24 21:38:52.512169 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.512145 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbd0b7c6-e4c2-4b40-b21e-c57a4e6b73f5-cert\") pod \"kserve-controller-manager-67f77cd7d7-csgss\" (UID: \"dbd0b7c6-e4c2-4b40-b21e-c57a4e6b73f5\") " pod="kserve/kserve-controller-manager-67f77cd7d7-csgss"
Apr 24 21:38:52.512276 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.512197 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgncn\" (UniqueName: \"kubernetes.io/projected/dbd0b7c6-e4c2-4b40-b21e-c57a4e6b73f5-kube-api-access-bgncn\") pod \"kserve-controller-manager-67f77cd7d7-csgss\" (UID: \"dbd0b7c6-e4c2-4b40-b21e-c57a4e6b73f5\") " pod="kserve/kserve-controller-manager-67f77cd7d7-csgss"
Apr 24 21:38:52.612890 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.612860 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbd0b7c6-e4c2-4b40-b21e-c57a4e6b73f5-cert\") pod \"kserve-controller-manager-67f77cd7d7-csgss\" (UID: \"dbd0b7c6-e4c2-4b40-b21e-c57a4e6b73f5\") " pod="kserve/kserve-controller-manager-67f77cd7d7-csgss"
Apr 24 21:38:52.613022 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.612903 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgncn\" (UniqueName: \"kubernetes.io/projected/dbd0b7c6-e4c2-4b40-b21e-c57a4e6b73f5-kube-api-access-bgncn\") pod \"kserve-controller-manager-67f77cd7d7-csgss\" (UID: \"dbd0b7c6-e4c2-4b40-b21e-c57a4e6b73f5\") " pod="kserve/kserve-controller-manager-67f77cd7d7-csgss"
Apr 24 21:38:52.615420 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.615391 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbd0b7c6-e4c2-4b40-b21e-c57a4e6b73f5-cert\") pod \"kserve-controller-manager-67f77cd7d7-csgss\" (UID: \"dbd0b7c6-e4c2-4b40-b21e-c57a4e6b73f5\") " pod="kserve/kserve-controller-manager-67f77cd7d7-csgss"
Apr 24 21:38:52.623865 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.623628 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgncn\" (UniqueName: \"kubernetes.io/projected/dbd0b7c6-e4c2-4b40-b21e-c57a4e6b73f5-kube-api-access-bgncn\") pod \"kserve-controller-manager-67f77cd7d7-csgss\" (UID: \"dbd0b7c6-e4c2-4b40-b21e-c57a4e6b73f5\") " pod="kserve/kserve-controller-manager-67f77cd7d7-csgss"
Apr 24 21:38:52.697540 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.697518 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw"
Apr 24 21:38:52.713723 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.713675 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72a145d0-dd4f-4e10-a6d3-75929b3abc57-cert\") pod \"72a145d0-dd4f-4e10-a6d3-75929b3abc57\" (UID: \"72a145d0-dd4f-4e10-a6d3-75929b3abc57\") "
Apr 24 21:38:52.713825 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.713739 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5nv6\" (UniqueName: \"kubernetes.io/projected/72a145d0-dd4f-4e10-a6d3-75929b3abc57-kube-api-access-m5nv6\") pod \"72a145d0-dd4f-4e10-a6d3-75929b3abc57\" (UID: \"72a145d0-dd4f-4e10-a6d3-75929b3abc57\") "
Apr 24 21:38:52.715971 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.715942 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a145d0-dd4f-4e10-a6d3-75929b3abc57-cert" (OuterVolumeSpecName: "cert") pod "72a145d0-dd4f-4e10-a6d3-75929b3abc57" (UID: "72a145d0-dd4f-4e10-a6d3-75929b3abc57"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:38:52.716075 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.716051 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a145d0-dd4f-4e10-a6d3-75929b3abc57-kube-api-access-m5nv6" (OuterVolumeSpecName: "kube-api-access-m5nv6") pod "72a145d0-dd4f-4e10-a6d3-75929b3abc57" (UID: "72a145d0-dd4f-4e10-a6d3-75929b3abc57"). InnerVolumeSpecName "kube-api-access-m5nv6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:38:52.814952 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.814927 2566 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72a145d0-dd4f-4e10-a6d3-75929b3abc57-cert\") on node \"ip-10-0-135-27.ec2.internal\" DevicePath \"\""
Apr 24 21:38:52.814952 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.814952 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5nv6\" (UniqueName: \"kubernetes.io/projected/72a145d0-dd4f-4e10-a6d3-75929b3abc57-kube-api-access-m5nv6\") on node \"ip-10-0-135-27.ec2.internal\" DevicePath \"\""
Apr 24 21:38:52.846517 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:52.846484 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-csgss"
Apr 24 21:38:53.167571 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:53.167537 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-csgss"]
Apr 24 21:38:53.170400 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:38:53.170374 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbd0b7c6_e4c2_4b40_b21e_c57a4e6b73f5.slice/crio-bda7bce4cab8f0b0c84ae16102fd440b7a6ee1b4c3eabb3d1bcf6c30c4f2268e WatchSource:0}: Error finding container bda7bce4cab8f0b0c84ae16102fd440b7a6ee1b4c3eabb3d1bcf6c30c4f2268e: Status 404 returned error can't find the container with id bda7bce4cab8f0b0c84ae16102fd440b7a6ee1b4c3eabb3d1bcf6c30c4f2268e
Apr 24 21:38:53.236573 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:53.236539 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-csgss" event={"ID":"dbd0b7c6-e4c2-4b40-b21e-c57a4e6b73f5","Type":"ContainerStarted","Data":"bda7bce4cab8f0b0c84ae16102fd440b7a6ee1b4c3eabb3d1bcf6c30c4f2268e"}
Apr 24 21:38:53.237630 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:53.237602 2566 generic.go:358] "Generic (PLEG): container finished" podID="72a145d0-dd4f-4e10-a6d3-75929b3abc57" containerID="3ae90f2ba61c432a7fa0e7ed753f31821d2488f80a109a6c46f0d9ce79fbabe2" exitCode=0
Apr 24 21:38:53.237725 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:53.237639 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw" event={"ID":"72a145d0-dd4f-4e10-a6d3-75929b3abc57","Type":"ContainerDied","Data":"3ae90f2ba61c432a7fa0e7ed753f31821d2488f80a109a6c46f0d9ce79fbabe2"}
Apr 24 21:38:53.237725 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:53.237664 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw" event={"ID":"72a145d0-dd4f-4e10-a6d3-75929b3abc57","Type":"ContainerDied","Data":"ab6a363be3ef81b4bd69ee3e2eb28beae6606f2e7455117850abc06767fac8a9"}
Apr 24 21:38:53.237725 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:53.237676 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-qqbbw"
Apr 24 21:38:53.237838 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:53.237679 2566 scope.go:117] "RemoveContainer" containerID="3ae90f2ba61c432a7fa0e7ed753f31821d2488f80a109a6c46f0d9ce79fbabe2"
Apr 24 21:38:53.245566 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:53.245550 2566 scope.go:117] "RemoveContainer" containerID="3ae90f2ba61c432a7fa0e7ed753f31821d2488f80a109a6c46f0d9ce79fbabe2"
Apr 24 21:38:53.245826 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:38:53.245806 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae90f2ba61c432a7fa0e7ed753f31821d2488f80a109a6c46f0d9ce79fbabe2\": container with ID starting with 3ae90f2ba61c432a7fa0e7ed753f31821d2488f80a109a6c46f0d9ce79fbabe2 not found: ID does not exist" containerID="3ae90f2ba61c432a7fa0e7ed753f31821d2488f80a109a6c46f0d9ce79fbabe2"
Apr 24 21:38:53.245870 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:53.245833 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae90f2ba61c432a7fa0e7ed753f31821d2488f80a109a6c46f0d9ce79fbabe2"} err="failed to get container status \"3ae90f2ba61c432a7fa0e7ed753f31821d2488f80a109a6c46f0d9ce79fbabe2\": rpc error: code = NotFound desc = could not find container \"3ae90f2ba61c432a7fa0e7ed753f31821d2488f80a109a6c46f0d9ce79fbabe2\": container with ID starting with 3ae90f2ba61c432a7fa0e7ed753f31821d2488f80a109a6c46f0d9ce79fbabe2 not found: ID does not exist"
Apr 24 21:38:53.259608 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:53.259577 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-qqbbw"]
Apr 24 21:38:53.264537 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:53.264513 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-qqbbw"]
Apr 24 21:38:54.241914 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:54.241880 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-csgss" event={"ID":"dbd0b7c6-e4c2-4b40-b21e-c57a4e6b73f5","Type":"ContainerStarted","Data":"2305ce4bbab84fb702fdf8bf7d3057c1b44a70f3a55178ab22843fdca495f4c6"}
Apr 24 21:38:54.242280 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:54.241941 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-67f77cd7d7-csgss"
Apr 24 21:38:54.261632 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:54.261579 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-67f77cd7d7-csgss" podStartSLOduration=1.840946673 podStartE2EDuration="2.261562529s" podCreationTimestamp="2026-04-24 21:38:52 +0000 UTC" firstStartedPulling="2026-04-24 21:38:53.171600879 +0000 UTC m=+595.133365877" lastFinishedPulling="2026-04-24 21:38:53.592216735 +0000 UTC m=+595.553981733" observedRunningTime="2026-04-24 21:38:54.260293289 +0000 UTC m=+596.222058308" watchObservedRunningTime="2026-04-24 21:38:54.261562529 +0000 UTC m=+596.223327550"
Apr 24 21:38:54.538540 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:54.538508 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a145d0-dd4f-4e10-a6d3-75929b3abc57" path="/var/lib/kubelet/pods/72a145d0-dd4f-4e10-a6d3-75929b3abc57/volumes"
Apr 24 21:38:58.450428 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:58.450406 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/2.log"
Apr 24 21:38:58.451071 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:58.451055 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/2.log"
Apr 24 21:38:58.456411 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:58.456390 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/0.log"
Apr 24 21:38:58.456778 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:38:58.456760 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/0.log"
Apr 24 21:39:25.248903 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:25.248874 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-67f77cd7d7-csgss"
Apr 24 21:39:26.153197 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:26.153160 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-krnwt"]
Apr 24 21:39:26.153664 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:26.153645 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72a145d0-dd4f-4e10-a6d3-75929b3abc57" containerName="manager"
Apr 24 21:39:26.153664 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:26.153665 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a145d0-dd4f-4e10-a6d3-75929b3abc57" containerName="manager"
Apr 24 21:39:26.153803 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:26.153749 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="72a145d0-dd4f-4e10-a6d3-75929b3abc57" containerName="manager"
Apr 24 21:39:26.157033 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:26.157013 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-krnwt"
Apr 24 21:39:26.160743 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:26.160715 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 24 21:39:26.161174 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:26.161156 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-rz5gw\""
Apr 24 21:39:26.169994 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:26.169973 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-krnwt"]
Apr 24 21:39:26.251499 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:26.251473 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d87e41ba-2228-4bdc-b1e8-13a4b8445ed2-tls-certs\") pod \"model-serving-api-86f7b4b499-krnwt\" (UID: \"d87e41ba-2228-4bdc-b1e8-13a4b8445ed2\") " pod="kserve/model-serving-api-86f7b4b499-krnwt"
Apr 24 21:39:26.251828 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:26.251520 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lqjf\" (UniqueName: \"kubernetes.io/projected/d87e41ba-2228-4bdc-b1e8-13a4b8445ed2-kube-api-access-4lqjf\") pod \"model-serving-api-86f7b4b499-krnwt\" (UID: \"d87e41ba-2228-4bdc-b1e8-13a4b8445ed2\") " pod="kserve/model-serving-api-86f7b4b499-krnwt"
Apr 24 21:39:26.352739 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:26.352707 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lqjf\" (UniqueName: \"kubernetes.io/projected/d87e41ba-2228-4bdc-b1e8-13a4b8445ed2-kube-api-access-4lqjf\") pod \"model-serving-api-86f7b4b499-krnwt\" (UID: \"d87e41ba-2228-4bdc-b1e8-13a4b8445ed2\") " pod="kserve/model-serving-api-86f7b4b499-krnwt"
Apr 24 21:39:26.352886 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:26.352760 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d87e41ba-2228-4bdc-b1e8-13a4b8445ed2-tls-certs\") pod \"model-serving-api-86f7b4b499-krnwt\" (UID: \"d87e41ba-2228-4bdc-b1e8-13a4b8445ed2\") " pod="kserve/model-serving-api-86f7b4b499-krnwt"
Apr 24 21:39:26.352886 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:39:26.352855 2566 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found
Apr 24 21:39:26.352959 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:39:26.352913 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d87e41ba-2228-4bdc-b1e8-13a4b8445ed2-tls-certs podName:d87e41ba-2228-4bdc-b1e8-13a4b8445ed2 nodeName:}" failed. No retries permitted until 2026-04-24 21:39:26.852898478 +0000 UTC m=+628.814663477 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/d87e41ba-2228-4bdc-b1e8-13a4b8445ed2-tls-certs") pod "model-serving-api-86f7b4b499-krnwt" (UID: "d87e41ba-2228-4bdc-b1e8-13a4b8445ed2") : secret "model-serving-api-tls" not found
Apr 24 21:39:26.362462 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:26.362432 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lqjf\" (UniqueName: \"kubernetes.io/projected/d87e41ba-2228-4bdc-b1e8-13a4b8445ed2-kube-api-access-4lqjf\") pod \"model-serving-api-86f7b4b499-krnwt\" (UID: \"d87e41ba-2228-4bdc-b1e8-13a4b8445ed2\") " pod="kserve/model-serving-api-86f7b4b499-krnwt"
Apr 24 21:39:26.855797 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:26.855756 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d87e41ba-2228-4bdc-b1e8-13a4b8445ed2-tls-certs\") pod \"model-serving-api-86f7b4b499-krnwt\" (UID: \"d87e41ba-2228-4bdc-b1e8-13a4b8445ed2\") " pod="kserve/model-serving-api-86f7b4b499-krnwt"
Apr 24 21:39:26.857931 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:26.857912 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d87e41ba-2228-4bdc-b1e8-13a4b8445ed2-tls-certs\") pod \"model-serving-api-86f7b4b499-krnwt\" (UID: \"d87e41ba-2228-4bdc-b1e8-13a4b8445ed2\") " pod="kserve/model-serving-api-86f7b4b499-krnwt"
Apr 24 21:39:27.069709 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:27.069669 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-krnwt"
Apr 24 21:39:27.189701 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:27.189672 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-krnwt"]
Apr 24 21:39:27.192582 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:39:27.192559 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd87e41ba_2228_4bdc_b1e8_13a4b8445ed2.slice/crio-930fb0ce2aa2aa50a88472a7f588cec9a66f98e254e5ec88e8cd52a45cac659a WatchSource:0}: Error finding container 930fb0ce2aa2aa50a88472a7f588cec9a66f98e254e5ec88e8cd52a45cac659a: Status 404 returned error can't find the container with id 930fb0ce2aa2aa50a88472a7f588cec9a66f98e254e5ec88e8cd52a45cac659a
Apr 24 21:39:27.327530 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:27.327502 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-krnwt" event={"ID":"d87e41ba-2228-4bdc-b1e8-13a4b8445ed2","Type":"ContainerStarted","Data":"930fb0ce2aa2aa50a88472a7f588cec9a66f98e254e5ec88e8cd52a45cac659a"}
Apr 24 21:39:29.334396 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:29.334343 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-krnwt" event={"ID":"d87e41ba-2228-4bdc-b1e8-13a4b8445ed2","Type":"ContainerStarted","Data":"4c2bbbc366e98d8648b0a31e5d0386446c6360330e2b613102c571b7d398c92d"}
Apr 24 21:39:29.334810 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:29.334476 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-krnwt"
Apr 24 21:39:29.357850 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:29.357808 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-krnwt" podStartSLOduration=2.2304073620000002 podStartE2EDuration="3.357795631s" podCreationTimestamp="2026-04-24 21:39:26 +0000 UTC" firstStartedPulling="2026-04-24 21:39:27.194340548 +0000 UTC m=+629.156105549" lastFinishedPulling="2026-04-24 21:39:28.32172882 +0000 UTC m=+630.283493818" observedRunningTime="2026-04-24 21:39:29.356293334 +0000 UTC m=+631.318058355" watchObservedRunningTime="2026-04-24 21:39:29.357795631 +0000 UTC m=+631.319560650"
Apr 24 21:39:40.340973 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:39:40.340944 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-krnwt"
Apr 24 21:43:16.104633 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:16.104602 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"]
Apr 24 21:43:16.107691 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:16.107675 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"
Apr 24 21:43:16.110025 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:16.110003 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-62689-serving-cert\""
Apr 24 21:43:16.110204 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:16.110186 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-mq7qd\""
Apr 24 21:43:16.110369 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:16.110329 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-62689-kube-rbac-proxy-sar-config\""
Apr 24 21:43:16.110985 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:16.110968 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 21:43:16.116604 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:16.116582 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"]
Apr 24 21:43:16.262089 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:16.262058 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82ed1038-5d08-47ce-a138-1034c8384dec-openshift-service-ca-bundle\") pod \"model-chainer-raw-62689-7d88bd74c9-ql9vk\" (UID: \"82ed1038-5d08-47ce-a138-1034c8384dec\") " pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"
Apr 24 21:43:16.262218 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:16.262109 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls\") pod \"model-chainer-raw-62689-7d88bd74c9-ql9vk\" (UID: \"82ed1038-5d08-47ce-a138-1034c8384dec\") " pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"
Apr 24 21:43:16.363270 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:16.363180 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82ed1038-5d08-47ce-a138-1034c8384dec-openshift-service-ca-bundle\") pod \"model-chainer-raw-62689-7d88bd74c9-ql9vk\" (UID: \"82ed1038-5d08-47ce-a138-1034c8384dec\") " pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"
Apr 24 21:43:16.363270 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:16.363245 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls\") pod \"model-chainer-raw-62689-7d88bd74c9-ql9vk\" (UID: \"82ed1038-5d08-47ce-a138-1034c8384dec\") " pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"
Apr 24 21:43:16.363473 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:43:16.363338 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-62689-serving-cert: secret "model-chainer-raw-62689-serving-cert" not found
Apr 24 21:43:16.363473 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:43:16.363439 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls podName:82ed1038-5d08-47ce-a138-1034c8384dec nodeName:}" failed. No retries permitted until 2026-04-24 21:43:16.863415355 +0000 UTC m=+858.825180355 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls") pod "model-chainer-raw-62689-7d88bd74c9-ql9vk" (UID: "82ed1038-5d08-47ce-a138-1034c8384dec") : secret "model-chainer-raw-62689-serving-cert" not found
Apr 24 21:43:16.363908 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:16.363891 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82ed1038-5d08-47ce-a138-1034c8384dec-openshift-service-ca-bundle\") pod \"model-chainer-raw-62689-7d88bd74c9-ql9vk\" (UID: \"82ed1038-5d08-47ce-a138-1034c8384dec\") " pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"
Apr 24 21:43:16.867492 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:16.867461 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls\") pod \"model-chainer-raw-62689-7d88bd74c9-ql9vk\" (UID: \"82ed1038-5d08-47ce-a138-1034c8384dec\") " pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"
Apr 24 21:43:16.869720 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:16.869692 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls\") pod \"model-chainer-raw-62689-7d88bd74c9-ql9vk\" (UID: \"82ed1038-5d08-47ce-a138-1034c8384dec\") " pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"
Apr 24 21:43:17.019551 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:17.019516 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"
Apr 24 21:43:17.141780 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:17.141754 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"]
Apr 24 21:43:17.144365 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:43:17.144325 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82ed1038_5d08_47ce_a138_1034c8384dec.slice/crio-e1d0e2dab50da1526435d89d8d2b15dfb190a0ef6bcc755fdb1432702219b68e WatchSource:0}: Error finding container e1d0e2dab50da1526435d89d8d2b15dfb190a0ef6bcc755fdb1432702219b68e: Status 404 returned error can't find the container with id e1d0e2dab50da1526435d89d8d2b15dfb190a0ef6bcc755fdb1432702219b68e
Apr 24 21:43:17.147919 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:17.147901 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:43:17.959732 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:17.959695 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk" event={"ID":"82ed1038-5d08-47ce-a138-1034c8384dec","Type":"ContainerStarted","Data":"e1d0e2dab50da1526435d89d8d2b15dfb190a0ef6bcc755fdb1432702219b68e"}
Apr 24 21:43:19.967066 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:19.967031 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk" event={"ID":"82ed1038-5d08-47ce-a138-1034c8384dec","Type":"ContainerStarted","Data":"1cec759e4b4414956c44a4510c9065e6a3bb4c3efd0e9a379949e124947d57f2"}
Apr 24 21:43:19.967495 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:19.967086 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"
Apr 24 21:43:19.987848 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:19.987804 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk" podStartSLOduration=1.893631007 podStartE2EDuration="3.987790736s" podCreationTimestamp="2026-04-24 21:43:16 +0000 UTC" firstStartedPulling="2026-04-24 21:43:17.148038681 +0000 UTC m=+859.109803678" lastFinishedPulling="2026-04-24 21:43:19.242198189 +0000 UTC m=+861.203963407" observedRunningTime="2026-04-24 21:43:19.986744902 +0000 UTC m=+861.948509923" watchObservedRunningTime="2026-04-24 21:43:19.987790736 +0000 UTC m=+861.949555786"
Apr 24 21:43:25.975544 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:25.975513 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"
Apr 24 21:43:26.137389 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:43:26.137343 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-62689-serving-cert: secret "model-chainer-raw-62689-serving-cert" not found
Apr 24 21:43:26.137544 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:43:26.137422 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls podName:82ed1038-5d08-47ce-a138-1034c8384dec nodeName:}" failed. No retries permitted until 2026-04-24 21:43:26.637405583 +0000 UTC m=+868.599170582 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls") pod "model-chainer-raw-62689-7d88bd74c9-ql9vk" (UID: "82ed1038-5d08-47ce-a138-1034c8384dec") : secret "model-chainer-raw-62689-serving-cert" not found
Apr 24 21:43:26.141406 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:26.141379 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"]
Apr 24 21:43:26.141656 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:26.141631 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk" podUID="82ed1038-5d08-47ce-a138-1034c8384dec" containerName="model-chainer-raw-62689" containerID="cri-o://1cec759e4b4414956c44a4510c9065e6a3bb4c3efd0e9a379949e124947d57f2" gracePeriod=30
Apr 24 21:43:26.642710 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:43:26.642672 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-62689-serving-cert: secret "model-chainer-raw-62689-serving-cert" not found
Apr 24 21:43:26.642893 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:43:26.642754 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls podName:82ed1038-5d08-47ce-a138-1034c8384dec nodeName:}" failed. No retries permitted until 2026-04-24 21:43:27.642736199 +0000 UTC m=+869.604501213 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls") pod "model-chainer-raw-62689-7d88bd74c9-ql9vk" (UID: "82ed1038-5d08-47ce-a138-1034c8384dec") : secret "model-chainer-raw-62689-serving-cert" not found
Apr 24 21:43:27.649641 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:43:27.649609 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-62689-serving-cert: secret "model-chainer-raw-62689-serving-cert" not found
Apr 24 21:43:27.649993 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:43:27.649675 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls podName:82ed1038-5d08-47ce-a138-1034c8384dec nodeName:}" failed. No retries permitted until 2026-04-24 21:43:29.649661439 +0000 UTC m=+871.611426442 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls") pod "model-chainer-raw-62689-7d88bd74c9-ql9vk" (UID: "82ed1038-5d08-47ce-a138-1034c8384dec") : secret "model-chainer-raw-62689-serving-cert" not found
Apr 24 21:43:29.665286 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:43:29.665248 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-62689-serving-cert: secret "model-chainer-raw-62689-serving-cert" not found
Apr 24 21:43:29.665660 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:43:29.665326 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls podName:82ed1038-5d08-47ce-a138-1034c8384dec nodeName:}" failed. No retries permitted until 2026-04-24 21:43:33.665309534 +0000 UTC m=+875.627074533 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls") pod "model-chainer-raw-62689-7d88bd74c9-ql9vk" (UID: "82ed1038-5d08-47ce-a138-1034c8384dec") : secret "model-chainer-raw-62689-serving-cert" not found
Apr 24 21:43:30.974346 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:30.974286 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk" podUID="82ed1038-5d08-47ce-a138-1034c8384dec" containerName="model-chainer-raw-62689" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:43:33.694716 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:43:33.694686 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-62689-serving-cert: secret "model-chainer-raw-62689-serving-cert" not found
Apr 24 21:43:33.695098 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:43:33.694768 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls podName:82ed1038-5d08-47ce-a138-1034c8384dec nodeName:}" failed. No retries permitted until 2026-04-24 21:43:41.694750441 +0000 UTC m=+883.656515438 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls") pod "model-chainer-raw-62689-7d88bd74c9-ql9vk" (UID: "82ed1038-5d08-47ce-a138-1034c8384dec") : secret "model-chainer-raw-62689-serving-cert" not found
Apr 24 21:43:35.974055 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:35.974011 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk" podUID="82ed1038-5d08-47ce-a138-1034c8384dec" containerName="model-chainer-raw-62689" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:43:40.974028 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:40.973941 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk" podUID="82ed1038-5d08-47ce-a138-1034c8384dec" containerName="model-chainer-raw-62689" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:43:40.974466 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:40.974068 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"
Apr 24 21:43:41.753564 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:43:41.753533 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-62689-serving-cert: secret "model-chainer-raw-62689-serving-cert" not found
Apr 24 21:43:41.753718 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:43:41.753601 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls podName:82ed1038-5d08-47ce-a138-1034c8384dec nodeName:}" failed. No retries permitted until 2026-04-24 21:43:57.753584085 +0000 UTC m=+899.715349299 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls") pod "model-chainer-raw-62689-7d88bd74c9-ql9vk" (UID: "82ed1038-5d08-47ce-a138-1034c8384dec") : secret "model-chainer-raw-62689-serving-cert" not found Apr 24 21:43:45.974219 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:45.974179 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk" podUID="82ed1038-5d08-47ce-a138-1034c8384dec" containerName="model-chainer-raw-62689" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:43:50.974565 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:50.974526 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk" podUID="82ed1038-5d08-47ce-a138-1034c8384dec" containerName="model-chainer-raw-62689" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:43:55.973933 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:55.973890 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk" podUID="82ed1038-5d08-47ce-a138-1034c8384dec" containerName="model-chainer-raw-62689" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:43:56.280756 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:56.280733 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk" Apr 24 21:43:56.358529 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:56.358493 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82ed1038-5d08-47ce-a138-1034c8384dec-openshift-service-ca-bundle\") pod \"82ed1038-5d08-47ce-a138-1034c8384dec\" (UID: \"82ed1038-5d08-47ce-a138-1034c8384dec\") " Apr 24 21:43:56.358666 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:56.358570 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls\") pod \"82ed1038-5d08-47ce-a138-1034c8384dec\" (UID: \"82ed1038-5d08-47ce-a138-1034c8384dec\") " Apr 24 21:43:56.358845 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:56.358820 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82ed1038-5d08-47ce-a138-1034c8384dec-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "82ed1038-5d08-47ce-a138-1034c8384dec" (UID: "82ed1038-5d08-47ce-a138-1034c8384dec"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:43:56.360614 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:56.360588 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "82ed1038-5d08-47ce-a138-1034c8384dec" (UID: "82ed1038-5d08-47ce-a138-1034c8384dec"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:43:56.459224 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:56.459199 2566 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82ed1038-5d08-47ce-a138-1034c8384dec-openshift-service-ca-bundle\") on node \"ip-10-0-135-27.ec2.internal\" DevicePath \"\"" Apr 24 21:43:56.459224 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:56.459221 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82ed1038-5d08-47ce-a138-1034c8384dec-proxy-tls\") on node \"ip-10-0-135-27.ec2.internal\" DevicePath \"\"" Apr 24 21:43:57.069244 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:57.069202 2566 generic.go:358] "Generic (PLEG): container finished" podID="82ed1038-5d08-47ce-a138-1034c8384dec" containerID="1cec759e4b4414956c44a4510c9065e6a3bb4c3efd0e9a379949e124947d57f2" exitCode=0 Apr 24 21:43:57.069620 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:57.069283 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk" event={"ID":"82ed1038-5d08-47ce-a138-1034c8384dec","Type":"ContainerDied","Data":"1cec759e4b4414956c44a4510c9065e6a3bb4c3efd0e9a379949e124947d57f2"} Apr 24 21:43:57.069620 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:57.069317 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk" event={"ID":"82ed1038-5d08-47ce-a138-1034c8384dec","Type":"ContainerDied","Data":"e1d0e2dab50da1526435d89d8d2b15dfb190a0ef6bcc755fdb1432702219b68e"} Apr 24 21:43:57.069620 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:57.069332 2566 scope.go:117] "RemoveContainer" containerID="1cec759e4b4414956c44a4510c9065e6a3bb4c3efd0e9a379949e124947d57f2" Apr 24 21:43:57.069620 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:57.069292 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk" Apr 24 21:43:57.077201 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:57.077185 2566 scope.go:117] "RemoveContainer" containerID="1cec759e4b4414956c44a4510c9065e6a3bb4c3efd0e9a379949e124947d57f2" Apr 24 21:43:57.077468 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:43:57.077438 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cec759e4b4414956c44a4510c9065e6a3bb4c3efd0e9a379949e124947d57f2\": container with ID starting with 1cec759e4b4414956c44a4510c9065e6a3bb4c3efd0e9a379949e124947d57f2 not found: ID does not exist" containerID="1cec759e4b4414956c44a4510c9065e6a3bb4c3efd0e9a379949e124947d57f2" Apr 24 21:43:57.077528 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:57.077478 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cec759e4b4414956c44a4510c9065e6a3bb4c3efd0e9a379949e124947d57f2"} err="failed to get container status \"1cec759e4b4414956c44a4510c9065e6a3bb4c3efd0e9a379949e124947d57f2\": rpc error: code = NotFound desc = could not find container \"1cec759e4b4414956c44a4510c9065e6a3bb4c3efd0e9a379949e124947d57f2\": container with ID starting with 1cec759e4b4414956c44a4510c9065e6a3bb4c3efd0e9a379949e124947d57f2 not found: ID does not exist" Apr 24 21:43:57.089041 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:57.089019 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"] Apr 24 21:43:57.095111 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:57.095091 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-62689-7d88bd74c9-ql9vk"] Apr 24 21:43:58.469662 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:58.469635 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/2.log" Apr 24 21:43:58.470711 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:58.470684 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/2.log" Apr 24 21:43:58.476514 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:58.476493 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/0.log" Apr 24 21:43:58.477203 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:58.477188 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/0.log" Apr 24 21:43:58.539753 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:43:58.539729 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82ed1038-5d08-47ce-a138-1034c8384dec" path="/var/lib/kubelet/pods/82ed1038-5d08-47ce-a138-1034c8384dec/volumes" Apr 24 21:44:56.390825 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:56.385178 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt"] Apr 24 21:44:56.390825 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:56.385914 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82ed1038-5d08-47ce-a138-1034c8384dec" containerName="model-chainer-raw-62689" Apr 24 21:44:56.390825 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:56.385936 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="82ed1038-5d08-47ce-a138-1034c8384dec" containerName="model-chainer-raw-62689" Apr 24 21:44:56.390825 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:56.386111 2566 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="82ed1038-5d08-47ce-a138-1034c8384dec" containerName="model-chainer-raw-62689" Apr 24 21:44:56.390825 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:56.389589 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" Apr 24 21:44:56.392756 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:56.392733 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-mq7qd\"" Apr 24 21:44:56.393147 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:56.393131 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:44:56.393669 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:56.393652 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-2b4f4-serving-cert\"" Apr 24 21:44:56.393760 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:56.393693 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\"" Apr 24 21:44:56.399325 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:56.399304 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt"] Apr 24 21:44:56.495360 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:56.495328 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de08f8b7-f188-4003-9a25-93f7161c3fc7-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt\" (UID: \"de08f8b7-f188-4003-9a25-93f7161c3fc7\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" Apr 24 21:44:56.495500 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:56.495385 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de08f8b7-f188-4003-9a25-93f7161c3fc7-proxy-tls\") pod \"model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt\" (UID: \"de08f8b7-f188-4003-9a25-93f7161c3fc7\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" Apr 24 21:44:56.595929 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:56.595896 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de08f8b7-f188-4003-9a25-93f7161c3fc7-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt\" (UID: \"de08f8b7-f188-4003-9a25-93f7161c3fc7\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" Apr 24 21:44:56.596079 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:56.595937 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de08f8b7-f188-4003-9a25-93f7161c3fc7-proxy-tls\") pod \"model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt\" (UID: \"de08f8b7-f188-4003-9a25-93f7161c3fc7\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" Apr 24 21:44:56.596079 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:44:56.596060 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-serving-cert: secret "model-chainer-raw-hpa-2b4f4-serving-cert" not found Apr 24 21:44:56.596195 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:44:56.596142 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de08f8b7-f188-4003-9a25-93f7161c3fc7-proxy-tls podName:de08f8b7-f188-4003-9a25-93f7161c3fc7 nodeName:}" failed. No retries permitted until 2026-04-24 21:44:57.096121542 +0000 UTC m=+959.057886546 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/de08f8b7-f188-4003-9a25-93f7161c3fc7-proxy-tls") pod "model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" (UID: "de08f8b7-f188-4003-9a25-93f7161c3fc7") : secret "model-chainer-raw-hpa-2b4f4-serving-cert" not found Apr 24 21:44:56.596567 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:56.596548 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de08f8b7-f188-4003-9a25-93f7161c3fc7-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt\" (UID: \"de08f8b7-f188-4003-9a25-93f7161c3fc7\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" Apr 24 21:44:57.100384 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:57.100333 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de08f8b7-f188-4003-9a25-93f7161c3fc7-proxy-tls\") pod \"model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt\" (UID: \"de08f8b7-f188-4003-9a25-93f7161c3fc7\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" Apr 24 21:44:57.102749 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:57.102726 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de08f8b7-f188-4003-9a25-93f7161c3fc7-proxy-tls\") pod \"model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt\" (UID: \"de08f8b7-f188-4003-9a25-93f7161c3fc7\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" Apr 24 21:44:57.301857 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:57.301826 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" Apr 24 21:44:57.418722 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:57.418695 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt"] Apr 24 21:44:57.421133 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:44:57.421095 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde08f8b7_f188_4003_9a25_93f7161c3fc7.slice/crio-2f1a66732a464ccfd2a2be70110ab62fdda34487925877387c050b8fb7e6517b WatchSource:0}: Error finding container 2f1a66732a464ccfd2a2be70110ab62fdda34487925877387c050b8fb7e6517b: Status 404 returned error can't find the container with id 2f1a66732a464ccfd2a2be70110ab62fdda34487925877387c050b8fb7e6517b Apr 24 21:44:58.237053 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:58.237017 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" event={"ID":"de08f8b7-f188-4003-9a25-93f7161c3fc7","Type":"ContainerStarted","Data":"8a70305191f110de2aa174c31da4215da3e3e74827145e4ebe64f9b5083fe557"} Apr 24 21:44:58.237053 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:58.237057 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" event={"ID":"de08f8b7-f188-4003-9a25-93f7161c3fc7","Type":"ContainerStarted","Data":"2f1a66732a464ccfd2a2be70110ab62fdda34487925877387c050b8fb7e6517b"} Apr 24 21:44:58.237248 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:44:58.237142 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" Apr 24 21:45:04.246321 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:04.246292 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" Apr 24 21:45:04.264761 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:04.264720 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" podStartSLOduration=8.264709123 podStartE2EDuration="8.264709123s" podCreationTimestamp="2026-04-24 21:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:44:58.256796501 +0000 UTC m=+960.218561532" watchObservedRunningTime="2026-04-24 21:45:04.264709123 +0000 UTC m=+966.226474143" Apr 24 21:45:06.440968 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:06.440935 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt"] Apr 24 21:45:06.441427 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:06.441161 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" podUID="de08f8b7-f188-4003-9a25-93f7161c3fc7" containerName="model-chainer-raw-hpa-2b4f4" containerID="cri-o://8a70305191f110de2aa174c31da4215da3e3e74827145e4ebe64f9b5083fe557" gracePeriod=30 Apr 24 21:45:09.244400 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:09.244297 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" podUID="de08f8b7-f188-4003-9a25-93f7161c3fc7" containerName="model-chainer-raw-hpa-2b4f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:14.244627 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:14.244584 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" podUID="de08f8b7-f188-4003-9a25-93f7161c3fc7" containerName="model-chainer-raw-hpa-2b4f4" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 24 21:45:19.244187 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:19.244146 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" podUID="de08f8b7-f188-4003-9a25-93f7161c3fc7" containerName="model-chainer-raw-hpa-2b4f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:19.244625 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:19.244267 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" Apr 24 21:45:24.243917 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:24.243876 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" podUID="de08f8b7-f188-4003-9a25-93f7161c3fc7" containerName="model-chainer-raw-hpa-2b4f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:29.244792 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:29.244754 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" podUID="de08f8b7-f188-4003-9a25-93f7161c3fc7" containerName="model-chainer-raw-hpa-2b4f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:34.244727 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:34.244690 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" podUID="de08f8b7-f188-4003-9a25-93f7161c3fc7" containerName="model-chainer-raw-hpa-2b4f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:36.627090 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:36.627068 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" Apr 24 21:45:36.683523 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:36.683498 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de08f8b7-f188-4003-9a25-93f7161c3fc7-openshift-service-ca-bundle\") pod \"de08f8b7-f188-4003-9a25-93f7161c3fc7\" (UID: \"de08f8b7-f188-4003-9a25-93f7161c3fc7\") " Apr 24 21:45:36.683650 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:36.683582 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de08f8b7-f188-4003-9a25-93f7161c3fc7-proxy-tls\") pod \"de08f8b7-f188-4003-9a25-93f7161c3fc7\" (UID: \"de08f8b7-f188-4003-9a25-93f7161c3fc7\") " Apr 24 21:45:36.683825 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:36.683802 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de08f8b7-f188-4003-9a25-93f7161c3fc7-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "de08f8b7-f188-4003-9a25-93f7161c3fc7" (UID: "de08f8b7-f188-4003-9a25-93f7161c3fc7"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:45:36.685607 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:36.685580 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de08f8b7-f188-4003-9a25-93f7161c3fc7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "de08f8b7-f188-4003-9a25-93f7161c3fc7" (UID: "de08f8b7-f188-4003-9a25-93f7161c3fc7"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:45:36.784101 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:36.784066 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de08f8b7-f188-4003-9a25-93f7161c3fc7-proxy-tls\") on node \"ip-10-0-135-27.ec2.internal\" DevicePath \"\"" Apr 24 21:45:36.784101 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:36.784099 2566 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de08f8b7-f188-4003-9a25-93f7161c3fc7-openshift-service-ca-bundle\") on node \"ip-10-0-135-27.ec2.internal\" DevicePath \"\"" Apr 24 21:45:37.345244 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:37.345210 2566 generic.go:358] "Generic (PLEG): container finished" podID="de08f8b7-f188-4003-9a25-93f7161c3fc7" containerID="8a70305191f110de2aa174c31da4215da3e3e74827145e4ebe64f9b5083fe557" exitCode=137 Apr 24 21:45:37.345464 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:37.345265 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" Apr 24 21:45:37.345464 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:37.345277 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" event={"ID":"de08f8b7-f188-4003-9a25-93f7161c3fc7","Type":"ContainerDied","Data":"8a70305191f110de2aa174c31da4215da3e3e74827145e4ebe64f9b5083fe557"} Apr 24 21:45:37.345464 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:37.345312 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt" event={"ID":"de08f8b7-f188-4003-9a25-93f7161c3fc7","Type":"ContainerDied","Data":"2f1a66732a464ccfd2a2be70110ab62fdda34487925877387c050b8fb7e6517b"} Apr 24 21:45:37.345464 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:37.345331 2566 scope.go:117] "RemoveContainer" containerID="8a70305191f110de2aa174c31da4215da3e3e74827145e4ebe64f9b5083fe557" Apr 24 21:45:37.353179 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:37.353158 2566 scope.go:117] "RemoveContainer" containerID="8a70305191f110de2aa174c31da4215da3e3e74827145e4ebe64f9b5083fe557" Apr 24 21:45:37.353453 ip-10-0-135-27 kubenswrapper[2566]: E0424 21:45:37.353424 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a70305191f110de2aa174c31da4215da3e3e74827145e4ebe64f9b5083fe557\": container with ID starting with 8a70305191f110de2aa174c31da4215da3e3e74827145e4ebe64f9b5083fe557 not found: ID does not exist" containerID="8a70305191f110de2aa174c31da4215da3e3e74827145e4ebe64f9b5083fe557" Apr 24 21:45:37.353508 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:37.353463 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a70305191f110de2aa174c31da4215da3e3e74827145e4ebe64f9b5083fe557"} err="failed to get container status 
\"8a70305191f110de2aa174c31da4215da3e3e74827145e4ebe64f9b5083fe557\": rpc error: code = NotFound desc = could not find container \"8a70305191f110de2aa174c31da4215da3e3e74827145e4ebe64f9b5083fe557\": container with ID starting with 8a70305191f110de2aa174c31da4215da3e3e74827145e4ebe64f9b5083fe557 not found: ID does not exist" Apr 24 21:45:37.366784 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:37.366759 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt"] Apr 24 21:45:37.371453 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:37.371432 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-2b4f4-5cbd8c8895-24wlt"] Apr 24 21:45:38.539024 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:45:38.538991 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de08f8b7-f188-4003-9a25-93f7161c3fc7" path="/var/lib/kubelet/pods/de08f8b7-f188-4003-9a25-93f7161c3fc7/volumes" Apr 24 21:48:58.488978 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:48:58.488950 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/2.log" Apr 24 21:48:58.490897 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:48:58.490871 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/2.log" Apr 24 21:48:58.494938 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:48:58.494916 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/0.log" Apr 24 21:48:58.496856 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:48:58.496838 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/0.log"
Apr 24 21:53:56.651595 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.651561 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fdcwc/must-gather-zgrx4"]
Apr 24 21:53:56.651988 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.651844 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de08f8b7-f188-4003-9a25-93f7161c3fc7" containerName="model-chainer-raw-hpa-2b4f4"
Apr 24 21:53:56.651988 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.651855 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="de08f8b7-f188-4003-9a25-93f7161c3fc7" containerName="model-chainer-raw-hpa-2b4f4"
Apr 24 21:53:56.651988 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.651909 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="de08f8b7-f188-4003-9a25-93f7161c3fc7" containerName="model-chainer-raw-hpa-2b4f4"
Apr 24 21:53:56.654820 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.654805 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fdcwc/must-gather-zgrx4"
Apr 24 21:53:56.657163 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.657141 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fdcwc\"/\"kube-root-ca.crt\""
Apr 24 21:53:56.657280 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.657146 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-fdcwc\"/\"default-dockercfg-gd25x\""
Apr 24 21:53:56.657280 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.657213 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fdcwc\"/\"openshift-service-ca.crt\""
Apr 24 21:53:56.661756 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.661706 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fdcwc/must-gather-zgrx4"]
Apr 24 21:53:56.720929 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.720898 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f1a89aee-79a9-4b89-92a7-22819508ca87-must-gather-output\") pod \"must-gather-zgrx4\" (UID: \"f1a89aee-79a9-4b89-92a7-22819508ca87\") " pod="openshift-must-gather-fdcwc/must-gather-zgrx4"
Apr 24 21:53:56.721058 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.720945 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prhc4\" (UniqueName: \"kubernetes.io/projected/f1a89aee-79a9-4b89-92a7-22819508ca87-kube-api-access-prhc4\") pod \"must-gather-zgrx4\" (UID: \"f1a89aee-79a9-4b89-92a7-22819508ca87\") " pod="openshift-must-gather-fdcwc/must-gather-zgrx4"
Apr 24 21:53:56.821313 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.821286 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f1a89aee-79a9-4b89-92a7-22819508ca87-must-gather-output\") pod \"must-gather-zgrx4\" (UID: \"f1a89aee-79a9-4b89-92a7-22819508ca87\") " pod="openshift-must-gather-fdcwc/must-gather-zgrx4"
Apr 24 21:53:56.821471 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.821329 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prhc4\" (UniqueName: \"kubernetes.io/projected/f1a89aee-79a9-4b89-92a7-22819508ca87-kube-api-access-prhc4\") pod \"must-gather-zgrx4\" (UID: \"f1a89aee-79a9-4b89-92a7-22819508ca87\") " pod="openshift-must-gather-fdcwc/must-gather-zgrx4"
Apr 24 21:53:56.821687 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.821665 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f1a89aee-79a9-4b89-92a7-22819508ca87-must-gather-output\") pod \"must-gather-zgrx4\" (UID: \"f1a89aee-79a9-4b89-92a7-22819508ca87\") " pod="openshift-must-gather-fdcwc/must-gather-zgrx4"
Apr 24 21:53:56.830190 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.830171 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prhc4\" (UniqueName: \"kubernetes.io/projected/f1a89aee-79a9-4b89-92a7-22819508ca87-kube-api-access-prhc4\") pod \"must-gather-zgrx4\" (UID: \"f1a89aee-79a9-4b89-92a7-22819508ca87\") " pod="openshift-must-gather-fdcwc/must-gather-zgrx4"
Apr 24 21:53:56.964264 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:56.964203 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fdcwc/must-gather-zgrx4"
Apr 24 21:53:57.079095 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:57.079069 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fdcwc/must-gather-zgrx4"]
Apr 24 21:53:57.081226 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:53:57.081200 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1a89aee_79a9_4b89_92a7_22819508ca87.slice/crio-986cc47e1830c31b13b834a113b874f55920dd9d35a8c4ab56411c10dd816492 WatchSource:0}: Error finding container 986cc47e1830c31b13b834a113b874f55920dd9d35a8c4ab56411c10dd816492: Status 404 returned error can't find the container with id 986cc47e1830c31b13b834a113b874f55920dd9d35a8c4ab56411c10dd816492
Apr 24 21:53:57.083070 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:57.083049 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:53:57.708148 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:57.708100 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fdcwc/must-gather-zgrx4" event={"ID":"f1a89aee-79a9-4b89-92a7-22819508ca87","Type":"ContainerStarted","Data":"986cc47e1830c31b13b834a113b874f55920dd9d35a8c4ab56411c10dd816492"}
Apr 24 21:53:58.515994 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:58.515962 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/2.log"
Apr 24 21:53:58.516435 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:58.516408 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/2.log"
Apr 24 21:53:58.523311 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:58.523288 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/0.log"
Apr 24 21:53:58.524099 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:58.524079 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/0.log"
Apr 24 21:53:58.713193 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:58.713139 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fdcwc/must-gather-zgrx4" event={"ID":"f1a89aee-79a9-4b89-92a7-22819508ca87","Type":"ContainerStarted","Data":"880f46c6a4012b7d3d16b19f071967bab968324c507818b960aa16ad6ef1552b"}
Apr 24 21:53:58.713193 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:58.713191 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fdcwc/must-gather-zgrx4" event={"ID":"f1a89aee-79a9-4b89-92a7-22819508ca87","Type":"ContainerStarted","Data":"dfba3511d01d9d7f93a76533eb1433b5bb04fd916a64ab31af074b0955cb9b2e"}
Apr 24 21:53:58.730494 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:58.730437 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fdcwc/must-gather-zgrx4" podStartSLOduration=2.026136278 podStartE2EDuration="2.73041806s" podCreationTimestamp="2026-04-24 21:53:56 +0000 UTC" firstStartedPulling="2026-04-24 21:53:57.083206945 +0000 UTC m=+1499.044971943" lastFinishedPulling="2026-04-24 21:53:57.787488716 +0000 UTC m=+1499.749253725" observedRunningTime="2026-04-24 21:53:58.729163167 +0000 UTC m=+1500.690928187" watchObservedRunningTime="2026-04-24 21:53:58.73041806 +0000 UTC m=+1500.692183105"
Apr 24 21:53:59.256016 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:59.255985 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gtf2w_44ef511c-205f-488b-a596-eb35096f1fd7/global-pull-secret-syncer/0.log"
Apr 24 21:53:59.362471 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:59.362434 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2qsqx_e9028851-049a-4814-809a-8ffbb08d8ce7/konnectivity-agent/0.log"
Apr 24 21:53:59.460751 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:53:59.460717 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-27.ec2.internal_737b9e94dfd820d8429e803872b0624c/haproxy/0.log"
Apr 24 21:54:03.052948 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:03.052919 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vhvw5_7403fa15-7ed1-496e-83fd-fd72a2f75042/node-exporter/0.log"
Apr 24 21:54:03.081511 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:03.081480 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vhvw5_7403fa15-7ed1-496e-83fd-fd72a2f75042/kube-rbac-proxy/0.log"
Apr 24 21:54:03.110619 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:03.110594 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vhvw5_7403fa15-7ed1-496e-83fd-fd72a2f75042/init-textfile/0.log"
Apr 24 21:54:03.581215 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:03.581186 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-fb4645959-rscd8_b62f681b-275b-4e16-b606-abc9771b4539/thanos-query/0.log"
Apr 24 21:54:03.609101 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:03.609072 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-fb4645959-rscd8_b62f681b-275b-4e16-b606-abc9771b4539/kube-rbac-proxy-web/0.log"
Apr 24 21:54:03.636118 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:03.636090 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-fb4645959-rscd8_b62f681b-275b-4e16-b606-abc9771b4539/kube-rbac-proxy/0.log"
Apr 24 21:54:03.661846 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:03.661821 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-fb4645959-rscd8_b62f681b-275b-4e16-b606-abc9771b4539/prom-label-proxy/0.log"
Apr 24 21:54:03.686467 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:03.686438 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-fb4645959-rscd8_b62f681b-275b-4e16-b606-abc9771b4539/kube-rbac-proxy-rules/0.log"
Apr 24 21:54:03.715679 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:03.715648 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-fb4645959-rscd8_b62f681b-275b-4e16-b606-abc9771b4539/kube-rbac-proxy-metrics/0.log"
Apr 24 21:54:05.324750 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:05.324717 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/2.log"
Apr 24 21:54:05.330220 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:05.330199 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rwc2z_5da9727b-1328-4d04-8ab2-cda650280a23/console-operator/3.log"
Apr 24 21:54:06.007554 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.007517 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"]
Apr 24 21:54:06.012239 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.012211 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.018956 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.018933 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"]
Apr 24 21:54:06.105035 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.104999 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/be3de2e0-a404-4801-ad4a-3190a154c69d-lib-modules\") pod \"perf-node-gather-daemonset-mw2nd\" (UID: \"be3de2e0-a404-4801-ad4a-3190a154c69d\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.105205 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.105060 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/be3de2e0-a404-4801-ad4a-3190a154c69d-proc\") pod \"perf-node-gather-daemonset-mw2nd\" (UID: \"be3de2e0-a404-4801-ad4a-3190a154c69d\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.105205 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.105152 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/be3de2e0-a404-4801-ad4a-3190a154c69d-sys\") pod \"perf-node-gather-daemonset-mw2nd\" (UID: \"be3de2e0-a404-4801-ad4a-3190a154c69d\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.105205 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.105189 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmgxj\" (UniqueName: \"kubernetes.io/projected/be3de2e0-a404-4801-ad4a-3190a154c69d-kube-api-access-kmgxj\") pod \"perf-node-gather-daemonset-mw2nd\" (UID: \"be3de2e0-a404-4801-ad4a-3190a154c69d\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.105416 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.105230 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/be3de2e0-a404-4801-ad4a-3190a154c69d-podres\") pod \"perf-node-gather-daemonset-mw2nd\" (UID: \"be3de2e0-a404-4801-ad4a-3190a154c69d\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.206295 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.206260 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/be3de2e0-a404-4801-ad4a-3190a154c69d-sys\") pod \"perf-node-gather-daemonset-mw2nd\" (UID: \"be3de2e0-a404-4801-ad4a-3190a154c69d\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.206482 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.206309 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmgxj\" (UniqueName: \"kubernetes.io/projected/be3de2e0-a404-4801-ad4a-3190a154c69d-kube-api-access-kmgxj\") pod \"perf-node-gather-daemonset-mw2nd\" (UID: \"be3de2e0-a404-4801-ad4a-3190a154c69d\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.206482 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.206408 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/be3de2e0-a404-4801-ad4a-3190a154c69d-sys\") pod \"perf-node-gather-daemonset-mw2nd\" (UID: \"be3de2e0-a404-4801-ad4a-3190a154c69d\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.206482 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.206407 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/be3de2e0-a404-4801-ad4a-3190a154c69d-podres\") pod \"perf-node-gather-daemonset-mw2nd\" (UID: \"be3de2e0-a404-4801-ad4a-3190a154c69d\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.206708 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.206523 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/be3de2e0-a404-4801-ad4a-3190a154c69d-lib-modules\") pod \"perf-node-gather-daemonset-mw2nd\" (UID: \"be3de2e0-a404-4801-ad4a-3190a154c69d\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.206708 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.206524 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/be3de2e0-a404-4801-ad4a-3190a154c69d-podres\") pod \"perf-node-gather-daemonset-mw2nd\" (UID: \"be3de2e0-a404-4801-ad4a-3190a154c69d\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.206708 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.206575 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/be3de2e0-a404-4801-ad4a-3190a154c69d-proc\") pod \"perf-node-gather-daemonset-mw2nd\" (UID: \"be3de2e0-a404-4801-ad4a-3190a154c69d\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.206708 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.206619 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/be3de2e0-a404-4801-ad4a-3190a154c69d-lib-modules\") pod \"perf-node-gather-daemonset-mw2nd\" (UID: \"be3de2e0-a404-4801-ad4a-3190a154c69d\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.206708 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.206646 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/be3de2e0-a404-4801-ad4a-3190a154c69d-proc\") pod \"perf-node-gather-daemonset-mw2nd\" (UID: \"be3de2e0-a404-4801-ad4a-3190a154c69d\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.215604 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.215582 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmgxj\" (UniqueName: \"kubernetes.io/projected/be3de2e0-a404-4801-ad4a-3190a154c69d-kube-api-access-kmgxj\") pod \"perf-node-gather-daemonset-mw2nd\" (UID: \"be3de2e0-a404-4801-ad4a-3190a154c69d\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.324620 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.324132 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.462811 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.462776 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"]
Apr 24 21:54:06.465987 ip-10-0-135-27 kubenswrapper[2566]: W0424 21:54:06.465958 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbe3de2e0_a404_4801_ad4a_3190a154c69d.slice/crio-512015fa4efef2754dc71ab9dc84c6e9cc297dcf4eecef6d34a2e5463f92160f WatchSource:0}: Error finding container 512015fa4efef2754dc71ab9dc84c6e9cc297dcf4eecef6d34a2e5463f92160f: Status 404 returned error can't find the container with id 512015fa4efef2754dc71ab9dc84c6e9cc297dcf4eecef6d34a2e5463f92160f
Apr 24 21:54:06.742920 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.742848 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd" event={"ID":"be3de2e0-a404-4801-ad4a-3190a154c69d","Type":"ContainerStarted","Data":"f1a0ff00d49337fb7990de117dd3bba95ddfd631880c57415d67f66c8502fd28"}
Apr 24 21:54:06.742920 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.742884 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd" event={"ID":"be3de2e0-a404-4801-ad4a-3190a154c69d","Type":"ContainerStarted","Data":"512015fa4efef2754dc71ab9dc84c6e9cc297dcf4eecef6d34a2e5463f92160f"}
Apr 24 21:54:06.743097 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.743018 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:06.763250 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.763202 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd" podStartSLOduration=1.7631857279999998 podStartE2EDuration="1.763185728s" podCreationTimestamp="2026-04-24 21:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:54:06.76249501 +0000 UTC m=+1508.724260031" watchObservedRunningTime="2026-04-24 21:54:06.763185728 +0000 UTC m=+1508.724950751"
Apr 24 21:54:06.907816 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.907787 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6jxzc_e77a2c8d-6381-4990-b43d-cfa87c8c3fc4/dns/0.log"
Apr 24 21:54:06.932270 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:06.932241 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6jxzc_e77a2c8d-6381-4990-b43d-cfa87c8c3fc4/kube-rbac-proxy/0.log"
Apr 24 21:54:07.042220 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:07.042193 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-88hpr_4ca03565-9d72-4813-9179-7636908c9bf5/dns-node-resolver/0.log"
Apr 24 21:54:07.580036 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:07.580007 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-77758d8cd8-5h489_ee2da17b-d0d2-42ab-9cbc-b5b16c6c971b/registry/0.log"
Apr 24 21:54:07.622018 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:07.621991 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4rz25_918a384b-26d7-496e-b3c9-370b6e526ebd/node-ca/0.log"
Apr 24 21:54:08.876333 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:08.876259 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-z2pdw_cc70ee65-8f01-4bad-adc7-98b5e7037c77/serve-healthcheck-canary/0.log"
Apr 24 21:54:09.242454 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:09.242385 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2bz6f_4b511ab9-78fa-467d-88fa-1910c1b4f6bd/kube-rbac-proxy/0.log"
Apr 24 21:54:09.265095 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:09.265065 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2bz6f_4b511ab9-78fa-467d-88fa-1910c1b4f6bd/exporter/0.log"
Apr 24 21:54:09.287901 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:09.287876 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2bz6f_4b511ab9-78fa-467d-88fa-1910c1b4f6bd/extractor/0.log"
Apr 24 21:54:11.353362 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:11.353315 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-67f77cd7d7-csgss_dbd0b7c6-e4c2-4b40-b21e-c57a4e6b73f5/manager/0.log"
Apr 24 21:54:11.374623 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:11.374596 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-t67f6_aab67daa-d81e-4186-8df4-327c70f44ca4/manager/0.log"
Apr 24 21:54:11.398324 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:11.398305 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-krnwt_d87e41ba-2228-4bdc-b1e8-13a4b8445ed2/server/0.log"
Apr 24 21:54:12.758752 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:12.758722 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-mw2nd"
Apr 24 21:54:17.003482 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:17.003456 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m5nxz_126ae8c8-5c56-4044-b72b-57fc091713c4/kube-multus-additional-cni-plugins/0.log"
Apr 24 21:54:17.035969 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:17.035943 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m5nxz_126ae8c8-5c56-4044-b72b-57fc091713c4/egress-router-binary-copy/0.log"
Apr 24 21:54:17.063153 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:17.063129 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m5nxz_126ae8c8-5c56-4044-b72b-57fc091713c4/cni-plugins/0.log"
Apr 24 21:54:17.089147 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:17.089126 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m5nxz_126ae8c8-5c56-4044-b72b-57fc091713c4/bond-cni-plugin/0.log"
Apr 24 21:54:17.113243 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:17.113224 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m5nxz_126ae8c8-5c56-4044-b72b-57fc091713c4/routeoverride-cni/0.log"
Apr 24 21:54:17.136592 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:17.136563 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m5nxz_126ae8c8-5c56-4044-b72b-57fc091713c4/whereabouts-cni-bincopy/0.log"
Apr 24 21:54:17.162740 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:17.162712 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m5nxz_126ae8c8-5c56-4044-b72b-57fc091713c4/whereabouts-cni/0.log"
Apr 24 21:54:17.557678 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:17.557652 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n6qzm_55ef112d-fe31-4b57-808d-d33898e3e457/kube-multus/0.log"
Apr 24 21:54:17.709580 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:17.709516 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-q52j5_186d543e-d0f4-4e11-ac28-e8ebb35c72a2/network-metrics-daemon/0.log"
Apr 24 21:54:17.736609 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:17.736586 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-q52j5_186d543e-d0f4-4e11-ac28-e8ebb35c72a2/kube-rbac-proxy/0.log"
Apr 24 21:54:18.789145 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:18.789106 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-controller/0.log"
Apr 24 21:54:18.811547 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:18.811523 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/0.log"
Apr 24 21:54:18.818643 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:18.818618 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovn-acl-logging/1.log"
Apr 24 21:54:18.843480 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:18.843454 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/kube-rbac-proxy-node/0.log"
Apr 24 21:54:18.865610 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:18.865578 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 21:54:18.886021 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:18.886005 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/northd/0.log"
Apr 24 21:54:18.910536 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:18.910502 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/nbdb/0.log"
Apr 24 21:54:18.937281 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:18.937244 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/sbdb/0.log"
Apr 24 21:54:19.047631 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:19.047572 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qccqq_5a001c67-cdac-4386-8012-1386fdcf8bbd/ovnkube-controller/0.log"
Apr 24 21:54:20.315684 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:20.315652 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-m789j_1bf8df67-f18f-4c8d-816d-cd4a03327ba3/check-endpoints/0.log"
Apr 24 21:54:20.369802 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:20.369777 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-tpck9_9e66f6e5-929b-4807-810a-ad84e15bb98f/network-check-target-container/0.log"
Apr 24 21:54:21.312850 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:21.312818 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-p55z9_82cb470a-4ce5-4007-a453-0ac73804ef24/iptables-alerter/0.log"
Apr 24 21:54:22.007727 ip-10-0-135-27 kubenswrapper[2566]: I0424 21:54:22.007699 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-ltjjp_ce3cc5f8-62fb-41a2-9be9-2e22c637379e/tuned/0.log"