Apr 23 01:10:00.715321 ip-10-0-137-21 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 01:10:01.138874 ip-10-0-137-21 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 01:10:01.138874 ip-10-0-137-21 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 01:10:01.138874 ip-10-0-137-21 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 01:10:01.138874 ip-10-0-137-21 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 01:10:01.138874 ip-10-0-137-21 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 01:10:01.139625 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.139498 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 01:10:01.143432 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143415 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 01:10:01.143432 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143431 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 01:10:01.143432 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143434 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143437 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143442 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143446 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143450 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143453 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143456 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143459 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143462 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143465 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143468 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143470 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143473 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143476 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143478 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143481 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143483 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143486 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143489 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 01:10:01.143541 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143491 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143494 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143497 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143499 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143502 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143511 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143514 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143517 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143520 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143522 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143526 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143528 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143531 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143533 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143536 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143539 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143542 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143545 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143548 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 01:10:01.144006 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143551 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143554 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143557 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143559 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143562 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143564 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143567 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143569 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143572 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143575 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143578 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143581 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143584 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143587 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143589 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143592 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143595 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143597 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143600 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143602 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 01:10:01.144520 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143605 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143607 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143612 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143616 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143619 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143622 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143624 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143627 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143631 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143634 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143637 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143640 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143643 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143645 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143649 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143651 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143654 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143656 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143659 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143661 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 01:10:01.145032 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143665 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143667 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143669 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143672 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143674 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.143677 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144070 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144075 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144078 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144080 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144083 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144086 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144090 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144094 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144097 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144101 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144104 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144107 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144110 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 01:10:01.145513 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144112 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144116 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144120 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144122 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144125 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144128 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144131 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144134 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144137 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144139 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144142 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144145 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144147 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144150 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144152 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144155 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144158 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144160 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144163 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144166 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 01:10:01.145976 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144168 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144171 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144173 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144176 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144178 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144181 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144184 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144186 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144189 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144191 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144193 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144196 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144199 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144203 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144206 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144208 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144211 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144214 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144217 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144220 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 01:10:01.146491 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144224 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144227 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144231 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144234 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144237 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144239 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144242 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144245 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144247 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144250 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144253 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144255 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144258 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144260 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144263 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144266 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144268 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144271 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144274 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 01:10:01.147092 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144276 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144279 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144282 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144285 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144288 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144291 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144294 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144296 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144299 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144301 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144304 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144307 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144309 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.144312 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145588 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145598 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145628 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145633 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145638 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145642 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145646 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 01:10:01.147579 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145652 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145656 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145659 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145663 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145667 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145670 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145673 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145676 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145679 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145682 2576 flags.go:64] FLAG: --cloud-config=""
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145685 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145688 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145694 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145697 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145700 2576 flags.go:64] FLAG: --config-dir=""
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145703 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145706 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145711 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145714 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145717 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145721 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145724 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145727 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145730 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145734 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 01:10:01.148128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145737 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145742 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145745 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145748 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145751 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145754 2576 flags.go:64] FLAG: --enable-server="true"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145757 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145762 2576 flags.go:64] FLAG: --event-burst="100"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145765 2576 flags.go:64] FLAG: --event-qps="50"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145768 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145771 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145774 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145778 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145781 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145784 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145787 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145790 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145793 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145796 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145800 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145803 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145806 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145809 2576 flags.go:64] FLAG: --feature-gates=""
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145812 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145815 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 01:10:01.148751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145818 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145822 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145825 2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145828 2576 flags.go:64] FLAG: --help="false"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145831 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145834 2576 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145837 2576 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145840 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145843 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145847 2576 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145850 2576 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145853 2576 flags.go:64] FLAG: --image-service-endpoint=""
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145856 2576 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145859 2576 flags.go:64] FLAG: --kube-api-burst="100"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145862 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145865 2576 flags.go:64] FLAG: --kube-api-qps="50"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145868 2576 flags.go:64] FLAG: --kube-reserved=""
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145871 2576 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145874 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145877 2576 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145880 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145883 2576 flags.go:64] FLAG: --lock-file=""
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145886 2576 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145889 2576 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 23 01:10:01.149373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145892 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145900 2576 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145903 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145906 2576 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145909 2576 flags.go:64] FLAG: --logging-format="text"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145912 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145915 2576 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145918 2576 flags.go:64] FLAG: --manifest-url=""
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145922 2576 flags.go:64] FLAG: --manifest-url-header=""
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145926 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145929 2576 flags.go:64] FLAG: --max-open-files="1000000"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145933 2576 flags.go:64] FLAG: --max-pods="110"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145936 2576 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145939 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145942 2576 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145946 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145949 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145952 2576 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145955 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145963 2576 flags.go:64] FLAG: --node-status-max-images="50"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145967 2576 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145970 2576 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145973 2576 flags.go:64] FLAG: --pod-cidr=""
Apr 23 01:10:01.149975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145976 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.145996 2576 flags.go:64] FLAG: --pod-manifest-path=""
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146000 2576 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146003 2576 flags.go:64] FLAG: --pods-per-core="0"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146006 2576 flags.go:64] FLAG: --port="10250"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146010 2576 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146013 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-031e49dfe2b258f58"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146016 2576 flags.go:64] FLAG: --qos-reserved=""
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146019 2576 flags.go:64] FLAG: --read-only-port="10255"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146022 2576 flags.go:64] FLAG: --register-node="true"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146025 2576 flags.go:64] FLAG: --register-schedulable="true"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146028 2576 flags.go:64] FLAG: --register-with-taints=""
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146032 2576 flags.go:64] FLAG: --registry-burst="10"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146035 2576 flags.go:64] FLAG: --registry-qps="5"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146038 2576 flags.go:64] FLAG: --reserved-cpus=""
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146041 2576 flags.go:64] FLAG: --reserved-memory=""
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146045 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146048 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146051 2576 flags.go:64] FLAG: --rotate-certificates="false"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146054 2576 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146057 2576 flags.go:64] FLAG: --runonce="false"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146060 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146063 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146066 2576 flags.go:64] FLAG: --seccomp-default="false"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146069 2576 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146072 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 23 01:10:01.150589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146075 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146078 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146088 2576 flags.go:64] FLAG: --storage-driver-password="root"
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146092 2576 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146095 2576 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146098 2576 flags.go:64] FLAG: --storage-driver-user="root"
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146101 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146104 2576 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146107 2576 flags.go:64] FLAG: --system-cgroups=""
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146110 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146116 2576 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146119 2576 flags.go:64] FLAG: --tls-cert-file=""
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146122 2576 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146127 2576 flags.go:64] FLAG: --tls-min-version=""
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146130 2576 flags.go:64] FLAG: --tls-private-key-file=""
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146133 2576 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146136 2576 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146139 2576 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146142 2576 flags.go:64] FLAG: --v="2"
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146151 2576 flags.go:64] FLAG: --version="false"
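The deprecation warnings at startup say that flags such as --container-runtime-endpoint, --volume-plugin-dir, and --system-reserved should instead be set in the file named by --config (here /etc/kubernetes/kubelet.conf). A minimal sketch of the equivalent KubeletConfiguration stanza, assuming the values shown in the FLAG dump above:

```yaml
# Sketch only: KubeletConfiguration fields equivalent to the deprecated
# flags logged above; values are taken from the FLAG: lines, not verified
# against this node's actual /etc/kubernetes/kubelet.conf.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
systemReserved:
  cpu: 500m
  memory: 1Gi
  ephemeral-storage: 1Gi
```

On an OpenShift node this file is managed by the Machine Config Operator, so changes would normally go through a KubeletConfig custom resource rather than direct edits.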
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146155 2576 flags.go:64] FLAG: --vmodule=""
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146160 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.146163 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146263 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146267 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 01:10:01.151275 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146270 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146273 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146276 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146279 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146281 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146285 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146288 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146290 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146293 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146295 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146299 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146306 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146308 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146311 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146314 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146316 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146321 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146324 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146327 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 01:10:01.151897 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146329 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146332 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146334 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146337 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146340 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146342 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146345 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146347 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146351 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146354 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146356 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146359 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146361 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146364 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146366 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146371 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146375 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146379 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146382 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 01:10:01.152477 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146385 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146388 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146390 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146393 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146395 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146400 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146402 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146405 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146408 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146410 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146414 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146417 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146420 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146422 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146425 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146427 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146430 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146432 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146435 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146438 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 01:10:01.152944 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146440 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146443 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146445 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146448 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146450 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146453 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146456 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146458 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146460 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146463 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146466 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146468 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146471 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146473 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146476 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146478 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146481 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146484 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146487 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146490 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 01:10:01.153458 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146493 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146495 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146500 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146502 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146505 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.146508 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.147158 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.153609 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.153626 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153692 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153697 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153701 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153704 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153707 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153710 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153713 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 01:10:01.153955 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153716 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153719 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153722 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153725 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153727 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153730 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153733 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153735 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153738 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153740 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153743 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153746 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153748 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153751 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153755 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153758 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153761 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153764 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153766 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153769 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 01:10:01.154430 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153771 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153773 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153776 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153779 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153782 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153785 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153787 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153790 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153793 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153795 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153798 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153800 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153803 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153805 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153808 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153810 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153812 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153815 2576
feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153818 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153820 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 01:10:01.154968 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153823 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153825 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153828 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153830 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153833 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153835 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153838 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153842 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153844 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153847 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153849 
2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153852 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153854 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153857 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153859 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153862 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153866 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153869 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153871 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153874 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 01:10:01.155501 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153877 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153879 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153882 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 01:10:01.156011 
ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153886 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153890 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153894 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153898 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153901 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153905 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153907 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153910 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153912 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153915 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153918 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153920 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: 
W0423 01:10:01.153923 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153925 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153928 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 01:10:01.156011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.153931 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 01:10:01.156485 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.153936 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 01:10:01.156485 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154059 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 01:10:01.156485 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154064 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 01:10:01.156485 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154068 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 01:10:01.156485 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154071 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 01:10:01.156485 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154075 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Apr 23 01:10:01.156485 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154080 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 01:10:01.156485 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154083 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 01:10:01.156485 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154085 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 01:10:01.156485 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154089 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 01:10:01.156485 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154091 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 01:10:01.156485 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154094 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 01:10:01.156485 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154097 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 01:10:01.156485 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154100 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 01:10:01.156485 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154103 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154105 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154108 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154110 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154113 
2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154117 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154120 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154123 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154126 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154128 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154131 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154133 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154136 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154138 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154141 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154143 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154146 2576 
feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154148 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154151 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 01:10:01.156866 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154153 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154156 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154159 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154163 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154165 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154168 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154170 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154173 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154176 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154178 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: 
W0423 01:10:01.154181 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154183 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154186 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154188 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154191 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154193 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154196 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154198 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154201 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154203 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 01:10:01.157371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154206 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154208 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154211 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 
01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154213 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154216 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154218 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154221 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154223 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154226 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154228 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154230 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154233 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154235 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154238 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154241 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154244 2576 feature_gate.go:328] unrecognized feature 
gate: AWSClusterHostedDNSInstall Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154247 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154249 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154252 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154254 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 01:10:01.157855 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154257 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 01:10:01.158371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154259 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 01:10:01.158371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154262 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 01:10:01.158371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154264 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 01:10:01.158371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154267 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 01:10:01.158371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154270 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 01:10:01.158371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154272 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 01:10:01.158371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154275 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 01:10:01.158371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154277 
2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 01:10:01.158371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154279 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 01:10:01.158371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154282 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 01:10:01.158371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154284 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 01:10:01.158371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154287 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 01:10:01.158371 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:01.154289 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 01:10:01.158371 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.154294 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 01:10:01.158371 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.154420 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 01:10:01.158749 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.156413 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 01:10:01.158749 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.157277 2576 server.go:1019] "Starting client certificate rotation" Apr 23 01:10:01.158749 ip-10-0-137-21 
kubenswrapper[2576]: I0423 01:10:01.157376 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 01:10:01.158749 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.157412 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 01:10:01.182593 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.182562 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 01:10:01.190115 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.190090 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 01:10:01.205198 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.205171 2576 log.go:25] "Validated CRI v1 runtime API" Apr 23 01:10:01.210393 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.210369 2576 log.go:25] "Validated CRI v1 image API" Apr 23 01:10:01.211765 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.211734 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 01:10:01.216449 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.216428 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 01:10:01.217600 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.217579 2576 fs.go:135] Filesystem UUIDs: map[5bfe4f34-c69d-4886-9dfe-3aeb60cb6863:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 89f4d0ee-c1cb-4571-b5c9-89d015df1beb:/dev/nvme0n1p3] Apr 23 01:10:01.217656 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.217600 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} 
/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 01:10:01.223949 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.223817 2576 manager.go:217] Machine: {Timestamp:2026-04-23 01:10:01.221589774 +0000 UTC m=+0.396385396 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100030 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2bbbbdfd5173f931107062541a582f SystemUUID:ec2bbbbd-fd51-73f9-3110-7062541a582f BootID:f9fd36b5-3747-4c1c-b8a3-9a869f616f33 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:63:c2:9d:64:fd Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:63:c2:9d:64:fd Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ea:5e:58:22:be:fa Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 
Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 01:10:01.223949 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.223938 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 23 01:10:01.224087 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.224049 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 01:10:01.225234 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.225207 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 01:10:01.225412 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.225236 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-21.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 01:10:01.225458 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.225422 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 01:10:01.225458 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.225432 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 01:10:01.225458 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.225446 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 01:10:01.226333 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.226320 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 01:10:01.227084 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.227074 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 01:10:01.227192 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.227183 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 01:10:01.229500 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.229477 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 01:10:01.229554 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.229510 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 01:10:01.229554 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.229523 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 01:10:01.229554 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.229532 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 23 01:10:01.229554 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.229542 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 01:10:01.230594 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.230581 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 01:10:01.230634 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.230601 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 01:10:01.232798 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.232782 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f6z7x"
Apr 23 01:10:01.234287 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.234268 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 01:10:01.235910 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.235896 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 01:10:01.237947 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.237936 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 01:10:01.237993 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.237955 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 01:10:01.237993 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.237962 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 01:10:01.237993 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.237967 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 01:10:01.237993 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.237974 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 01:10:01.237993 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.237994 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 01:10:01.238127 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.238003 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 01:10:01.238127 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.238009 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 01:10:01.238127 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.238016 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 01:10:01.238127 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.238023 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 01:10:01.238127 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.238031 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 01:10:01.238127 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.238040 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 01:10:01.238970 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.238961 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 01:10:01.239019 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.238971 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 01:10:01.239944 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.239925 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f6z7x"
Apr 23 01:10:01.241674 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.241652 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-21.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 01:10:01.241789 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.241769 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-21.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 01:10:01.241897 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.241874 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 01:10:01.242960 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.242946 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 01:10:01.243027 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.243009 2576 server.go:1295] "Started kubelet"
Apr 23 01:10:01.243111 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.243075 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 01:10:01.243221 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.243152 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 01:10:01.243256 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.243243 2576 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 01:10:01.244063 ip-10-0-137-21 systemd[1]: Started Kubernetes Kubelet.
Apr 23 01:10:01.244864 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.244844 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 01:10:01.246963 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.246944 2576 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 01:10:01.251522 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.251501 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 01:10:01.251631 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.251503 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 01:10:01.252331 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.252283 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 01:10:01.252331 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.252309 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 01:10:01.252460 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.252385 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:01.252523 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.252469 2576 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 01:10:01.252523 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.252478 2576 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 01:10:01.252652 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.252628 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 01:10:01.252652 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.252660 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 01:10:01.252811 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.252675 2576 factory.go:55] Registering systemd factory
Apr 23 01:10:01.252811 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.252685 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 23 01:10:01.252970 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.252951 2576 factory.go:153] Registering CRI-O factory
Apr 23 01:10:01.253090 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.253013 2576 factory.go:223] Registration of the crio container factory successfully
Apr 23 01:10:01.253090 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.253039 2576 factory.go:103] Registering Raw factory
Apr 23 01:10:01.253090 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.253084 2576 manager.go:1196] Started watching for new ooms in manager
Apr 23 01:10:01.253785 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.253770 2576 manager.go:319] Starting recovery of all containers
Apr 23 01:10:01.254304 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.254285 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 01:10:01.255211 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.255188 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 01:10:01.257547 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.257521 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-21.ec2.internal\" not found" node="ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.263690 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.263563 2576 manager.go:324] Recovery completed
Apr 23 01:10:01.268060 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.268046 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:10:01.270402 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.270380 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:10:01.270461 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.270415 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:10:01.270461 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.270427 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:10:01.270913 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.270900 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 01:10:01.270960 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.270912 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 01:10:01.270960 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.270930 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 01:10:01.272954 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.272942 2576 policy_none.go:49] "None policy: Start"
Apr 23 01:10:01.273012 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.272959 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 01:10:01.273012 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.272969 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 01:10:01.316697 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.316677 2576 manager.go:341] "Starting Device Plugin manager"
Apr 23 01:10:01.319377 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.316714 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 01:10:01.319377 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.316724 2576 server.go:85] "Starting device plugin registration server"
Apr 23 01:10:01.319377 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.317071 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 01:10:01.319377 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.317085 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 01:10:01.319377 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.317239 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 01:10:01.319377 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.317337 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 01:10:01.319377 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.317348 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 01:10:01.319377 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.318247 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 01:10:01.319377 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.318285 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:01.378238 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.378197 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 01:10:01.379519 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.379497 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 01:10:01.379578 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.379528 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 01:10:01.379578 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.379558 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 01:10:01.379578 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.379566 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 01:10:01.379698 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.379610 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 01:10:01.382849 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.382830 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 01:10:01.417913 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.417830 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:10:01.418933 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.418912 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:10:01.419087 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.418947 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:10:01.419087 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.418958 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:10:01.419087 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.418999 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.425124 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.425101 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.425124 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.425130 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-21.ec2.internal\": node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:01.447455 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.447421 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:01.479923 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.479872 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-21.ec2.internal"]
Apr 23 01:10:01.480089 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.480004 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:10:01.481020 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.481001 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:10:01.481110 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.481038 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:10:01.481110 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.481049 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:10:01.482289 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.482278 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:10:01.482462 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.482448 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.482496 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.482480 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:10:01.483031 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.483014 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:10:01.483149 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.483039 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:10:01.483149 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.483016 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:10:01.483149 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.483049 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:10:01.483149 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.483073 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:10:01.483149 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.483090 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:10:01.484244 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.484225 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.484337 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.484258 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:10:01.484933 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.484921 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:10:01.485016 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.484947 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:10:01.485016 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.484961 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:10:01.510793 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.510762 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-21.ec2.internal\" not found" node="ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.515177 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.515159 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-21.ec2.internal\" not found" node="ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.547662 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.547623 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:01.553926 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.553901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/359cc43887581153d61348aa08e1c846-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal\" (UID: \"359cc43887581153d61348aa08e1c846\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.554002 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.553931 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/359cc43887581153d61348aa08e1c846-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal\" (UID: \"359cc43887581153d61348aa08e1c846\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.554002 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.553950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e27bb1972aa991595ea629c18476d369-config\") pod \"kube-apiserver-proxy-ip-10-0-137-21.ec2.internal\" (UID: \"e27bb1972aa991595ea629c18476d369\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.648559 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.648510 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:01.654862 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.654836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/359cc43887581153d61348aa08e1c846-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal\" (UID: \"359cc43887581153d61348aa08e1c846\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.654925 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.654868 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/359cc43887581153d61348aa08e1c846-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal\" (UID: \"359cc43887581153d61348aa08e1c846\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.654925 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.654890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e27bb1972aa991595ea629c18476d369-config\") pod \"kube-apiserver-proxy-ip-10-0-137-21.ec2.internal\" (UID: \"e27bb1972aa991595ea629c18476d369\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.655008 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.654940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e27bb1972aa991595ea629c18476d369-config\") pod \"kube-apiserver-proxy-ip-10-0-137-21.ec2.internal\" (UID: \"e27bb1972aa991595ea629c18476d369\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.655008 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.654954 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/359cc43887581153d61348aa08e1c846-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal\" (UID: \"359cc43887581153d61348aa08e1c846\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.655008 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.654940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/359cc43887581153d61348aa08e1c846-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal\" (UID: \"359cc43887581153d61348aa08e1c846\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.749355 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.749279 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:01.812785 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.812760 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.817386 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:01.817365 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-21.ec2.internal"
Apr 23 01:10:01.850224 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.850186 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:01.950809 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:01.950766 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:02.051381 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:02.051280 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:02.151815 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:02.151779 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:02.157076 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.157059 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 01:10:02.157222 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.157205 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 01:10:02.157272 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.157244 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 01:10:02.241860 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.241808 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 01:05:01 +0000 UTC" deadline="2028-01-08 20:14:20.848205504 +0000 UTC"
Apr 23 01:10:02.241860 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.241855 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15019h4m18.606354305s"
Apr 23 01:10:02.251865 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:02.251836 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:02.251865 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.251858 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 01:10:02.262812 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.262779 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 01:10:02.279785 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.279753 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-nc9d7"
Apr 23 01:10:02.286650 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.286622 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-nc9d7"
Apr 23 01:10:02.352094 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:02.352014 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:02.375123 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:02.375090 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod359cc43887581153d61348aa08e1c846.slice/crio-ccfcbfccb213f273164e046c052179660b8c519353f9756cfc800457c92f3244 WatchSource:0}: Error finding container ccfcbfccb213f273164e046c052179660b8c519353f9756cfc800457c92f3244: Status 404 returned error can't find the container with id ccfcbfccb213f273164e046c052179660b8c519353f9756cfc800457c92f3244
Apr 23 01:10:02.375789 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:02.375766 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode27bb1972aa991595ea629c18476d369.slice/crio-767bee9ec949e7eacf918a1e2834b1d051968fcffcf7b8ecea7c0cf79d926fa6 WatchSource:0}: Error finding container 767bee9ec949e7eacf918a1e2834b1d051968fcffcf7b8ecea7c0cf79d926fa6: Status 404 returned error can't find the container with id 767bee9ec949e7eacf918a1e2834b1d051968fcffcf7b8ecea7c0cf79d926fa6
Apr 23 01:10:02.380195 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.380181 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 01:10:02.382197 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.382147 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-21.ec2.internal" event={"ID":"e27bb1972aa991595ea629c18476d369","Type":"ContainerStarted","Data":"767bee9ec949e7eacf918a1e2834b1d051968fcffcf7b8ecea7c0cf79d926fa6"}
Apr 23 01:10:02.383310 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.383287 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal" event={"ID":"359cc43887581153d61348aa08e1c846","Type":"ContainerStarted","Data":"ccfcbfccb213f273164e046c052179660b8c519353f9756cfc800457c92f3244"}
Apr 23 01:10:02.390918 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.390898 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 01:10:02.452680 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:02.452635 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:02.553163 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:02.553132 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:02.653670 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:02.653586 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:02.754431 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:02.754395 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-21.ec2.internal\" not found"
Apr 23 01:10:02.812863 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.812833 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 01:10:02.852717 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.852683 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal"
Apr 23 01:10:02.863034 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.863003 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 01:10:02.863883 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.863858 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-21.ec2.internal"
Apr 23 01:10:02.871634 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:02.871610 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 01:10:03.153252 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.153166 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 01:10:03.231251 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.231214 2576 apiserver.go:52] "Watching apiserver"
Apr 23 01:10:03.238126 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.238097 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 01:10:03.239402 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.239369 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/konnectivity-agent-tsrhr","kube-system/kube-apiserver-proxy-ip-10-0-137-21.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal","openshift-multus/multus-additional-cni-plugins-n9nhd","openshift-multus/multus-f5qrx","openshift-multus/network-metrics-daemon-755gj","openshift-network-diagnostics/network-check-target-2qwsm","openshift-network-operator/iptables-alerter-l9kb7","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92","openshift-cluster-node-tuning-operator/tuned-6r6mw","openshift-dns/node-resolver-mggnc","openshift-image-registry/node-ca-7nslb","openshift-ovn-kubernetes/ovnkube-node-8rfgs"] Apr 23 01:10:03.241855 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.241782 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:03.241998 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:03.241867 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503" Apr 23 01:10:03.244177 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.244075 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n9nhd" Apr 23 01:10:03.247259 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.247234 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.248127 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.248105 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 01:10:03.252168 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.248438 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-ftltm\"" Apr 23 01:10:03.252168 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.248658 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 01:10:03.252168 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.248847 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 01:10:03.252649 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.252623 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 01:10:03.252649 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.252648 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 01:10:03.252878 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.252848 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 01:10:03.253192 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.252972 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vk6fd\"" Apr 23 01:10:03.256108 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.256083 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:03.256208 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:03.256173 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554" Apr 23 01:10:03.258765 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.258457 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tsrhr" Apr 23 01:10:03.260664 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.260641 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-l9kb7" Apr 23 01:10:03.260896 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.260734 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92" Apr 23 01:10:03.262344 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.261972 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 01:10:03.262344 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.262104 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-kl525\"" Apr 23 01:10:03.262344 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.262215 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 01:10:03.263070 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.263049 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 01:10:03.263721 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.263697 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 01:10:03.264182 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.263969 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fe8791c-b433-4e16-983b-0550aa4d2b4d-cni-binary-copy\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.264182 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264025 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-var-lib-cni-bin\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 
01:10:03.264182 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-var-lib-cni-multus\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.264182 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264082 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf4044d9-01da-465d-a2bf-80556b56473d-cni-binary-copy\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd" Apr 23 01:10:03.264182 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264133 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-multus-cni-dir\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.264182 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264178 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-cnibin\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.264601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264219 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vssrh\"" Apr 23 01:10:03.264601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264208 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhc2w\" (UniqueName: \"kubernetes.io/projected/3fe8791c-b433-4e16-983b-0550aa4d2b4d-kube-api-access-xhc2w\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.264601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264224 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 01:10:03.264601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264269 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs\") pod \"network-metrics-daemon-755gj\" (UID: \"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554\") " pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:03.264601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264307 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf4044d9-01da-465d-a2bf-80556b56473d-cnibin\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd" Apr 23 01:10:03.264601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-run-k8s-cni-cncf-io\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.264601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264359 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/bf4044d9-01da-465d-a2bf-80556b56473d-os-release\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd" Apr 23 01:10:03.264601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264385 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf4044d9-01da-465d-a2bf-80556b56473d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd" Apr 23 01:10:03.264601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-multus-socket-dir-parent\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.264601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264428 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-j4s9c\"" Apr 23 01:10:03.264601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf4044d9-01da-465d-a2bf-80556b56473d-system-cni-dir\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd" Apr 23 01:10:03.264601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-os-release\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.264601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-run-multus-certs\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.264601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264525 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 01:10:03.265280 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264525 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-etc-kubernetes\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.265280 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264681 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 01:10:03.265280 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bf4044d9-01da-465d-a2bf-80556b56473d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd" Apr 23 01:10:03.265280 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264742 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-system-cni-dir\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.265280 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264767 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-hostroot\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.265280 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264794 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3fe8791c-b433-4e16-983b-0550aa4d2b4d-multus-daemon-config\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.265280 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264819 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8h8p\" (UniqueName: \"kubernetes.io/projected/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-kube-api-access-p8h8p\") pod \"network-metrics-daemon-755gj\" (UID: \"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554\") " pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:03.265280 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264844 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9jqp\" (UniqueName: \"kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp\") pod \"network-check-target-2qwsm\" (UID: \"482d8bfd-9099-4d00-ae13-b2428f266503\") " 
pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:03.265280 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264886 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bf4044d9-01da-465d-a2bf-80556b56473d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd" Apr 23 01:10:03.265280 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264927 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zjd6\" (UniqueName: \"kubernetes.io/projected/bf4044d9-01da-465d-a2bf-80556b56473d-kube-api-access-2zjd6\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd" Apr 23 01:10:03.265280 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.264967 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-var-lib-kubelet\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.265280 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.265033 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-run-netns\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.265280 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.265073 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-multus-conf-dir\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.265280 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.263973 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 01:10:03.265280 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.265265 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.267246 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.267083 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 01:10:03.267771 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.267547 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2v2bq\"" Apr 23 01:10:03.267892 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.267875 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 01:10:03.269335 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.269137 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mggnc" Apr 23 01:10:03.270073 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.269275 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7nslb" Apr 23 01:10:03.271479 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.271443 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 01:10:03.271629 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.271606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bmzbz\"" Apr 23 01:10:03.272380 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.271776 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.272380 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.271842 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 01:10:03.272380 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.271879 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 01:10:03.272380 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.272089 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 01:10:03.272380 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.272122 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 01:10:03.272671 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.272475 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wfw99\"" Apr 23 01:10:03.274161 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.274136 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 
01:10:03.275371 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.275353 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 01:10:03.275627 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.275611 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 01:10:03.275717 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.275662 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 01:10:03.275777 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.275735 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 01:10:03.275830 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.275667 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 01:10:03.276080 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.275904 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qtv6m\"" Apr 23 01:10:03.287723 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.287643 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 01:05:02 +0000 UTC" deadline="2028-01-31 22:38:51.895519878 +0000 UTC" Apr 23 01:10:03.287723 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.287674 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15573h28m48.607849953s" Apr 23 01:10:03.295957 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.295867 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 01:10:03.353916 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.353879 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 01:10:03.365230 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-var-lib-cni-multus\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.365230 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fa213e84-39a9-48e0-8583-16baa3f074dc-iptables-alerter-script\") pod \"iptables-alerter-l9kb7\" (UID: \"fa213e84-39a9-48e0-8583-16baa3f074dc\") " pod="openshift-network-operator/iptables-alerter-l9kb7" Apr 23 01:10:03.365466 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365259 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-socket-dir\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92" Apr 23 01:10:03.365466 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365282 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c07cb850-56d2-465d-81a5-a218f5a3548f-tmp\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.365466 ip-10-0-137-21 kubenswrapper[2576]: I0423 
01:10:03.365301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-var-lib-cni-multus\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.365466 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-cnibin\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.365466 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhc2w\" (UniqueName: \"kubernetes.io/projected/3fe8791c-b433-4e16-983b-0550aa4d2b4d-kube-api-access-xhc2w\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.365466 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365395 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptgck\" (UniqueName: \"kubernetes.io/projected/fa213e84-39a9-48e0-8583-16baa3f074dc-kube-api-access-ptgck\") pod \"iptables-alerter-l9kb7\" (UID: \"fa213e84-39a9-48e0-8583-16baa3f074dc\") " pod="openshift-network-operator/iptables-alerter-l9kb7"
Apr 23 01:10:03.365466 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365421 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-modprobe-d\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.365466 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-cnibin\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.365466 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365445 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0608d8b-418d-4240-b84e-dc09071f45b7-host\") pod \"node-ca-7nslb\" (UID: \"b0608d8b-418d-4240-b84e-dc09071f45b7\") " pod="openshift-image-registry/node-ca-7nslb"
Apr 23 01:10:03.365466 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365471 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.365974 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-run-k8s-cni-cncf-io\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.365974 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365605 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49lhm\" (UniqueName: \"kubernetes.io/projected/13504342-083f-4f36-abd4-a1b6558edb3f-kube-api-access-49lhm\") pod \"node-resolver-mggnc\" (UID: \"13504342-083f-4f36-abd4-a1b6558edb3f\") " pod="openshift-dns/node-resolver-mggnc"
Apr 23 01:10:03.365974 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-run-k8s-cni-cncf-io\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.365974 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365681 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-var-lib-openvswitch\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.365974 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-run-openvswitch\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.365974 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf4044d9-01da-465d-a2bf-80556b56473d-os-release\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.365974 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365761 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf4044d9-01da-465d-a2bf-80556b56473d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.365974 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365781 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa213e84-39a9-48e0-8583-16baa3f074dc-host-slash\") pod \"iptables-alerter-l9kb7\" (UID: \"fa213e84-39a9-48e0-8583-16baa3f074dc\") " pod="openshift-network-operator/iptables-alerter-l9kb7"
Apr 23 01:10:03.365974 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365797 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-sys\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.365974 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf4044d9-01da-465d-a2bf-80556b56473d-os-release\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.365974 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365813 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-run-systemd\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.365974 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-etc-openvswitch\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.365974 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf4044d9-01da-465d-a2bf-80556b56473d-system-cni-dir\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.365974 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-os-release\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.365974 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-etc-kubernetes\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.365998 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf4044d9-01da-465d-a2bf-80556b56473d-system-cni-dir\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366010 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf4044d9-01da-465d-a2bf-80556b56473d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366021 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e8cf828b-7774-4149-88cf-e198ac5cf943-agent-certs\") pod \"konnectivity-agent-tsrhr\" (UID: \"e8cf828b-7774-4149-88cf-e198ac5cf943\") " pod="kube-system/konnectivity-agent-tsrhr"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-os-release\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-var-lib-kubelet\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-etc-kubernetes\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366103 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-host\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-cni-bin\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bf4044d9-01da-465d-a2bf-80556b56473d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-system-cni-dir\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366203 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e8cf828b-7774-4149-88cf-e198ac5cf943-konnectivity-ca\") pod \"konnectivity-agent-tsrhr\" (UID: \"e8cf828b-7774-4149-88cf-e198ac5cf943\") " pod="kube-system/konnectivity-agent-tsrhr"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366219 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-registration-dir\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366252 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-tuned\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366272 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-system-cni-dir\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-ovnkube-script-lib\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.366685 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366304 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bf4044d9-01da-465d-a2bf-80556b56473d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366331 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zjd6\" (UniqueName: \"kubernetes.io/projected/bf4044d9-01da-465d-a2bf-80556b56473d-kube-api-access-2zjd6\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-var-lib-kubelet\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366451 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hsv6\" (UniqueName: \"kubernetes.io/projected/9b3a130d-355a-4f19-ae09-87c95553254c-kube-api-access-4hsv6\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366467 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-var-lib-kubelet\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366479 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-sysctl-d\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-run\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-systemd-units\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-node-log\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-multus-conf-dir\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366634 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-etc-selinux\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-multus-conf-dir\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-sys-fs\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366745 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bf4044d9-01da-465d-a2bf-80556b56473d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366751 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-kubernetes\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366777 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhdj4\" (UniqueName: \"kubernetes.io/projected/b0608d8b-418d-4240-b84e-dc09071f45b7-kube-api-access-dhdj4\") pod \"node-ca-7nslb\" (UID: \"b0608d8b-418d-4240-b84e-dc09071f45b7\") " pod="openshift-image-registry/node-ca-7nslb"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366802 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-env-overrides\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.367489 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366833 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fe8791c-b433-4e16-983b-0550aa4d2b4d-cni-binary-copy\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366858 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-var-lib-cni-bin\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366874 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bf4044d9-01da-465d-a2bf-80556b56473d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366884 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-device-dir\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-run-netns\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366942 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-var-lib-cni-bin\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.366951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5bdp\" (UniqueName: \"kubernetes.io/projected/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-kube-api-access-j5bdp\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf4044d9-01da-465d-a2bf-80556b56473d-cni-binary-copy\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-multus-cni-dir\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-multus-cni-dir\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs\") pod \"network-metrics-daemon-755gj\" (UID: \"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554\") " pod="openshift-multus/network-metrics-daemon-755gj"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-sysctl-conf\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367160 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swj4r\" (UniqueName: \"kubernetes.io/projected/c07cb850-56d2-465d-81a5-a218f5a3548f-kube-api-access-swj4r\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b0608d8b-418d-4240-b84e-dc09071f45b7-serviceca\") pod \"node-ca-7nslb\" (UID: \"b0608d8b-418d-4240-b84e-dc09071f45b7\") " pod="openshift-image-registry/node-ca-7nslb"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367209 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-slash\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-cni-netd\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:03.367241 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 01:10:03.368307 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf4044d9-01da-465d-a2bf-80556b56473d-cnibin\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/13504342-083f-4f36-abd4-a1b6558edb3f-tmp-dir\") pod \"node-resolver-mggnc\" (UID: \"13504342-083f-4f36-abd4-a1b6558edb3f\") " pod="openshift-dns/node-resolver-mggnc"
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:03.367315 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs podName:4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:03.867294854 +0000 UTC m=+3.042090485 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs") pod "network-metrics-daemon-755gj" (UID: "4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-log-socket\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf4044d9-01da-465d-a2bf-80556b56473d-cnibin\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367373 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-multus-socket-dir-parent\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fe8791c-b433-4e16-983b-0550aa4d2b4d-cni-binary-copy\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-lib-modules\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-multus-socket-dir-parent\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/13504342-083f-4f36-abd4-a1b6558edb3f-hosts-file\") pod \"node-resolver-mggnc\" (UID: \"13504342-083f-4f36-abd4-a1b6558edb3f\") " pod="openshift-dns/node-resolver-mggnc"
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367470 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf4044d9-01da-465d-a2bf-80556b56473d-cni-binary-copy\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-run-multus-certs\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367541 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-kubelet\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367581 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-run-multus-certs\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367612 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367631 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-ovnkube-config\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.369155 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-hostroot\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.369939 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367670 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3fe8791c-b433-4e16-983b-0550aa4d2b4d-multus-daemon-config\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.369939 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8h8p\" (UniqueName: \"kubernetes.io/projected/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-kube-api-access-p8h8p\") pod \"network-metrics-daemon-755gj\" (UID: \"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554\") " pod="openshift-multus/network-metrics-daemon-755gj"
Apr 23 01:10:03.369939 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-run-ovn\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.369939 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-hostroot\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.369939 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367779 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9jqp\" (UniqueName: \"kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp\") pod \"network-check-target-2qwsm\" (UID: \"482d8bfd-9099-4d00-ae13-b2428f266503\")
" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:03.369939 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367819 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-systemd\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.369939 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-run-netns\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.369939 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367886 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-sysconfig\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.369939 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367903 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-ovn-node-metrics-cert\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.369939 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.367925 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3fe8791c-b433-4e16-983b-0550aa4d2b4d-host-run-netns\") pod 
\"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.369939 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.368291 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3fe8791c-b433-4e16-983b-0550aa4d2b4d-multus-daemon-config\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.371533 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.371506 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 01:10:03.375070 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.375029 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhc2w\" (UniqueName: \"kubernetes.io/projected/3fe8791c-b433-4e16-983b-0550aa4d2b4d-kube-api-access-xhc2w\") pod \"multus-f5qrx\" (UID: \"3fe8791c-b433-4e16-983b-0550aa4d2b4d\") " pod="openshift-multus/multus-f5qrx" Apr 23 01:10:03.375178 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.375074 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zjd6\" (UniqueName: \"kubernetes.io/projected/bf4044d9-01da-465d-a2bf-80556b56473d-kube-api-access-2zjd6\") pod \"multus-additional-cni-plugins-n9nhd\" (UID: \"bf4044d9-01da-465d-a2bf-80556b56473d\") " pod="openshift-multus/multus-additional-cni-plugins-n9nhd" Apr 23 01:10:03.376727 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:03.376704 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 01:10:03.376839 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:03.376734 2576 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:10:03.376839 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:03.376747 2576 projected.go:194] Error preparing data for projected volume kube-api-access-t9jqp for pod openshift-network-diagnostics/network-check-target-2qwsm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:03.376839 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:03.376820 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp podName:482d8bfd-9099-4d00-ae13-b2428f266503 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:03.876799888 +0000 UTC m=+3.051595514 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-t9jqp" (UniqueName: "kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp") pod "network-check-target-2qwsm" (UID: "482d8bfd-9099-4d00-ae13-b2428f266503") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:03.379134 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.379106 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8h8p\" (UniqueName: \"kubernetes.io/projected/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-kube-api-access-p8h8p\") pod \"network-metrics-daemon-755gj\" (UID: \"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554\") " pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:03.468782 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468676 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-lib-modules\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.468782 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/13504342-083f-4f36-abd4-a1b6558edb3f-hosts-file\") pod \"node-resolver-mggnc\" (UID: \"13504342-083f-4f36-abd4-a1b6558edb3f\") " pod="openshift-dns/node-resolver-mggnc" Apr 23 01:10:03.468782 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-kubelet\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.468782 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-ovnkube-config\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468815 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-run-ovn\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468822 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/13504342-083f-4f36-abd4-a1b6558edb3f-hosts-file\") pod \"node-resolver-mggnc\" (UID: \"13504342-083f-4f36-abd4-a1b6558edb3f\") " pod="openshift-dns/node-resolver-mggnc" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-systemd\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468862 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-lib-modules\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-kubelet\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-sysconfig\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-run-ovn\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-ovn-node-metrics-cert\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468914 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fa213e84-39a9-48e0-8583-16baa3f074dc-iptables-alerter-script\") pod \"iptables-alerter-l9kb7\" (UID: \"fa213e84-39a9-48e0-8583-16baa3f074dc\") " pod="openshift-network-operator/iptables-alerter-l9kb7" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468938 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-systemd\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.468940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-sysconfig\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469013 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-socket-dir\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c07cb850-56d2-465d-81a5-a218f5a3548f-tmp\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptgck\" (UniqueName: \"kubernetes.io/projected/fa213e84-39a9-48e0-8583-16baa3f074dc-kube-api-access-ptgck\") pod \"iptables-alerter-l9kb7\" (UID: \"fa213e84-39a9-48e0-8583-16baa3f074dc\") " pod="openshift-network-operator/iptables-alerter-l9kb7" Apr 23 01:10:03.469093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469071 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-modprobe-d\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0608d8b-418d-4240-b84e-dc09071f45b7-host\") pod \"node-ca-7nslb\" (UID: \"b0608d8b-418d-4240-b84e-dc09071f45b7\") " pod="openshift-image-registry/node-ca-7nslb" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469106 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469123 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49lhm\" (UniqueName: \"kubernetes.io/projected/13504342-083f-4f36-abd4-a1b6558edb3f-kube-api-access-49lhm\") pod \"node-resolver-mggnc\" (UID: \"13504342-083f-4f36-abd4-a1b6558edb3f\") " pod="openshift-dns/node-resolver-mggnc" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0608d8b-418d-4240-b84e-dc09071f45b7-host\") pod \"node-ca-7nslb\" (UID: \"b0608d8b-418d-4240-b84e-dc09071f45b7\") " pod="openshift-image-registry/node-ca-7nslb" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469163 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-var-lib-openvswitch\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-modprobe-d\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-var-lib-openvswitch\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-run-openvswitch\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469303 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa213e84-39a9-48e0-8583-16baa3f074dc-host-slash\") pod \"iptables-alerter-l9kb7\" (UID: \"fa213e84-39a9-48e0-8583-16baa3f074dc\") " pod="openshift-network-operator/iptables-alerter-l9kb7" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469328 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-sys\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469352 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-run-systemd\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469357 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-socket-dir\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-etc-openvswitch\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469405 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa213e84-39a9-48e0-8583-16baa3f074dc-host-slash\") pod \"iptables-alerter-l9kb7\" (UID: \"fa213e84-39a9-48e0-8583-16baa3f074dc\") " pod="openshift-network-operator/iptables-alerter-l9kb7" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e8cf828b-7774-4149-88cf-e198ac5cf943-agent-certs\") pod \"konnectivity-agent-tsrhr\" (UID: \"e8cf828b-7774-4149-88cf-e198ac5cf943\") " pod="kube-system/konnectivity-agent-tsrhr" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-run-openvswitch\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.469845 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469432 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-run-systemd\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469440 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-etc-openvswitch\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469453 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-var-lib-kubelet\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469467 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-sys\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469548 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-host\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-host\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469551 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-var-lib-kubelet\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469571 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-cni-bin\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e8cf828b-7774-4149-88cf-e198ac5cf943-konnectivity-ca\") pod \"konnectivity-agent-tsrhr\" (UID: \"e8cf828b-7774-4149-88cf-e198ac5cf943\") " pod="kube-system/konnectivity-agent-tsrhr" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469618 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-cni-bin\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-registration-dir\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469685 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-tuned\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-registration-dir\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-ovnkube-script-lib\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:03.470718 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469746 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hsv6\" (UniqueName: \"kubernetes.io/projected/9b3a130d-355a-4f19-ae09-87c95553254c-kube-api-access-4hsv6\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92" Apr 23 01:10:03.470718 ip-10-0-137-21 
kubenswrapper[2576]: I0423 01:10:03.469770 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-sysctl-d\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-run\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469821 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-systemd-units\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469845 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-node-log\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469873 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-etc-selinux\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-sys-fs\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-kubernetes\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.469966 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhdj4\" (UniqueName: \"kubernetes.io/projected/b0608d8b-418d-4240-b84e-dc09071f45b7-kube-api-access-dhdj4\") pod \"node-ca-7nslb\" (UID: \"b0608d8b-418d-4240-b84e-dc09071f45b7\") " pod="openshift-image-registry/node-ca-7nslb"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-env-overrides\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470030 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-device-dir\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470029 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-ovnkube-config\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470068 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-run-netns\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470106 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-device-dir\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5bdp\" (UniqueName: \"kubernetes.io/projected/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-kube-api-access-j5bdp\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-etc-selinux\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-sysctl-conf\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470198 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9b3a130d-355a-4f19-ae09-87c95553254c-sys-fs\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92"
Apr 23 01:10:03.471505 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470217 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e8cf828b-7774-4149-88cf-e198ac5cf943-konnectivity-ca\") pod \"konnectivity-agent-tsrhr\" (UID: \"e8cf828b-7774-4149-88cf-e198ac5cf943\") " pod="kube-system/konnectivity-agent-tsrhr"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470249 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-run\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-sysctl-d\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-env-overrides\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470479 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-kubernetes\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470492 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-node-log\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470495 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-run-netns\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470532 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-systemd-units\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470633 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fa213e84-39a9-48e0-8583-16baa3f074dc-iptables-alerter-script\") pod \"iptables-alerter-l9kb7\" (UID: \"fa213e84-39a9-48e0-8583-16baa3f074dc\") " pod="openshift-network-operator/iptables-alerter-l9kb7"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-sysctl-conf\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470764 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swj4r\" (UniqueName: \"kubernetes.io/projected/c07cb850-56d2-465d-81a5-a218f5a3548f-kube-api-access-swj4r\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-ovnkube-script-lib\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b0608d8b-418d-4240-b84e-dc09071f45b7-serviceca\") pod \"node-ca-7nslb\" (UID: \"b0608d8b-418d-4240-b84e-dc09071f45b7\") " pod="openshift-image-registry/node-ca-7nslb"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-slash\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-cni-netd\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/13504342-083f-4f36-abd4-a1b6558edb3f-tmp-dir\") pod \"node-resolver-mggnc\" (UID: \"13504342-083f-4f36-abd4-a1b6558edb3f\") " pod="openshift-dns/node-resolver-mggnc"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-log-socket\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470934 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-cni-netd\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.472185 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.470943 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-log-socket\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.472881 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.471005 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-host-slash\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.472881 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.471204 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b0608d8b-418d-4240-b84e-dc09071f45b7-serviceca\") pod \"node-ca-7nslb\" (UID: \"b0608d8b-418d-4240-b84e-dc09071f45b7\") " pod="openshift-image-registry/node-ca-7nslb"
Apr 23 01:10:03.472881 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.471235 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/13504342-083f-4f36-abd4-a1b6558edb3f-tmp-dir\") pod \"node-resolver-mggnc\" (UID: \"13504342-083f-4f36-abd4-a1b6558edb3f\") " pod="openshift-dns/node-resolver-mggnc"
Apr 23 01:10:03.472881 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.471724 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c07cb850-56d2-465d-81a5-a218f5a3548f-tmp\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.472881 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.472133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e8cf828b-7774-4149-88cf-e198ac5cf943-agent-certs\") pod \"konnectivity-agent-tsrhr\" (UID: \"e8cf828b-7774-4149-88cf-e198ac5cf943\") " pod="kube-system/konnectivity-agent-tsrhr"
Apr 23 01:10:03.473140 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.472929 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c07cb850-56d2-465d-81a5-a218f5a3548f-etc-tuned\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.473140 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.473064 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-ovn-node-metrics-cert\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.477822 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.477794 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptgck\" (UniqueName: \"kubernetes.io/projected/fa213e84-39a9-48e0-8583-16baa3f074dc-kube-api-access-ptgck\") pod \"iptables-alerter-l9kb7\" (UID: \"fa213e84-39a9-48e0-8583-16baa3f074dc\") " pod="openshift-network-operator/iptables-alerter-l9kb7"
Apr 23 01:10:03.478690 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.478617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49lhm\" (UniqueName: \"kubernetes.io/projected/13504342-083f-4f36-abd4-a1b6558edb3f-kube-api-access-49lhm\") pod \"node-resolver-mggnc\" (UID: \"13504342-083f-4f36-abd4-a1b6558edb3f\") " pod="openshift-dns/node-resolver-mggnc"
Apr 23 01:10:03.480079 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.479125 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5bdp\" (UniqueName: \"kubernetes.io/projected/5c95cbe5-ed9d-499f-b53a-66de0e3475e6-kube-api-access-j5bdp\") pod \"ovnkube-node-8rfgs\" (UID: \"5c95cbe5-ed9d-499f-b53a-66de0e3475e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.481915 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.481890 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhdj4\" (UniqueName: \"kubernetes.io/projected/b0608d8b-418d-4240-b84e-dc09071f45b7-kube-api-access-dhdj4\") pod \"node-ca-7nslb\" (UID: \"b0608d8b-418d-4240-b84e-dc09071f45b7\") " pod="openshift-image-registry/node-ca-7nslb"
Apr 23 01:10:03.482516 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.482493 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swj4r\" (UniqueName: \"kubernetes.io/projected/c07cb850-56d2-465d-81a5-a218f5a3548f-kube-api-access-swj4r\") pod \"tuned-6r6mw\" (UID: \"c07cb850-56d2-465d-81a5-a218f5a3548f\") " pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.482628 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.482608 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hsv6\" (UniqueName: \"kubernetes.io/projected/9b3a130d-355a-4f19-ae09-87c95553254c-kube-api-access-4hsv6\") pod \"aws-ebs-csi-driver-node-9wm92\" (UID: \"9b3a130d-355a-4f19-ae09-87c95553254c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92"
Apr 23 01:10:03.562691 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.562652 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n9nhd"
Apr 23 01:10:03.571777 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.571743 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f5qrx"
Apr 23 01:10:03.582608 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.582572 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tsrhr"
Apr 23 01:10:03.589369 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.589336 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-l9kb7"
Apr 23 01:10:03.597144 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.597114 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92"
Apr 23 01:10:03.604836 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.604804 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6r6mw"
Apr 23 01:10:03.610593 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.610570 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mggnc"
Apr 23 01:10:03.617288 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.617263 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7nslb"
Apr 23 01:10:03.623057 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.623028 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:03.874092 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.873998 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs\") pod \"network-metrics-daemon-755gj\" (UID: \"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554\") " pod="openshift-multus/network-metrics-daemon-755gj"
Apr 23 01:10:03.874251 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:03.874129 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 01:10:03.874251 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:03.874200 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs podName:4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:04.874179571 +0000 UTC m=+4.048975185 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs") pod "network-metrics-daemon-755gj" (UID: "4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 01:10:03.974394 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:03.974362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9jqp\" (UniqueName: \"kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp\") pod \"network-check-target-2qwsm\" (UID: \"482d8bfd-9099-4d00-ae13-b2428f266503\") " pod="openshift-network-diagnostics/network-check-target-2qwsm"
Apr 23 01:10:03.974555 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:03.974504 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 01:10:03.974555 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:03.974525 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 01:10:03.974555 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:03.974534 2576 projected.go:194] Error preparing data for projected volume kube-api-access-t9jqp for pod openshift-network-diagnostics/network-check-target-2qwsm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 01:10:03.974685 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:03.974584 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp podName:482d8bfd-9099-4d00-ae13-b2428f266503 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:04.974570572 +0000 UTC m=+4.149366182 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-t9jqp" (UniqueName: "kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp") pod "network-check-target-2qwsm" (UID: "482d8bfd-9099-4d00-ae13-b2428f266503") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 01:10:04.016590 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:04.016562 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c95cbe5_ed9d_499f_b53a_66de0e3475e6.slice/crio-1bc93cc35b2e34f23f0970fadf88cd721c1ba2b643c5b34d6738d57f9cdfa7f8 WatchSource:0}: Error finding container 1bc93cc35b2e34f23f0970fadf88cd721c1ba2b643c5b34d6738d57f9cdfa7f8: Status 404 returned error can't find the container with id 1bc93cc35b2e34f23f0970fadf88cd721c1ba2b643c5b34d6738d57f9cdfa7f8
Apr 23 01:10:04.018258 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:04.018236 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa213e84_39a9_48e0_8583_16baa3f074dc.slice/crio-96a828964a7bbaf670e34788f1fee3914407470acd80c7e9f37faa4285b9cd5c WatchSource:0}: Error finding container 96a828964a7bbaf670e34788f1fee3914407470acd80c7e9f37faa4285b9cd5c: Status 404 returned error can't find the container with id 96a828964a7bbaf670e34788f1fee3914407470acd80c7e9f37faa4285b9cd5c
Apr 23 01:10:04.039487 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:04.039277 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fe8791c_b433_4e16_983b_0550aa4d2b4d.slice/crio-627a3ef2d1f63521588f79b6480ee41542b2c4a99b63c50c6eec098133c826f9 WatchSource:0}: Error finding container 627a3ef2d1f63521588f79b6480ee41542b2c4a99b63c50c6eec098133c826f9: Status 404 returned error can't find the container with id 627a3ef2d1f63521588f79b6480ee41542b2c4a99b63c50c6eec098133c826f9
Apr 23 01:10:04.039692 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:04.039640 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc07cb850_56d2_465d_81a5_a218f5a3548f.slice/crio-34bd78b4be203f54d20559709bc6e73525365e54c21a4461d802a0b4228609b2 WatchSource:0}: Error finding container 34bd78b4be203f54d20559709bc6e73525365e54c21a4461d802a0b4228609b2: Status 404 returned error can't find the container with id 34bd78b4be203f54d20559709bc6e73525365e54c21a4461d802a0b4228609b2
Apr 23 01:10:04.040759 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:04.040471 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0608d8b_418d_4240_b84e_dc09071f45b7.slice/crio-374307e52bb3503234d9dc843a347f527f97870d784d6ca8c0e4d77423f655b5 WatchSource:0}: Error finding container 374307e52bb3503234d9dc843a347f527f97870d784d6ca8c0e4d77423f655b5: Status 404 returned error can't find the container with id 374307e52bb3503234d9dc843a347f527f97870d784d6ca8c0e4d77423f655b5
Apr 23 01:10:04.045693 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:04.045671 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13504342_083f_4f36_abd4_a1b6558edb3f.slice/crio-311632a3c012b97694e74980e862453985ab1759d793e0d047c8ed63fa656d6b WatchSource:0}: Error finding container 311632a3c012b97694e74980e862453985ab1759d793e0d047c8ed63fa656d6b: Status 404 returned error can't find the container with id 311632a3c012b97694e74980e862453985ab1759d793e0d047c8ed63fa656d6b
Apr 23 01:10:04.046301 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:04.046280 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8cf828b_7774_4149_88cf_e198ac5cf943.slice/crio-aa0d62fe6c83f9afb6a4aaa84e07aac3268883123a42b0bf405ae5a0d301f3cd WatchSource:0}: Error finding container aa0d62fe6c83f9afb6a4aaa84e07aac3268883123a42b0bf405ae5a0d301f3cd: Status 404 returned error can't find the container with id aa0d62fe6c83f9afb6a4aaa84e07aac3268883123a42b0bf405ae5a0d301f3cd
Apr 23 01:10:04.046889 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:04.046785 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b3a130d_355a_4f19_ae09_87c95553254c.slice/crio-9947f1dc6be616093458e5fb8549a8a48c2f01b849542d0f9cee99caa9b3a51a WatchSource:0}: Error finding container 9947f1dc6be616093458e5fb8549a8a48c2f01b849542d0f9cee99caa9b3a51a: Status 404 returned error can't find the container with id 9947f1dc6be616093458e5fb8549a8a48c2f01b849542d0f9cee99caa9b3a51a
Apr 23 01:10:04.047583 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:04.047494 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf4044d9_01da_465d_a2bf_80556b56473d.slice/crio-9d0c5384c8271230da5152fdb6f0de14bfc58de2f4d5b84b27433aa5b39c641d WatchSource:0}: Error finding container 9d0c5384c8271230da5152fdb6f0de14bfc58de2f4d5b84b27433aa5b39c641d: Status 404 returned error can't find the container with id 9d0c5384c8271230da5152fdb6f0de14bfc58de2f4d5b84b27433aa5b39c641d
Apr 23 01:10:04.288928 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:04.288787 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 01:05:02 +0000 UTC" deadline="2028-01-08 23:02:04.360294376 +0000 UTC"
Apr 23 01:10:04.288928 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:04.288829 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15021h52m0.071469478s"
Apr 23 01:10:04.391884 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:04.391324 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-21.ec2.internal" event={"ID":"e27bb1972aa991595ea629c18476d369","Type":"ContainerStarted","Data":"f49e24b8fae51d633bab0f5993cdbd79b288467449dc3ba7127a21835f2f5302"}
Apr 23 01:10:04.393363 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:04.393327 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92" event={"ID":"9b3a130d-355a-4f19-ae09-87c95553254c","Type":"ContainerStarted","Data":"9947f1dc6be616093458e5fb8549a8a48c2f01b849542d0f9cee99caa9b3a51a"}
Apr 23 01:10:04.395458 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:04.395417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tsrhr" event={"ID":"e8cf828b-7774-4149-88cf-e198ac5cf943","Type":"ContainerStarted","Data":"aa0d62fe6c83f9afb6a4aaa84e07aac3268883123a42b0bf405ae5a0d301f3cd"}
Apr 23 01:10:04.397905 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:04.397850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mggnc" event={"ID":"13504342-083f-4f36-abd4-a1b6558edb3f","Type":"ContainerStarted","Data":"311632a3c012b97694e74980e862453985ab1759d793e0d047c8ed63fa656d6b"}
Apr 23 01:10:04.400110 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:04.400080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f5qrx" event={"ID":"3fe8791c-b433-4e16-983b-0550aa4d2b4d","Type":"ContainerStarted","Data":"627a3ef2d1f63521588f79b6480ee41542b2c4a99b63c50c6eec098133c826f9"}
Apr 23 01:10:04.405467 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:04.405418 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n9nhd" event={"ID":"bf4044d9-01da-465d-a2bf-80556b56473d","Type":"ContainerStarted","Data":"9d0c5384c8271230da5152fdb6f0de14bfc58de2f4d5b84b27433aa5b39c641d"}
Apr 23 01:10:04.407881 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:04.407809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7nslb" event={"ID":"b0608d8b-418d-4240-b84e-dc09071f45b7","Type":"ContainerStarted","Data":"374307e52bb3503234d9dc843a347f527f97870d784d6ca8c0e4d77423f655b5"}
Apr 23 01:10:04.410107 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:04.409822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l9kb7" event={"ID":"fa213e84-39a9-48e0-8583-16baa3f074dc","Type":"ContainerStarted","Data":"96a828964a7bbaf670e34788f1fee3914407470acd80c7e9f37faa4285b9cd5c"}
Apr 23 01:10:04.412190 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:04.412140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" event={"ID":"c07cb850-56d2-465d-81a5-a218f5a3548f","Type":"ContainerStarted","Data":"34bd78b4be203f54d20559709bc6e73525365e54c21a4461d802a0b4228609b2"}
Apr 23 01:10:04.414275 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:04.414235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" event={"ID":"5c95cbe5-ed9d-499f-b53a-66de0e3475e6","Type":"ContainerStarted","Data":"1bc93cc35b2e34f23f0970fadf88cd721c1ba2b643c5b34d6738d57f9cdfa7f8"}
Apr 23 01:10:04.882786 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:04.882747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs\") pod \"network-metrics-daemon-755gj\" (UID: \"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554\") " pod="openshift-multus/network-metrics-daemon-755gj"
Apr 23 01:10:04.882961 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:04.882886 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 01:10:04.882961 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:04.882942 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs podName:4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:06.88292416 +0000 UTC m=+6.057719770 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs") pod "network-metrics-daemon-755gj" (UID: "4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 01:10:04.983405 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:04.983369 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9jqp\" (UniqueName: \"kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp\") pod \"network-check-target-2qwsm\" (UID: \"482d8bfd-9099-4d00-ae13-b2428f266503\") " pod="openshift-network-diagnostics/network-check-target-2qwsm"
Apr 23 01:10:04.983568 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:04.983543 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 01:10:04.983568 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:04.983561 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 01:10:04.983675 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:04.983574 2576 projected.go:194] Error preparing data for projected volume kube-api-access-t9jqp for pod openshift-network-diagnostics/network-check-target-2qwsm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 01:10:04.983675 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:04.983637 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp podName:482d8bfd-9099-4d00-ae13-b2428f266503 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:06.983619439 +0000 UTC m=+6.158415049 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-t9jqp" (UniqueName: "kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp") pod "network-check-target-2qwsm" (UID: "482d8bfd-9099-4d00-ae13-b2428f266503") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 01:10:05.382473 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:05.382438 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm"
Apr 23 01:10:05.383006 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:05.382581 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503"
Apr 23 01:10:05.383121 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:05.383100 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj"
Apr 23 01:10:05.383252 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:05.383218 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554"
Apr 23 01:10:05.442360 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:05.441137 2576 generic.go:358] "Generic (PLEG): container finished" podID="359cc43887581153d61348aa08e1c846" containerID="4269e8d4597d43d56f99340009aa86a414980ad4bd9f7d820fbfd05568401a84" exitCode=0
Apr 23 01:10:05.442360 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:05.442137 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal" event={"ID":"359cc43887581153d61348aa08e1c846","Type":"ContainerDied","Data":"4269e8d4597d43d56f99340009aa86a414980ad4bd9f7d820fbfd05568401a84"}
Apr 23 01:10:05.454998 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:05.454661 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-21.ec2.internal" podStartSLOduration=3.454638016 podStartE2EDuration="3.454638016s" podCreationTimestamp="2026-04-23 01:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:10:04.403900942 +0000 UTC m=+3.578696575" watchObservedRunningTime="2026-04-23 01:10:05.454638016 +0000 UTC m=+4.629433650"
Apr 23 01:10:06.460357 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:06.460316 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal" event={"ID":"359cc43887581153d61348aa08e1c846","Type":"ContainerStarted","Data":"e4ec4c6bc1578e1f4085c777efff88fa9c3ae504cade504cd0845b8543399fa9"} Apr 23 01:10:06.474075 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:06.473091 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-21.ec2.internal" podStartSLOduration=4.473073879 podStartE2EDuration="4.473073879s" podCreationTimestamp="2026-04-23 01:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:10:06.472818376 +0000 UTC m=+5.647614010" watchObservedRunningTime="2026-04-23 01:10:06.473073879 +0000 UTC m=+5.647869509" Apr 23 01:10:06.898930 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:06.898755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs\") pod \"network-metrics-daemon-755gj\" (UID: \"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554\") " pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:06.898930 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:06.898918 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:06.899177 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:06.899006 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs podName:4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:10.89897245 +0000 UTC m=+10.073768075 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs") pod "network-metrics-daemon-755gj" (UID: "4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:07.000123 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:07.000085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9jqp\" (UniqueName: \"kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp\") pod \"network-check-target-2qwsm\" (UID: \"482d8bfd-9099-4d00-ae13-b2428f266503\") " pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:07.000313 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:07.000254 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 01:10:07.000313 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:07.000273 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:10:07.000313 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:07.000285 2576 projected.go:194] Error preparing data for projected volume kube-api-access-t9jqp for pod openshift-network-diagnostics/network-check-target-2qwsm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:07.000468 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:07.000341 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp podName:482d8bfd-9099-4d00-ae13-b2428f266503 nodeName:}" failed. 
No retries permitted until 2026-04-23 01:10:11.000322456 +0000 UTC m=+10.175118073 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-t9jqp" (UniqueName: "kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp") pod "network-check-target-2qwsm" (UID: "482d8bfd-9099-4d00-ae13-b2428f266503") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:07.380355 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:07.380266 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:07.380523 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:07.380402 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503" Apr 23 01:10:07.380795 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:07.380771 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:07.380893 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:07.380867 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554" Apr 23 01:10:09.380807 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:09.380765 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:09.381483 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:09.380939 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554" Apr 23 01:10:09.381483 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:09.380765 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:09.381483 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:09.381117 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503" Apr 23 01:10:10.931995 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:10.931919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs\") pod \"network-metrics-daemon-755gj\" (UID: \"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554\") " pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:10.932521 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:10.932113 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:10.932521 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:10.932178 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs podName:4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:18.932159621 +0000 UTC m=+18.106955235 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs") pod "network-metrics-daemon-755gj" (UID: "4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:11.033250 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:11.033200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9jqp\" (UniqueName: \"kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp\") pod \"network-check-target-2qwsm\" (UID: \"482d8bfd-9099-4d00-ae13-b2428f266503\") " pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:11.033438 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:11.033421 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 01:10:11.033514 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:11.033444 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:10:11.033514 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:11.033457 2576 projected.go:194] Error preparing data for projected volume kube-api-access-t9jqp for pod openshift-network-diagnostics/network-check-target-2qwsm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:11.033604 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:11.033528 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp podName:482d8bfd-9099-4d00-ae13-b2428f266503 nodeName:}" failed. 
No retries permitted until 2026-04-23 01:10:19.033509099 +0000 UTC m=+18.208304715 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-t9jqp" (UniqueName: "kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp") pod "network-check-target-2qwsm" (UID: "482d8bfd-9099-4d00-ae13-b2428f266503") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:11.381388 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:11.381326 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:11.381589 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:11.381497 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503" Apr 23 01:10:11.381589 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:11.381551 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:11.381697 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:11.381647 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554" Apr 23 01:10:13.380740 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:13.380657 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:13.381130 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:13.380657 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:13.381130 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:13.380803 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554" Apr 23 01:10:13.381130 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:13.380853 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503" Apr 23 01:10:13.735903 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:13.735817 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7hf77"] Apr 23 01:10:13.738642 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:13.738611 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:13.738772 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:13.738701 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7hf77" podUID="4230f793-aa40-4a76-8cb9-6d4d5a0ea43b" Apr 23 01:10:13.853574 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:13.853528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-kubelet-config\") pod \"global-pull-secret-syncer-7hf77\" (UID: \"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b\") " pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:13.853574 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:13.853574 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret\") pod \"global-pull-secret-syncer-7hf77\" (UID: \"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b\") " pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:13.853800 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:13.853645 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-dbus\") pod \"global-pull-secret-syncer-7hf77\" (UID: \"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b\") " pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:13.954082 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:13.954031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-kubelet-config\") pod \"global-pull-secret-syncer-7hf77\" (UID: \"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b\") " pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:13.954082 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:13.954078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret\") pod \"global-pull-secret-syncer-7hf77\" (UID: \"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b\") " pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:13.954359 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:13.954124 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-dbus\") pod \"global-pull-secret-syncer-7hf77\" (UID: \"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b\") " pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:13.954359 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:13.954165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-kubelet-config\") pod \"global-pull-secret-syncer-7hf77\" (UID: \"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b\") " pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:13.954359 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:13.954284 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:13.954359 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:13.954338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-dbus\") pod \"global-pull-secret-syncer-7hf77\" (UID: 
\"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b\") " pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:13.954542 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:13.954372 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret podName:4230f793-aa40-4a76-8cb9-6d4d5a0ea43b nodeName:}" failed. No retries permitted until 2026-04-23 01:10:14.45435097 +0000 UTC m=+13.629146604 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret") pod "global-pull-secret-syncer-7hf77" (UID: "4230f793-aa40-4a76-8cb9-6d4d5a0ea43b") : object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:14.458062 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:14.458019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret\") pod \"global-pull-secret-syncer-7hf77\" (UID: \"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b\") " pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:14.458509 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:14.458144 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:14.458509 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:14.458220 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret podName:4230f793-aa40-4a76-8cb9-6d4d5a0ea43b nodeName:}" failed. No retries permitted until 2026-04-23 01:10:15.458202028 +0000 UTC m=+14.632997641 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret") pod "global-pull-secret-syncer-7hf77" (UID: "4230f793-aa40-4a76-8cb9-6d4d5a0ea43b") : object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:15.380244 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:15.380212 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:15.380244 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:15.380235 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:15.380484 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:15.380341 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7hf77" podUID="4230f793-aa40-4a76-8cb9-6d4d5a0ea43b" Apr 23 01:10:15.380484 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:15.380372 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:15.380484 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:15.380469 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554" Apr 23 01:10:15.380680 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:15.380571 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503" Apr 23 01:10:15.465028 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:15.464975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret\") pod \"global-pull-secret-syncer-7hf77\" (UID: \"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b\") " pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:15.465498 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:15.465137 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:15.465498 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:15.465216 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret podName:4230f793-aa40-4a76-8cb9-6d4d5a0ea43b nodeName:}" failed. No retries permitted until 2026-04-23 01:10:17.465189838 +0000 UTC m=+16.639985482 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret") pod "global-pull-secret-syncer-7hf77" (UID: "4230f793-aa40-4a76-8cb9-6d4d5a0ea43b") : object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:17.380658 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:17.380626 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:17.381082 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:17.380764 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554" Apr 23 01:10:17.381667 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:17.381182 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:17.381667 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:17.381277 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503" Apr 23 01:10:17.381667 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:17.381536 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:17.381667 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:17.381624 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7hf77" podUID="4230f793-aa40-4a76-8cb9-6d4d5a0ea43b" Apr 23 01:10:17.481669 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:17.481586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret\") pod \"global-pull-secret-syncer-7hf77\" (UID: \"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b\") " pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:17.481815 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:17.481728 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:17.481815 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:17.481801 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret podName:4230f793-aa40-4a76-8cb9-6d4d5a0ea43b nodeName:}" failed. No retries permitted until 2026-04-23 01:10:21.481780732 +0000 UTC m=+20.656576342 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret") pod "global-pull-secret-syncer-7hf77" (UID: "4230f793-aa40-4a76-8cb9-6d4d5a0ea43b") : object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:18.994366 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:18.994313 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs\") pod \"network-metrics-daemon-755gj\" (UID: \"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554\") " pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:18.994863 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:18.994486 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:18.994863 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:18.994570 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs podName:4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:34.994551257 +0000 UTC m=+34.169346867 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs") pod "network-metrics-daemon-755gj" (UID: "4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:19.095268 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:19.095220 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9jqp\" (UniqueName: \"kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp\") pod \"network-check-target-2qwsm\" (UID: \"482d8bfd-9099-4d00-ae13-b2428f266503\") " pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:19.095468 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:19.095399 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 01:10:19.095468 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:19.095424 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:10:19.095468 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:19.095439 2576 projected.go:194] Error preparing data for projected volume kube-api-access-t9jqp for pod openshift-network-diagnostics/network-check-target-2qwsm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:19.095632 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:19.095518 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp podName:482d8bfd-9099-4d00-ae13-b2428f266503 nodeName:}" failed. 
No retries permitted until 2026-04-23 01:10:35.095497821 +0000 UTC m=+34.270293431 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-t9jqp" (UniqueName: "kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp") pod "network-check-target-2qwsm" (UID: "482d8bfd-9099-4d00-ae13-b2428f266503") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:19.380156 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:19.380064 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:19.380328 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:19.380064 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:19.380328 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:19.380210 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554" Apr 23 01:10:19.380328 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:19.380279 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503" Apr 23 01:10:19.380328 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:19.380064 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:19.380543 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:19.380395 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7hf77" podUID="4230f793-aa40-4a76-8cb9-6d4d5a0ea43b" Apr 23 01:10:21.380762 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:21.380731 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:21.381225 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:21.380868 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554" Apr 23 01:10:21.381345 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:21.381324 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:21.381447 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:21.381427 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503" Apr 23 01:10:21.381556 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:21.381539 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:21.381655 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:21.381635 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7hf77" podUID="4230f793-aa40-4a76-8cb9-6d4d5a0ea43b" Apr 23 01:10:21.487968 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:21.487729 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92" event={"ID":"9b3a130d-355a-4f19-ae09-87c95553254c","Type":"ContainerStarted","Data":"cec2fad147a120a2b769fd489821c14cfee3838b6fd609eec0fca3965d93848f"} Apr 23 01:10:21.490563 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:21.490519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f5qrx" event={"ID":"3fe8791c-b433-4e16-983b-0550aa4d2b4d","Type":"ContainerStarted","Data":"9cbbe3c7fbd7809bc04811c3e8fcc89a54afe9a71db4a8cadd74ca156dfc60b0"} Apr 23 01:10:21.491942 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:21.491916 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n9nhd" event={"ID":"bf4044d9-01da-465d-a2bf-80556b56473d","Type":"ContainerStarted","Data":"6e7ce037a4449c4fe9d1c7bab4ceacc2b9b18f8b73d504efd5ad0242bd9b6470"} Apr 23 01:10:21.493312 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:21.493285 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7nslb" event={"ID":"b0608d8b-418d-4240-b84e-dc09071f45b7","Type":"ContainerStarted","Data":"672dd58e9e894ac8dc677d99771d80021edfd79afd49eafcb165f352bd8dad26"} Apr 23 01:10:21.494541 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:21.494515 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" event={"ID":"c07cb850-56d2-465d-81a5-a218f5a3548f","Type":"ContainerStarted","Data":"49d1d843c1e4b1d3ea2c2ce4877b64302bdf00e14fd81d9e054b4651318fe5cd"} Apr 23 01:10:21.510839 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:21.510809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret\") pod \"global-pull-secret-syncer-7hf77\" (UID: \"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b\") " pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:21.511027 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:21.511004 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:21.511132 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:21.511066 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret podName:4230f793-aa40-4a76-8cb9-6d4d5a0ea43b nodeName:}" failed. No retries permitted until 2026-04-23 01:10:29.511052166 +0000 UTC m=+28.685847777 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret") pod "global-pull-secret-syncer-7hf77" (UID: "4230f793-aa40-4a76-8cb9-6d4d5a0ea43b") : object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:21.526649 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:21.526608 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-f5qrx" podStartSLOduration=3.387539502 podStartE2EDuration="20.526593754s" podCreationTimestamp="2026-04-23 01:10:01 +0000 UTC" firstStartedPulling="2026-04-23 01:10:04.043819337 +0000 UTC m=+3.218614947" lastFinishedPulling="2026-04-23 01:10:21.182873589 +0000 UTC m=+20.357669199" observedRunningTime="2026-04-23 01:10:21.526157702 +0000 UTC m=+20.700953345" watchObservedRunningTime="2026-04-23 01:10:21.526593754 +0000 UTC m=+20.701389386" Apr 23 01:10:21.539576 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:21.539534 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7nslb" 
podStartSLOduration=11.617791642 podStartE2EDuration="20.539519931s" podCreationTimestamp="2026-04-23 01:10:01 +0000 UTC" firstStartedPulling="2026-04-23 01:10:04.043763887 +0000 UTC m=+3.218559496" lastFinishedPulling="2026-04-23 01:10:12.965492161 +0000 UTC m=+12.140287785" observedRunningTime="2026-04-23 01:10:21.539291992 +0000 UTC m=+20.714087624" watchObservedRunningTime="2026-04-23 01:10:21.539519931 +0000 UTC m=+20.714315541" Apr 23 01:10:22.497492 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:22.497274 2576 generic.go:358] "Generic (PLEG): container finished" podID="bf4044d9-01da-465d-a2bf-80556b56473d" containerID="6e7ce037a4449c4fe9d1c7bab4ceacc2b9b18f8b73d504efd5ad0242bd9b6470" exitCode=0 Apr 23 01:10:22.498260 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:22.497351 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n9nhd" event={"ID":"bf4044d9-01da-465d-a2bf-80556b56473d","Type":"ContainerDied","Data":"6e7ce037a4449c4fe9d1c7bab4ceacc2b9b18f8b73d504efd5ad0242bd9b6470"} Apr 23 01:10:22.498954 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:22.498929 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l9kb7" event={"ID":"fa213e84-39a9-48e0-8583-16baa3f074dc","Type":"ContainerStarted","Data":"b43c6cc073eb5d9984c8071997442bba0fabb6cc090e03a07d4cd27d9551ebd4"} Apr 23 01:10:22.504373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:22.504345 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log" Apr 23 01:10:22.504721 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:22.504700 2576 generic.go:358] "Generic (PLEG): container finished" podID="5c95cbe5-ed9d-499f-b53a-66de0e3475e6" containerID="bb79cab1383799cadc4b521e7a6347e7cdb0dee0e08803d0b6a9549a99a46c47" exitCode=1 Apr 23 01:10:22.504805 ip-10-0-137-21 kubenswrapper[2576]: I0423 
01:10:22.504750 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" event={"ID":"5c95cbe5-ed9d-499f-b53a-66de0e3475e6","Type":"ContainerStarted","Data":"51ebf41a4c38b8e10592bcf75e1e9c22c8613e6696f6f209a504738dd0fb5be8"} Apr 23 01:10:22.504805 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:22.504780 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" event={"ID":"5c95cbe5-ed9d-499f-b53a-66de0e3475e6","Type":"ContainerStarted","Data":"a26520809a37839969530d8d2ab57eb8efffe5031f6ed102283132cf1438d47d"} Apr 23 01:10:22.504805 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:22.504789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" event={"ID":"5c95cbe5-ed9d-499f-b53a-66de0e3475e6","Type":"ContainerStarted","Data":"f21def322ca39e6913b7930b4138e7ede4879b93d0366847581810fda1a6be98"} Apr 23 01:10:22.504805 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:22.504797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" event={"ID":"5c95cbe5-ed9d-499f-b53a-66de0e3475e6","Type":"ContainerStarted","Data":"8c9aca37a48c19e97072c29048b6dfc9eb31ebdf2f839c96ea630f2aac2c309f"} Apr 23 01:10:22.504805 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:22.504805 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" event={"ID":"5c95cbe5-ed9d-499f-b53a-66de0e3475e6","Type":"ContainerDied","Data":"bb79cab1383799cadc4b521e7a6347e7cdb0dee0e08803d0b6a9549a99a46c47"} Apr 23 01:10:22.505011 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:22.504820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" event={"ID":"5c95cbe5-ed9d-499f-b53a-66de0e3475e6","Type":"ContainerStarted","Data":"1f82bc13f61b16bcc51ba7be3af798ee9df504bfe7478792572fa1baed394eae"} Apr 23 01:10:22.506033 ip-10-0-137-21 
kubenswrapper[2576]: I0423 01:10:22.506007 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tsrhr" event={"ID":"e8cf828b-7774-4149-88cf-e198ac5cf943","Type":"ContainerStarted","Data":"a79ae907bd80f9710bb5e75ebd4b8418fd27324927245b7b030406f3d9aac9ee"} Apr 23 01:10:22.507194 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:22.507170 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mggnc" event={"ID":"13504342-083f-4f36-abd4-a1b6558edb3f","Type":"ContainerStarted","Data":"ed3fee5cd527e9145e6a087bc32a90de442ba7a592dc602c7a20ba7c5228f05e"} Apr 23 01:10:22.544712 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:22.544657 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6r6mw" podStartSLOduration=4.41952442 podStartE2EDuration="21.544638995s" podCreationTimestamp="2026-04-23 01:10:01 +0000 UTC" firstStartedPulling="2026-04-23 01:10:04.043817216 +0000 UTC m=+3.218612840" lastFinishedPulling="2026-04-23 01:10:21.16893179 +0000 UTC m=+20.343727415" observedRunningTime="2026-04-23 01:10:22.529906805 +0000 UTC m=+21.704702436" watchObservedRunningTime="2026-04-23 01:10:22.544638995 +0000 UTC m=+21.719434627" Apr 23 01:10:22.545083 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:22.545051 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-tsrhr" podStartSLOduration=4.451164701 podStartE2EDuration="21.545044456s" podCreationTimestamp="2026-04-23 01:10:01 +0000 UTC" firstStartedPulling="2026-04-23 01:10:04.048151005 +0000 UTC m=+3.222946614" lastFinishedPulling="2026-04-23 01:10:21.142030756 +0000 UTC m=+20.316826369" observedRunningTime="2026-04-23 01:10:22.544542368 +0000 UTC m=+21.719338000" watchObservedRunningTime="2026-04-23 01:10:22.545044456 +0000 UTC m=+21.719840166" Apr 23 01:10:22.568592 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:22.568535 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mggnc" podStartSLOduration=4.448098476 podStartE2EDuration="21.568515084s" podCreationTimestamp="2026-04-23 01:10:01 +0000 UTC" firstStartedPulling="2026-04-23 01:10:04.047079754 +0000 UTC m=+3.221875365" lastFinishedPulling="2026-04-23 01:10:21.167496358 +0000 UTC m=+20.342291973" observedRunningTime="2026-04-23 01:10:22.556355203 +0000 UTC m=+21.731150839" watchObservedRunningTime="2026-04-23 01:10:22.568515084 +0000 UTC m=+21.743310717" Apr 23 01:10:22.654757 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:22.654732 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 01:10:23.331232 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:23.331122 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T01:10:22.654749955Z","UUID":"c5a141ca-4188-4b47-b472-69f75b695e1a","Handler":null,"Name":"","Endpoint":""} Apr 23 01:10:23.334893 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:23.334640 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 01:10:23.334893 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:23.334677 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 01:10:23.380841 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:23.380807 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:23.381050 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:23.380938 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503" Apr 23 01:10:23.381581 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:23.381344 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:23.381581 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:23.381457 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7hf77" podUID="4230f793-aa40-4a76-8cb9-6d4d5a0ea43b" Apr 23 01:10:23.381581 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:23.381531 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:23.381789 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:23.381754 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554" Apr 23 01:10:23.511478 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:23.511442 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92" event={"ID":"9b3a130d-355a-4f19-ae09-87c95553254c","Type":"ContainerStarted","Data":"78ea63f4d23936476a361f74b429f4e7faa4d5d658ef570836afe7e048de5ccb"} Apr 23 01:10:24.517167 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:24.517087 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log" Apr 23 01:10:24.517591 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:24.517437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" event={"ID":"5c95cbe5-ed9d-499f-b53a-66de0e3475e6","Type":"ContainerStarted","Data":"da8cd135b47033ebd2b9346b46a8a3dc6d2919f1febe904f0fe74df37289c9f9"} Apr 23 01:10:24.519446 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:24.519404 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92" event={"ID":"9b3a130d-355a-4f19-ae09-87c95553254c","Type":"ContainerStarted","Data":"aba71127d0e25406c4baa389461c7e31126aff8ae4e683b5547f94760f74bedc"} Apr 23 01:10:24.534690 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:24.534641 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-l9kb7" podStartSLOduration=6.430092513 podStartE2EDuration="23.534625445s" podCreationTimestamp="2026-04-23 01:10:01 +0000 UTC" firstStartedPulling="2026-04-23 01:10:04.037036741 +0000 UTC m=+3.211832354" lastFinishedPulling="2026-04-23 01:10:21.141569662 +0000 UTC m=+20.316365286" observedRunningTime="2026-04-23 01:10:22.568347108 +0000 UTC m=+21.743142762" watchObservedRunningTime="2026-04-23 
01:10:24.534625445 +0000 UTC m=+23.709421124" Apr 23 01:10:24.534856 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:24.534768 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wm92" podStartSLOduration=3.663327185 podStartE2EDuration="23.534758655s" podCreationTimestamp="2026-04-23 01:10:01 +0000 UTC" firstStartedPulling="2026-04-23 01:10:04.048880637 +0000 UTC m=+3.223676246" lastFinishedPulling="2026-04-23 01:10:23.920312093 +0000 UTC m=+23.095107716" observedRunningTime="2026-04-23 01:10:24.534529206 +0000 UTC m=+23.709324839" watchObservedRunningTime="2026-04-23 01:10:24.534758655 +0000 UTC m=+23.709554288" Apr 23 01:10:25.380273 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:25.380009 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:25.380513 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:25.380018 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:25.380513 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:25.380384 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7hf77" podUID="4230f793-aa40-4a76-8cb9-6d4d5a0ea43b" Apr 23 01:10:25.380513 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:25.380083 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:25.380513 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:25.380456 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503" Apr 23 01:10:25.380726 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:25.380526 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554" Apr 23 01:10:26.048351 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:26.048314 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tsrhr" Apr 23 01:10:26.049018 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:26.048995 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tsrhr" Apr 23 01:10:27.380054 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:27.379854 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:27.380829 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:27.379860 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:27.380829 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:27.380140 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503" Apr 23 01:10:27.380829 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:27.379887 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:10:27.380829 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:27.380192 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7hf77" podUID="4230f793-aa40-4a76-8cb9-6d4d5a0ea43b" Apr 23 01:10:27.380829 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:27.380267 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554" Apr 23 01:10:27.525514 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:27.525484 2576 generic.go:358] "Generic (PLEG): container finished" podID="bf4044d9-01da-465d-a2bf-80556b56473d" containerID="6ec6a4b14bf8086fc3e03a2b70d7119f252b48a7c96257a196c0f29b636ccd85" exitCode=0 Apr 23 01:10:27.525711 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:27.525559 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n9nhd" event={"ID":"bf4044d9-01da-465d-a2bf-80556b56473d","Type":"ContainerDied","Data":"6ec6a4b14bf8086fc3e03a2b70d7119f252b48a7c96257a196c0f29b636ccd85"} Apr 23 01:10:27.534719 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:27.534696 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log" Apr 23 01:10:27.535073 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:27.535052 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" event={"ID":"5c95cbe5-ed9d-499f-b53a-66de0e3475e6","Type":"ContainerStarted","Data":"5fcf12616fd519f57ef057c01cdde4213ad344c67fc319a400d1fa2fb4b59ee6"} Apr 23 01:10:27.535460 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:27.535436 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:27.535571 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:27.535554 2576 scope.go:117] "RemoveContainer" containerID="bb79cab1383799cadc4b521e7a6347e7cdb0dee0e08803d0b6a9549a99a46c47" Apr 23 01:10:27.551751 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:27.551728 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:10:28.446169 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:28.446138 2576 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7hf77"] Apr 23 01:10:28.446642 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:28.446290 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:28.446642 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:28.446390 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7hf77" podUID="4230f793-aa40-4a76-8cb9-6d4d5a0ea43b" Apr 23 01:10:28.449841 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:28.449728 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2qwsm"] Apr 23 01:10:28.450076 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:28.450043 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:10:28.450199 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:28.450156 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503"
Apr 23 01:10:28.450434 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:28.450411 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-755gj"]
Apr 23 01:10:28.450537 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:28.450522 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj"
Apr 23 01:10:28.450658 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:28.450624 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554"
Apr 23 01:10:28.540790 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:28.540762 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log"
Apr 23 01:10:28.541116 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:28.541092 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" event={"ID":"5c95cbe5-ed9d-499f-b53a-66de0e3475e6","Type":"ContainerStarted","Data":"a0e00349f74e5a5a3342c45ecb01a56f17e171b35d6e6678d098bc944d20d766"}
Apr 23 01:10:28.541342 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:28.541326 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:28.541454 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:28.541437 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:28.556567 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:28.556544 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs"
Apr 23 01:10:28.565471 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:28.565436 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" podStartSLOduration=10.384476499 podStartE2EDuration="27.565421984s" podCreationTimestamp="2026-04-23 01:10:01 +0000 UTC" firstStartedPulling="2026-04-23 01:10:04.037024666 +0000 UTC m=+3.211820276" lastFinishedPulling="2026-04-23 01:10:21.217970134 +0000 UTC m=+20.392765761" observedRunningTime="2026-04-23 01:10:28.563949437 +0000 UTC m=+27.738745068" watchObservedRunningTime="2026-04-23 01:10:28.565421984 +0000 UTC m=+27.740217616"
Apr 23 01:10:29.170071 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:29.169799 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-tsrhr"
Apr 23 01:10:29.170243 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:29.170188 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 01:10:29.170568 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:29.170551 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tsrhr"
Apr 23 01:10:29.544994 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:29.544958 2576 generic.go:358] "Generic (PLEG): container finished" podID="bf4044d9-01da-465d-a2bf-80556b56473d" containerID="1906549d2eb09641019e760fd458ca82cd59c6ec463c8bcebe756bea3c4e4817" exitCode=0
Apr 23 01:10:29.545442 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:29.545046 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n9nhd" event={"ID":"bf4044d9-01da-465d-a2bf-80556b56473d","Type":"ContainerDied","Data":"1906549d2eb09641019e760fd458ca82cd59c6ec463c8bcebe756bea3c4e4817"}
Apr 23 01:10:29.573713 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:29.573687 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret\") pod \"global-pull-secret-syncer-7hf77\" (UID: \"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b\") " pod="kube-system/global-pull-secret-syncer-7hf77"
Apr 23 01:10:29.573861 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:29.573782 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 01:10:29.573861 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:29.573828 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret podName:4230f793-aa40-4a76-8cb9-6d4d5a0ea43b nodeName:}" failed. No retries permitted until 2026-04-23 01:10:45.573813859 +0000 UTC m=+44.748609478 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret") pod "global-pull-secret-syncer-7hf77" (UID: "4230f793-aa40-4a76-8cb9-6d4d5a0ea43b") : object "kube-system"/"original-pull-secret" not registered
Apr 23 01:10:30.380720 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:30.380688 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj"
Apr 23 01:10:30.380896 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:30.380688 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hf77"
Apr 23 01:10:30.380896 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:30.380787 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554"
Apr 23 01:10:30.380896 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:30.380687 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm"
Apr 23 01:10:30.381100 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:30.380938 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7hf77" podUID="4230f793-aa40-4a76-8cb9-6d4d5a0ea43b"
Apr 23 01:10:30.381100 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:30.381051 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503"
Apr 23 01:10:30.549213 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:30.549181 2576 generic.go:358] "Generic (PLEG): container finished" podID="bf4044d9-01da-465d-a2bf-80556b56473d" containerID="e9973f16307475e1ee18ef8b34eb3fbf9983a6ef094018c59f5b27220ca51790" exitCode=0
Apr 23 01:10:30.549695 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:30.549268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n9nhd" event={"ID":"bf4044d9-01da-465d-a2bf-80556b56473d","Type":"ContainerDied","Data":"e9973f16307475e1ee18ef8b34eb3fbf9983a6ef094018c59f5b27220ca51790"}
Apr 23 01:10:32.380665 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:32.380622 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hf77"
Apr 23 01:10:32.380665 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:32.380622 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm"
Apr 23 01:10:32.381309 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:32.380644 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj"
Apr 23 01:10:32.381309 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:32.380762 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7hf77" podUID="4230f793-aa40-4a76-8cb9-6d4d5a0ea43b"
Apr 23 01:10:32.381309 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:32.380866 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554"
Apr 23 01:10:32.381309 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:32.380937 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503"
Apr 23 01:10:34.379900 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:34.379860 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm"
Apr 23 01:10:34.380540 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:34.379911 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hf77"
Apr 23 01:10:34.380540 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:34.379977 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj"
Apr 23 01:10:34.380540 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:34.380008 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qwsm" podUID="482d8bfd-9099-4d00-ae13-b2428f266503"
Apr 23 01:10:34.380540 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:34.380086 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-755gj" podUID="4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554"
Apr 23 01:10:34.380540 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:34.380212 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7hf77" podUID="4230f793-aa40-4a76-8cb9-6d4d5a0ea43b"
Apr 23 01:10:35.014449 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.014405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs\") pod \"network-metrics-daemon-755gj\" (UID: \"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554\") " pod="openshift-multus/network-metrics-daemon-755gj"
Apr 23 01:10:35.014648 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:35.014584 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 01:10:35.014719 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:35.014666 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs podName:4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:07.014644591 +0000 UTC m=+66.189440203 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs") pod "network-metrics-daemon-755gj" (UID: "4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 01:10:35.115431 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.115382 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9jqp\" (UniqueName: \"kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp\") pod \"network-check-target-2qwsm\" (UID: \"482d8bfd-9099-4d00-ae13-b2428f266503\") " pod="openshift-network-diagnostics/network-check-target-2qwsm"
Apr 23 01:10:35.115604 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:35.115572 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 01:10:35.115604 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:35.115597 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 01:10:35.115701 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:35.115611 2576 projected.go:194] Error preparing data for projected volume kube-api-access-t9jqp for pod openshift-network-diagnostics/network-check-target-2qwsm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 01:10:35.115701 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:35.115694 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp podName:482d8bfd-9099-4d00-ae13-b2428f266503 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:07.115675401 +0000 UTC m=+66.290471014 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-t9jqp" (UniqueName: "kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp") pod "network-check-target-2qwsm" (UID: "482d8bfd-9099-4d00-ae13-b2428f266503") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 01:10:35.155624 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.155586 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-21.ec2.internal" event="NodeReady"
Apr 23 01:10:35.155798 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.155763 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 01:10:35.197886 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.197853 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kdm6b"]
Apr 23 01:10:35.200942 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.200882 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:35.203461 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.203436 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 01:10:35.203603 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.203470 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-h4q4b\""
Apr 23 01:10:35.203603 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.203442 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 01:10:35.210967 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.210769 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kgfvc"]
Apr 23 01:10:35.213464 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.213426 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kdm6b"]
Apr 23 01:10:35.213595 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.213566 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kgfvc"
Apr 23 01:10:35.216098 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.216073 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 01:10:35.216213 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.216154 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jjv8h\""
Apr 23 01:10:35.216291 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.216081 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 01:10:35.216409 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.216398 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 01:10:35.225241 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.225208 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kgfvc"]
Apr 23 01:10:35.317040 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.317003 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert\") pod \"ingress-canary-kgfvc\" (UID: \"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d\") " pod="openshift-ingress-canary/ingress-canary-kgfvc"
Apr 23 01:10:35.317196 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.317087 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:35.317196 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.317131 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-tmp-dir\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:35.317196 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.317156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8gdl\" (UniqueName: \"kubernetes.io/projected/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-kube-api-access-k8gdl\") pod \"ingress-canary-kgfvc\" (UID: \"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d\") " pod="openshift-ingress-canary/ingress-canary-kgfvc"
Apr 23 01:10:35.317377 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.317285 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-config-volume\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:35.317377 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.317323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7qpl\" (UniqueName: \"kubernetes.io/projected/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-kube-api-access-r7qpl\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:35.417688 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.417653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-config-volume\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:35.417688 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.417692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7qpl\" (UniqueName: \"kubernetes.io/projected/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-kube-api-access-r7qpl\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:35.418428 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.417720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert\") pod \"ingress-canary-kgfvc\" (UID: \"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d\") " pod="openshift-ingress-canary/ingress-canary-kgfvc"
Apr 23 01:10:35.418428 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.417783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:35.418428 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.417837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-tmp-dir\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:35.418428 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.417862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8gdl\" (UniqueName: \"kubernetes.io/projected/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-kube-api-access-k8gdl\") pod \"ingress-canary-kgfvc\" (UID: \"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d\") " pod="openshift-ingress-canary/ingress-canary-kgfvc"
Apr 23 01:10:35.418428 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:35.417957 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 01:10:35.418428 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:35.418046 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls podName:ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d nodeName:}" failed. No retries permitted until 2026-04-23 01:10:35.918022749 +0000 UTC m=+35.092818365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls") pod "dns-default-kdm6b" (UID: "ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d") : secret "dns-default-metrics-tls" not found
Apr 23 01:10:35.418428 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:35.418122 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 01:10:35.418428 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:35.418198 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert podName:16cbb3a1-0ddd-4793-8d37-07bfa2a0568d nodeName:}" failed. No retries permitted until 2026-04-23 01:10:35.918174544 +0000 UTC m=+35.092970155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert") pod "ingress-canary-kgfvc" (UID: "16cbb3a1-0ddd-4793-8d37-07bfa2a0568d") : secret "canary-serving-cert" not found
Apr 23 01:10:35.418428 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.418383 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-config-volume\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:35.418428 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.418427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-tmp-dir\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:35.430425 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.430397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7qpl\" (UniqueName: \"kubernetes.io/projected/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-kube-api-access-r7qpl\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:35.430549 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.430481 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8gdl\" (UniqueName: \"kubernetes.io/projected/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-kube-api-access-k8gdl\") pod \"ingress-canary-kgfvc\" (UID: \"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d\") " pod="openshift-ingress-canary/ingress-canary-kgfvc"
Apr 23 01:10:35.921809 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.921764 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert\") pod \"ingress-canary-kgfvc\" (UID: \"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d\") " pod="openshift-ingress-canary/ingress-canary-kgfvc"
Apr 23 01:10:35.922024 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:35.921849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:35.922024 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:35.921936 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 01:10:35.922024 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:35.921967 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 01:10:35.922198 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:35.922035 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert podName:16cbb3a1-0ddd-4793-8d37-07bfa2a0568d nodeName:}" failed. No retries permitted until 2026-04-23 01:10:36.922012612 +0000 UTC m=+36.096808227 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert") pod "ingress-canary-kgfvc" (UID: "16cbb3a1-0ddd-4793-8d37-07bfa2a0568d") : secret "canary-serving-cert" not found
Apr 23 01:10:35.922198 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:35.922058 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls podName:ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d nodeName:}" failed. No retries permitted until 2026-04-23 01:10:36.922047499 +0000 UTC m=+36.096843109 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls") pod "dns-default-kdm6b" (UID: "ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d") : secret "dns-default-metrics-tls" not found
Apr 23 01:10:36.380586 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:36.380555 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hf77"
Apr 23 01:10:36.380724 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:36.380555 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm"
Apr 23 01:10:36.381107 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:36.380561 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj"
Apr 23 01:10:36.383001 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:36.382965 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 01:10:36.383796 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:36.383777 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 01:10:36.383904 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:36.383819 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 01:10:36.383904 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:36.383890 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lj44d\""
Apr 23 01:10:36.384044 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:36.383893 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hjv94\""
Apr 23 01:10:36.384044 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:36.383897 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 01:10:36.564727 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:36.564692 2576 generic.go:358] "Generic (PLEG): container finished" podID="bf4044d9-01da-465d-a2bf-80556b56473d" containerID="9c47221f5a1cc3960edf2711ff8892b0e021db23752624cc2037b3e310a45527" exitCode=0
Apr 23 01:10:36.565213 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:36.564748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n9nhd" event={"ID":"bf4044d9-01da-465d-a2bf-80556b56473d","Type":"ContainerDied","Data":"9c47221f5a1cc3960edf2711ff8892b0e021db23752624cc2037b3e310a45527"}
Apr 23 01:10:36.929221 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:36.929177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert\") pod \"ingress-canary-kgfvc\" (UID: \"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d\") " pod="openshift-ingress-canary/ingress-canary-kgfvc"
Apr 23 01:10:36.929396 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:36.929263 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:36.929396 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:36.929320 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 01:10:36.929396 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:36.929387 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert podName:16cbb3a1-0ddd-4793-8d37-07bfa2a0568d nodeName:}" failed. No retries permitted until 2026-04-23 01:10:38.929371666 +0000 UTC m=+38.104167275 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert") pod "ingress-canary-kgfvc" (UID: "16cbb3a1-0ddd-4793-8d37-07bfa2a0568d") : secret "canary-serving-cert" not found
Apr 23 01:10:36.929550 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:36.929411 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 01:10:36.929550 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:36.929477 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls podName:ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d nodeName:}" failed. No retries permitted until 2026-04-23 01:10:38.929458424 +0000 UTC m=+38.104254055 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls") pod "dns-default-kdm6b" (UID: "ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d") : secret "dns-default-metrics-tls" not found
Apr 23 01:10:37.570201 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:37.570167 2576 generic.go:358] "Generic (PLEG): container finished" podID="bf4044d9-01da-465d-a2bf-80556b56473d" containerID="59fd97fdcc52e5d7c2dc2965646bad81d5e7e0500247855239a8688de2cdd0a5" exitCode=0
Apr 23 01:10:37.570591 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:37.570243 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n9nhd" event={"ID":"bf4044d9-01da-465d-a2bf-80556b56473d","Type":"ContainerDied","Data":"59fd97fdcc52e5d7c2dc2965646bad81d5e7e0500247855239a8688de2cdd0a5"}
Apr 23 01:10:38.575646 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:38.575384 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n9nhd" event={"ID":"bf4044d9-01da-465d-a2bf-80556b56473d","Type":"ContainerStarted","Data":"16705b0ac27e50f5302090a5a0e3af2429d164f33768fb442aaeeaad5a4d4d68"}
Apr 23 01:10:38.595930 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:38.595876 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-n9nhd" podStartSLOduration=5.436749746 podStartE2EDuration="37.595858581s" podCreationTimestamp="2026-04-23 01:10:01 +0000 UTC" firstStartedPulling="2026-04-23 01:10:04.049162311 +0000 UTC m=+3.223957924" lastFinishedPulling="2026-04-23 01:10:36.20827115 +0000 UTC m=+35.383066759" observedRunningTime="2026-04-23 01:10:38.595141845 +0000 UTC m=+37.769937478" watchObservedRunningTime="2026-04-23 01:10:38.595858581 +0000 UTC m=+37.770654212"
Apr 23 01:10:38.945265 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:38.945176 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert\") pod \"ingress-canary-kgfvc\" (UID: \"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d\") " pod="openshift-ingress-canary/ingress-canary-kgfvc"
Apr 23 01:10:38.945265 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:38.945230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:38.945454 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:38.945324 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 01:10:38.945454 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:38.945328 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 01:10:38.945454 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:38.945380 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls podName:ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d nodeName:}" failed. No retries permitted until 2026-04-23 01:10:42.945366485 +0000 UTC m=+42.120162095 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls") pod "dns-default-kdm6b" (UID: "ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d") : secret "dns-default-metrics-tls" not found
Apr 23 01:10:38.945454 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:38.945394 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert podName:16cbb3a1-0ddd-4793-8d37-07bfa2a0568d nodeName:}" failed. No retries permitted until 2026-04-23 01:10:42.945387756 +0000 UTC m=+42.120183367 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert") pod "ingress-canary-kgfvc" (UID: "16cbb3a1-0ddd-4793-8d37-07bfa2a0568d") : secret "canary-serving-cert" not found
Apr 23 01:10:42.976091 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:42.976051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert\") pod \"ingress-canary-kgfvc\" (UID: \"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d\") " pod="openshift-ingress-canary/ingress-canary-kgfvc"
Apr 23 01:10:42.976550 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:42.976109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b"
Apr 23 01:10:42.976550 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:42.976204 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 01:10:42.976550 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:42.976271 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert podName:16cbb3a1-0ddd-4793-8d37-07bfa2a0568d nodeName:}" failed. No retries permitted until 2026-04-23 01:10:50.976254684 +0000 UTC m=+50.151050296 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert") pod "ingress-canary-kgfvc" (UID: "16cbb3a1-0ddd-4793-8d37-07bfa2a0568d") : secret "canary-serving-cert" not found Apr 23 01:10:42.976550 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:42.976214 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:10:42.976550 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:42.976347 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls podName:ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d nodeName:}" failed. No retries permitted until 2026-04-23 01:10:50.97633477 +0000 UTC m=+50.151130380 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls") pod "dns-default-kdm6b" (UID: "ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d") : secret "dns-default-metrics-tls" not found Apr 23 01:10:45.594418 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:45.594381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret\") pod \"global-pull-secret-syncer-7hf77\" (UID: \"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b\") " pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:45.597484 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:45.597464 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4230f793-aa40-4a76-8cb9-6d4d5a0ea43b-original-pull-secret\") pod \"global-pull-secret-syncer-7hf77\" (UID: \"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b\") " pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:45.691738 ip-10-0-137-21 kubenswrapper[2576]: I0423 
01:10:45.691700 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7hf77" Apr 23 01:10:45.834976 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:45.834940 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7hf77"] Apr 23 01:10:45.838344 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:10:45.838312 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4230f793_aa40_4a76_8cb9_6d4d5a0ea43b.slice/crio-139271d4abdad3db230b308e04b3889dfdb0d0dc6fc618f3dc30703d653f7eec WatchSource:0}: Error finding container 139271d4abdad3db230b308e04b3889dfdb0d0dc6fc618f3dc30703d653f7eec: Status 404 returned error can't find the container with id 139271d4abdad3db230b308e04b3889dfdb0d0dc6fc618f3dc30703d653f7eec Apr 23 01:10:46.590679 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:46.590641 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7hf77" event={"ID":"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b","Type":"ContainerStarted","Data":"139271d4abdad3db230b308e04b3889dfdb0d0dc6fc618f3dc30703d653f7eec"} Apr 23 01:10:50.600373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:50.600333 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7hf77" event={"ID":"4230f793-aa40-4a76-8cb9-6d4d5a0ea43b","Type":"ContainerStarted","Data":"08783e6b40879227455da8cd7d1308dd08546f8d48d02003daa90eba87eb3ca4"} Apr 23 01:10:50.613574 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:50.613520 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7hf77" podStartSLOduration=33.509236696 podStartE2EDuration="37.613501962s" podCreationTimestamp="2026-04-23 01:10:13 +0000 UTC" firstStartedPulling="2026-04-23 01:10:45.840083148 +0000 UTC m=+45.014878759" lastFinishedPulling="2026-04-23 
01:10:49.944348413 +0000 UTC m=+49.119144025" observedRunningTime="2026-04-23 01:10:50.613262005 +0000 UTC m=+49.788057637" watchObservedRunningTime="2026-04-23 01:10:50.613501962 +0000 UTC m=+49.788297596" Apr 23 01:10:51.036614 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:51.036581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert\") pod \"ingress-canary-kgfvc\" (UID: \"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d\") " pod="openshift-ingress-canary/ingress-canary-kgfvc" Apr 23 01:10:51.036777 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:10:51.036635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b" Apr 23 01:10:51.036777 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:51.036731 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:10:51.036777 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:51.036733 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:10:51.036878 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:51.036781 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls podName:ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d nodeName:}" failed. No retries permitted until 2026-04-23 01:11:07.036767448 +0000 UTC m=+66.211563057 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls") pod "dns-default-kdm6b" (UID: "ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d") : secret "dns-default-metrics-tls" not found Apr 23 01:10:51.036878 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:10:51.036795 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert podName:16cbb3a1-0ddd-4793-8d37-07bfa2a0568d nodeName:}" failed. No retries permitted until 2026-04-23 01:11:07.036789331 +0000 UTC m=+66.211584940 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert") pod "ingress-canary-kgfvc" (UID: "16cbb3a1-0ddd-4793-8d37-07bfa2a0568d") : secret "canary-serving-cert" not found Apr 23 01:11:00.559846 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:00.559818 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8rfgs" Apr 23 01:11:04.294531 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.294498 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj"] Apr 23 01:11:04.311508 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.311456 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj"] Apr 23 01:11:04.311687 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.311579 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.315587 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.315563 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 23 01:11:04.316007 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.315972 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 01:11:04.316127 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.316015 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 01:11:04.316127 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.316038 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 23 01:11:04.316382 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.316361 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 01:11:04.316382 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.316373 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 23 01:11:04.316506 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.316391 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 23 01:11:04.325240 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.325220 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7k88g\" (UniqueName: \"kubernetes.io/projected/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-kube-api-access-7k88g\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.325348 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.325253 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-hub\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.325348 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.325284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.325348 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.325330 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.325477 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.325409 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: 
\"kubernetes.io/secret/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-ca\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.325477 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.325425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.426396 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.426362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.426396 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.426401 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.426651 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.426511 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-ca\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: 
\"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.426651 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.426542 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.426651 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.426561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k88g\" (UniqueName: \"kubernetes.io/projected/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-kube-api-access-7k88g\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.426651 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.426585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-hub\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.427361 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.427333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.429086 ip-10-0-137-21 
kubenswrapper[2576]: I0423 01:11:04.429035 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.429197 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.429104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.429197 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.429111 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-hub\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.429197 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.429120 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-ca\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.438501 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.438479 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k88g\" (UniqueName: 
\"kubernetes.io/projected/6f507ae4-dd6e-4916-a07e-ec5ea46ca270-kube-api-access-7k88g\") pod \"cluster-proxy-proxy-agent-56ccc7b55f-st6dj\" (UID: \"6f507ae4-dd6e-4916-a07e-ec5ea46ca270\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.631713 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.631627 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:11:04.740301 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:04.740274 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj"] Apr 23 01:11:04.746761 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:11:04.746733 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f507ae4_dd6e_4916_a07e_ec5ea46ca270.slice/crio-9986a430a6c689d5d48d0a6c4dc59997caed9c03fb896840c895198857f22352 WatchSource:0}: Error finding container 9986a430a6c689d5d48d0a6c4dc59997caed9c03fb896840c895198857f22352: Status 404 returned error can't find the container with id 9986a430a6c689d5d48d0a6c4dc59997caed9c03fb896840c895198857f22352 Apr 23 01:11:05.628570 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:05.628534 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" event={"ID":"6f507ae4-dd6e-4916-a07e-ec5ea46ca270","Type":"ContainerStarted","Data":"9986a430a6c689d5d48d0a6c4dc59997caed9c03fb896840c895198857f22352"} Apr 23 01:11:07.048016 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:07.047952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs\") pod \"network-metrics-daemon-755gj\" (UID: 
\"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554\") " pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:11:07.048574 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:07.048039 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert\") pod \"ingress-canary-kgfvc\" (UID: \"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d\") " pod="openshift-ingress-canary/ingress-canary-kgfvc" Apr 23 01:11:07.048574 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:07.048085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b" Apr 23 01:11:07.048574 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:11:07.048176 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:11:07.048574 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:11:07.048210 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:11:07.048574 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:11:07.048270 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls podName:ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d nodeName:}" failed. No retries permitted until 2026-04-23 01:11:39.048252951 +0000 UTC m=+98.223048560 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls") pod "dns-default-kdm6b" (UID: "ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d") : secret "dns-default-metrics-tls" not found Apr 23 01:11:07.048574 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:11:07.048289 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert podName:16cbb3a1-0ddd-4793-8d37-07bfa2a0568d nodeName:}" failed. No retries permitted until 2026-04-23 01:11:39.048279845 +0000 UTC m=+98.223075456 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert") pod "ingress-canary-kgfvc" (UID: "16cbb3a1-0ddd-4793-8d37-07bfa2a0568d") : secret "canary-serving-cert" not found Apr 23 01:11:07.050301 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:07.050275 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 01:11:07.059182 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:11:07.059150 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 01:11:07.059350 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:11:07.059249 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs podName:4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554 nodeName:}" failed. No retries permitted until 2026-04-23 01:12:11.059225272 +0000 UTC m=+130.234020885 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs") pod "network-metrics-daemon-755gj" (UID: "4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554") : secret "metrics-daemon-secret" not found Apr 23 01:11:07.149093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:07.149062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9jqp\" (UniqueName: \"kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp\") pod \"network-check-target-2qwsm\" (UID: \"482d8bfd-9099-4d00-ae13-b2428f266503\") " pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:11:07.151599 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:07.151567 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 01:11:07.161522 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:07.161496 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 01:11:07.172946 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:07.172917 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9jqp\" (UniqueName: \"kubernetes.io/projected/482d8bfd-9099-4d00-ae13-b2428f266503-kube-api-access-t9jqp\") pod \"network-check-target-2qwsm\" (UID: \"482d8bfd-9099-4d00-ae13-b2428f266503\") " pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:11:07.304514 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:07.304436 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hjv94\"" Apr 23 01:11:07.313245 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:07.313212 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:11:07.828020 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:07.827963 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2qwsm"] Apr 23 01:11:07.832962 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:11:07.832926 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod482d8bfd_9099_4d00_ae13_b2428f266503.slice/crio-676c34b69af18be3ac30e97d79bdcb3cd3fa7eed421e254b1a0f123a3e9f4f13 WatchSource:0}: Error finding container 676c34b69af18be3ac30e97d79bdcb3cd3fa7eed421e254b1a0f123a3e9f4f13: Status 404 returned error can't find the container with id 676c34b69af18be3ac30e97d79bdcb3cd3fa7eed421e254b1a0f123a3e9f4f13 Apr 23 01:11:08.636973 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:08.636928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" event={"ID":"6f507ae4-dd6e-4916-a07e-ec5ea46ca270","Type":"ContainerStarted","Data":"ea986235c84a8a76a8ab4f42a6cd5cac71efc6a807ff4d6729b315637c0239bb"} Apr 23 01:11:08.638109 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:08.638070 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2qwsm" event={"ID":"482d8bfd-9099-4d00-ae13-b2428f266503","Type":"ContainerStarted","Data":"676c34b69af18be3ac30e97d79bdcb3cd3fa7eed421e254b1a0f123a3e9f4f13"} Apr 23 01:11:12.648025 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:12.647964 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" event={"ID":"6f507ae4-dd6e-4916-a07e-ec5ea46ca270","Type":"ContainerStarted","Data":"5e6a0d5665295384fa5f088c6fb3e674b7cf5f8d829cfadacd6c922b851d940d"} Apr 23 01:11:12.648025 ip-10-0-137-21 
kubenswrapper[2576]: I0423 01:11:12.648028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" event={"ID":"6f507ae4-dd6e-4916-a07e-ec5ea46ca270","Type":"ContainerStarted","Data":"86c524b33bd4d41b6b46e74892a40938c74a812d987205bb4e5e918b112b21ba"} Apr 23 01:11:12.649322 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:12.649300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2qwsm" event={"ID":"482d8bfd-9099-4d00-ae13-b2428f266503","Type":"ContainerStarted","Data":"70bee50fe254ebd6dc99d8bf50337d188c0adefea9ff928834343b72c5b9c71f"} Apr 23 01:11:12.649444 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:12.649432 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:11:12.664949 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:12.664897 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" podStartSLOduration=1.8074928080000001 podStartE2EDuration="8.664884226s" podCreationTimestamp="2026-04-23 01:11:04 +0000 UTC" firstStartedPulling="2026-04-23 01:11:04.748506996 +0000 UTC m=+63.923302605" lastFinishedPulling="2026-04-23 01:11:11.605898398 +0000 UTC m=+70.780694023" observedRunningTime="2026-04-23 01:11:12.663946422 +0000 UTC m=+71.838742055" watchObservedRunningTime="2026-04-23 01:11:12.664884226 +0000 UTC m=+71.839679913" Apr 23 01:11:12.676660 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:12.676614 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2qwsm" podStartSLOduration=67.89683142 podStartE2EDuration="1m11.676599468s" podCreationTimestamp="2026-04-23 01:10:01 +0000 UTC" firstStartedPulling="2026-04-23 01:11:07.835798677 +0000 UTC 
m=+67.010594287" lastFinishedPulling="2026-04-23 01:11:11.615566722 +0000 UTC m=+70.790362335" observedRunningTime="2026-04-23 01:11:12.676068894 +0000 UTC m=+71.850864526" watchObservedRunningTime="2026-04-23 01:11:12.676599468 +0000 UTC m=+71.851395079" Apr 23 01:11:39.072469 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:39.072432 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert\") pod \"ingress-canary-kgfvc\" (UID: \"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d\") " pod="openshift-ingress-canary/ingress-canary-kgfvc" Apr 23 01:11:39.072469 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:39.072473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b" Apr 23 01:11:39.072975 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:11:39.072576 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:11:39.072975 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:11:39.072592 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:11:39.072975 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:11:39.072629 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls podName:ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d nodeName:}" failed. No retries permitted until 2026-04-23 01:12:43.072615855 +0000 UTC m=+162.247411470 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls") pod "dns-default-kdm6b" (UID: "ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d") : secret "dns-default-metrics-tls" not found Apr 23 01:11:39.072975 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:11:39.072671 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert podName:16cbb3a1-0ddd-4793-8d37-07bfa2a0568d nodeName:}" failed. No retries permitted until 2026-04-23 01:12:43.072650949 +0000 UTC m=+162.247446561 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert") pod "ingress-canary-kgfvc" (UID: "16cbb3a1-0ddd-4793-8d37-07bfa2a0568d") : secret "canary-serving-cert" not found Apr 23 01:11:43.654303 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:43.654272 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2qwsm" Apr 23 01:11:47.981080 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:47.981051 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mggnc_13504342-083f-4f36-abd4-a1b6558edb3f/dns-node-resolver/0.log" Apr 23 01:11:48.581369 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:48.581338 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7nslb_b0608d8b-418d-4240-b84e-dc09071f45b7/node-ca/0.log" Apr 23 01:11:56.776051 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:56.776012 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-7hplg"] Apr 23 01:11:56.780491 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:56.780467 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7hplg" Apr 23 01:11:56.782759 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:56.782735 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 23 01:11:56.782759 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:56.782748 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 23 01:11:56.783539 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:56.783520 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-gxlwr\"" Apr 23 01:11:56.785839 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:56.785814 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-7hplg"] Apr 23 01:11:56.896124 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:56.896095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5f6v\" (UniqueName: \"kubernetes.io/projected/99c1f9db-3073-4152-9221-8661b8a6f579-kube-api-access-l5f6v\") pod \"migrator-74bb7799d9-7hplg\" (UID: \"99c1f9db-3073-4152-9221-8661b8a6f579\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7hplg" Apr 23 01:11:56.997416 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:56.997380 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5f6v\" (UniqueName: \"kubernetes.io/projected/99c1f9db-3073-4152-9221-8661b8a6f579-kube-api-access-l5f6v\") pod \"migrator-74bb7799d9-7hplg\" (UID: \"99c1f9db-3073-4152-9221-8661b8a6f579\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7hplg" Apr 23 01:11:57.005150 ip-10-0-137-21 kubenswrapper[2576]: 
I0423 01:11:57.005121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5f6v\" (UniqueName: \"kubernetes.io/projected/99c1f9db-3073-4152-9221-8661b8a6f579-kube-api-access-l5f6v\") pod \"migrator-74bb7799d9-7hplg\" (UID: \"99c1f9db-3073-4152-9221-8661b8a6f579\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7hplg" Apr 23 01:11:57.089869 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:57.089787 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7hplg" Apr 23 01:11:57.203287 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:57.203256 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-7hplg"] Apr 23 01:11:57.206952 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:11:57.206924 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99c1f9db_3073_4152_9221_8661b8a6f579.slice/crio-8f228af64c17b93ffb860a154416ff6846a309076461df5b070d1785b5542aaa WatchSource:0}: Error finding container 8f228af64c17b93ffb860a154416ff6846a309076461df5b070d1785b5542aaa: Status 404 returned error can't find the container with id 8f228af64c17b93ffb860a154416ff6846a309076461df5b070d1785b5542aaa Apr 23 01:11:57.736620 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:57.736585 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7hplg" event={"ID":"99c1f9db-3073-4152-9221-8661b8a6f579","Type":"ContainerStarted","Data":"8f228af64c17b93ffb860a154416ff6846a309076461df5b070d1785b5542aaa"} Apr 23 01:11:58.742215 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:58.742169 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7hplg" 
event={"ID":"99c1f9db-3073-4152-9221-8661b8a6f579","Type":"ContainerStarted","Data":"8a613ccf9ddc0d12d22a74eb47ffc690416bd2ae7f7ba62c6d3423fadb94001c"} Apr 23 01:11:58.742594 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:58.742221 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7hplg" event={"ID":"99c1f9db-3073-4152-9221-8661b8a6f579","Type":"ContainerStarted","Data":"e9480e62ffce6eb86a75b2ef67a32afbc310f152403fcdd6a71814ba8d784235"} Apr 23 01:11:58.756679 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:11:58.756632 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7hplg" podStartSLOduration=1.58745099 podStartE2EDuration="2.756617298s" podCreationTimestamp="2026-04-23 01:11:56 +0000 UTC" firstStartedPulling="2026-04-23 01:11:57.208727176 +0000 UTC m=+116.383522786" lastFinishedPulling="2026-04-23 01:11:58.377893477 +0000 UTC m=+117.552689094" observedRunningTime="2026-04-23 01:11:58.755460547 +0000 UTC m=+117.930256178" watchObservedRunningTime="2026-04-23 01:11:58.756617298 +0000 UTC m=+117.931412929" Apr 23 01:12:11.096741 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:11.096683 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs\") pod \"network-metrics-daemon-755gj\" (UID: \"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554\") " pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:12:11.099008 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:11.098973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554-metrics-certs\") pod \"network-metrics-daemon-755gj\" (UID: \"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554\") " pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 
01:12:11.208865 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:11.208829 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lj44d\"" Apr 23 01:12:11.217122 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:11.217093 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-755gj" Apr 23 01:12:11.331550 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:11.331519 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-755gj"] Apr 23 01:12:11.335271 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:12:11.335236 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fa93c6a_0bcb_4e1d_abf9_ae8f8aa58554.slice/crio-8936e98e2ae8e10dca6d9e1a896ede790d5ef41b5c333e1e9a598c8ae09dacf0 WatchSource:0}: Error finding container 8936e98e2ae8e10dca6d9e1a896ede790d5ef41b5c333e1e9a598c8ae09dacf0: Status 404 returned error can't find the container with id 8936e98e2ae8e10dca6d9e1a896ede790d5ef41b5c333e1e9a598c8ae09dacf0 Apr 23 01:12:11.775826 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:11.775789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-755gj" event={"ID":"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554","Type":"ContainerStarted","Data":"8936e98e2ae8e10dca6d9e1a896ede790d5ef41b5c333e1e9a598c8ae09dacf0"} Apr 23 01:12:12.779485 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:12.779449 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-755gj" event={"ID":"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554","Type":"ContainerStarted","Data":"4b168a910568ff9399743c4c5defdda19c3fc50759a033a9b9d3645e04a2aff1"} Apr 23 01:12:12.779485 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:12.779491 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-755gj" event={"ID":"4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554","Type":"ContainerStarted","Data":"fe3767be8e279c478f675888373a7c6379fae9e5a9b70579aa7d04b64eade6f8"} Apr 23 01:12:12.799072 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:12.798975 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-755gj" podStartSLOduration=130.693440021 podStartE2EDuration="2m11.798958317s" podCreationTimestamp="2026-04-23 01:10:01 +0000 UTC" firstStartedPulling="2026-04-23 01:12:11.337107638 +0000 UTC m=+130.511903249" lastFinishedPulling="2026-04-23 01:12:12.44262592 +0000 UTC m=+131.617421545" observedRunningTime="2026-04-23 01:12:12.798010981 +0000 UTC m=+131.972806613" watchObservedRunningTime="2026-04-23 01:12:12.798958317 +0000 UTC m=+131.973753991" Apr 23 01:12:20.393377 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.393348 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-kz8x2"] Apr 23 01:12:20.396549 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.396524 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.399094 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.399067 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8pwvt\"" Apr 23 01:12:20.399254 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.399072 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 01:12:20.399915 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.399893 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 01:12:20.400047 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.399899 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 01:12:20.400047 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.399947 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 01:12:20.407618 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.407592 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kz8x2"] Apr 23 01:12:20.464967 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.464931 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mprmf\" (UniqueName: \"kubernetes.io/projected/e96552f1-4f6d-44b8-9597-3fe4faed4577-kube-api-access-mprmf\") pod \"insights-runtime-extractor-kz8x2\" (UID: \"e96552f1-4f6d-44b8-9597-3fe4faed4577\") " pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.465202 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.465038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e96552f1-4f6d-44b8-9597-3fe4faed4577-crio-socket\") pod \"insights-runtime-extractor-kz8x2\" (UID: \"e96552f1-4f6d-44b8-9597-3fe4faed4577\") " pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.465202 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.465085 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e96552f1-4f6d-44b8-9597-3fe4faed4577-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kz8x2\" (UID: \"e96552f1-4f6d-44b8-9597-3fe4faed4577\") " pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.465202 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.465115 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e96552f1-4f6d-44b8-9597-3fe4faed4577-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kz8x2\" (UID: \"e96552f1-4f6d-44b8-9597-3fe4faed4577\") " pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.465202 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.465137 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e96552f1-4f6d-44b8-9597-3fe4faed4577-data-volume\") pod \"insights-runtime-extractor-kz8x2\" (UID: \"e96552f1-4f6d-44b8-9597-3fe4faed4577\") " pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.489538 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.489502 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-cdd5d6498-5qhzb"] Apr 23 01:12:20.492421 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.492406 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb" Apr 23 01:12:20.494628 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.494607 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 01:12:20.494859 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.494841 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 01:12:20.494928 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.494892 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 01:12:20.495124 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.495111 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ndfmc\"" Apr 23 01:12:20.500794 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.500773 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 01:12:20.505110 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.505076 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-cdd5d6498-5qhzb"] Apr 23 01:12:20.539708 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.539675 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-cdd5d6498-5qhzb"] Apr 23 01:12:20.540188 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:12:20.539891 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[bound-sa-token ca-trust-extracted image-registry-private-configuration installation-pull-secrets kube-api-access-fk9bz registry-certificates registry-tls trusted-ca], unattached volumes=[], failed to process volumes=[bound-sa-token ca-trust-extracted 
image-registry-private-configuration installation-pull-secrets kube-api-access-fk9bz registry-certificates registry-tls trusted-ca]: context canceled" pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb" podUID="5d4a8759-3f47-48af-89d3-b901ffdf09f9" Apr 23 01:12:20.565404 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.565365 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-registry-tls\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb" Apr 23 01:12:20.565404 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.565404 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d4a8759-3f47-48af-89d3-b901ffdf09f9-installation-pull-secrets\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb" Apr 23 01:12:20.565614 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.565435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d4a8759-3f47-48af-89d3-b901ffdf09f9-trusted-ca\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb" Apr 23 01:12:20.565614 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.565458 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d4a8759-3f47-48af-89d3-b901ffdf09f9-registry-certificates\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " 
pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb" Apr 23 01:12:20.565614 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.565534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e96552f1-4f6d-44b8-9597-3fe4faed4577-crio-socket\") pod \"insights-runtime-extractor-kz8x2\" (UID: \"e96552f1-4f6d-44b8-9597-3fe4faed4577\") " pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.565777 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.565615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e96552f1-4f6d-44b8-9597-3fe4faed4577-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kz8x2\" (UID: \"e96552f1-4f6d-44b8-9597-3fe4faed4577\") " pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.565777 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.565667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e96552f1-4f6d-44b8-9597-3fe4faed4577-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kz8x2\" (UID: \"e96552f1-4f6d-44b8-9597-3fe4faed4577\") " pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.565777 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.565695 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d4a8759-3f47-48af-89d3-b901ffdf09f9-ca-trust-extracted\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb" Apr 23 01:12:20.565777 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.565722 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fk9bz\" (UniqueName: \"kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-kube-api-access-fk9bz\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb" Apr 23 01:12:20.565777 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.565615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e96552f1-4f6d-44b8-9597-3fe4faed4577-crio-socket\") pod \"insights-runtime-extractor-kz8x2\" (UID: \"e96552f1-4f6d-44b8-9597-3fe4faed4577\") " pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.565777 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.565758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e96552f1-4f6d-44b8-9597-3fe4faed4577-data-volume\") pod \"insights-runtime-extractor-kz8x2\" (UID: \"e96552f1-4f6d-44b8-9597-3fe4faed4577\") " pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.566100 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.565807 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5d4a8759-3f47-48af-89d3-b901ffdf09f9-image-registry-private-configuration\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb" Apr 23 01:12:20.566100 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.565837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mprmf\" (UniqueName: \"kubernetes.io/projected/e96552f1-4f6d-44b8-9597-3fe4faed4577-kube-api-access-mprmf\") pod \"insights-runtime-extractor-kz8x2\" (UID: \"e96552f1-4f6d-44b8-9597-3fe4faed4577\") " 
pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.566100 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.565868 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-bound-sa-token\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb" Apr 23 01:12:20.566210 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.566097 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e96552f1-4f6d-44b8-9597-3fe4faed4577-data-volume\") pod \"insights-runtime-extractor-kz8x2\" (UID: \"e96552f1-4f6d-44b8-9597-3fe4faed4577\") " pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.566654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.566637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e96552f1-4f6d-44b8-9597-3fe4faed4577-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kz8x2\" (UID: \"e96552f1-4f6d-44b8-9597-3fe4faed4577\") " pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.567852 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.567834 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e96552f1-4f6d-44b8-9597-3fe4faed4577-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kz8x2\" (UID: \"e96552f1-4f6d-44b8-9597-3fe4faed4577\") " pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.573782 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.573761 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mprmf\" (UniqueName: 
\"kubernetes.io/projected/e96552f1-4f6d-44b8-9597-3fe4faed4577-kube-api-access-mprmf\") pod \"insights-runtime-extractor-kz8x2\" (UID: \"e96552f1-4f6d-44b8-9597-3fe4faed4577\") " pod="openshift-insights/insights-runtime-extractor-kz8x2" Apr 23 01:12:20.666305 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.666210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-bound-sa-token\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb" Apr 23 01:12:20.666305 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.666251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-registry-tls\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb" Apr 23 01:12:20.666305 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.666277 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d4a8759-3f47-48af-89d3-b901ffdf09f9-installation-pull-secrets\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb" Apr 23 01:12:20.666582 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.666410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d4a8759-3f47-48af-89d3-b901ffdf09f9-trusted-ca\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb" Apr 23 01:12:20.666582 ip-10-0-137-21 
kubenswrapper[2576]: I0423 01:12:20.666456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d4a8759-3f47-48af-89d3-b901ffdf09f9-registry-certificates\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb"
Apr 23 01:12:20.666582 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.666551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d4a8759-3f47-48af-89d3-b901ffdf09f9-ca-trust-extracted\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb"
Apr 23 01:12:20.667084 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.666691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fk9bz\" (UniqueName: \"kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-kube-api-access-fk9bz\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb"
Apr 23 01:12:20.667084 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.666763 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5d4a8759-3f47-48af-89d3-b901ffdf09f9-image-registry-private-configuration\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb"
Apr 23 01:12:20.667084 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.666936 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d4a8759-3f47-48af-89d3-b901ffdf09f9-ca-trust-extracted\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb"
Apr 23 01:12:20.667420 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.667395 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d4a8759-3f47-48af-89d3-b901ffdf09f9-registry-certificates\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb"
Apr 23 01:12:20.667507 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.667397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d4a8759-3f47-48af-89d3-b901ffdf09f9-trusted-ca\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb"
Apr 23 01:12:20.668817 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.668797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-registry-tls\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb"
Apr 23 01:12:20.668902 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.668891 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d4a8759-3f47-48af-89d3-b901ffdf09f9-installation-pull-secrets\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb"
Apr 23 01:12:20.669117 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.669099 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5d4a8759-3f47-48af-89d3-b901ffdf09f9-image-registry-private-configuration\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb"
Apr 23 01:12:20.673524 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.673490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-bound-sa-token\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb"
Apr 23 01:12:20.673762 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.673739 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk9bz\" (UniqueName: \"kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-kube-api-access-fk9bz\") pod \"image-registry-cdd5d6498-5qhzb\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") " pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb"
Apr 23 01:12:20.706889 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.706852 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kz8x2"
Apr 23 01:12:20.801308 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.801270 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb"
Apr 23 01:12:20.806198 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.806176 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb"
Apr 23 01:12:20.823450 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.823423 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kz8x2"]
Apr 23 01:12:20.826465 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:12:20.826435 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode96552f1_4f6d_44b8_9597_3fe4faed4577.slice/crio-7667259b208abd2a6eba671df00d8e8dc8026b98f28ef9a57537549a42bd0c35 WatchSource:0}: Error finding container 7667259b208abd2a6eba671df00d8e8dc8026b98f28ef9a57537549a42bd0c35: Status 404 returned error can't find the container with id 7667259b208abd2a6eba671df00d8e8dc8026b98f28ef9a57537549a42bd0c35
Apr 23 01:12:20.867834 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.867809 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d4a8759-3f47-48af-89d3-b901ffdf09f9-ca-trust-extracted\") pod \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") "
Apr 23 01:12:20.867966 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.867850 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d4a8759-3f47-48af-89d3-b901ffdf09f9-registry-certificates\") pod \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") "
Apr 23 01:12:20.867966 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.867887 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d4a8759-3f47-48af-89d3-b901ffdf09f9-trusted-ca\") pod \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") "
Apr 23 01:12:20.867966 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.867918 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5d4a8759-3f47-48af-89d3-b901ffdf09f9-image-registry-private-configuration\") pod \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") "
Apr 23 01:12:20.867966 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.867941 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-bound-sa-token\") pod \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") "
Apr 23 01:12:20.868188 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.868004 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-registry-tls\") pod \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") "
Apr 23 01:12:20.868188 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.868030 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d4a8759-3f47-48af-89d3-b901ffdf09f9-installation-pull-secrets\") pod \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") "
Apr 23 01:12:20.868188 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.868073 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk9bz\" (UniqueName: \"kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-kube-api-access-fk9bz\") pod \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\" (UID: \"5d4a8759-3f47-48af-89d3-b901ffdf09f9\") "
Apr 23 01:12:20.868188 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.868133 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d4a8759-3f47-48af-89d3-b901ffdf09f9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5d4a8759-3f47-48af-89d3-b901ffdf09f9" (UID: "5d4a8759-3f47-48af-89d3-b901ffdf09f9"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 01:12:20.868381 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.868318 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d4a8759-3f47-48af-89d3-b901ffdf09f9-ca-trust-extracted\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:12:20.868662 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.868458 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d4a8759-3f47-48af-89d3-b901ffdf09f9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5d4a8759-3f47-48af-89d3-b901ffdf09f9" (UID: "5d4a8759-3f47-48af-89d3-b901ffdf09f9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 01:12:20.868662 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.868495 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d4a8759-3f47-48af-89d3-b901ffdf09f9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5d4a8759-3f47-48af-89d3-b901ffdf09f9" (UID: "5d4a8759-3f47-48af-89d3-b901ffdf09f9"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 01:12:20.870503 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.870475 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-kube-api-access-fk9bz" (OuterVolumeSpecName: "kube-api-access-fk9bz") pod "5d4a8759-3f47-48af-89d3-b901ffdf09f9" (UID: "5d4a8759-3f47-48af-89d3-b901ffdf09f9"). InnerVolumeSpecName "kube-api-access-fk9bz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:12:20.870759 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.870725 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4a8759-3f47-48af-89d3-b901ffdf09f9-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5d4a8759-3f47-48af-89d3-b901ffdf09f9" (UID: "5d4a8759-3f47-48af-89d3-b901ffdf09f9"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 01:12:20.870847 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.870767 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5d4a8759-3f47-48af-89d3-b901ffdf09f9" (UID: "5d4a8759-3f47-48af-89d3-b901ffdf09f9"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:12:20.870847 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.870775 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5d4a8759-3f47-48af-89d3-b901ffdf09f9" (UID: "5d4a8759-3f47-48af-89d3-b901ffdf09f9"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:12:20.870951 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.870929 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4a8759-3f47-48af-89d3-b901ffdf09f9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5d4a8759-3f47-48af-89d3-b901ffdf09f9" (UID: "5d4a8759-3f47-48af-89d3-b901ffdf09f9"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 01:12:20.968746 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.968705 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-registry-tls\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:12:20.968746 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.968738 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d4a8759-3f47-48af-89d3-b901ffdf09f9-installation-pull-secrets\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:12:20.968746 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.968753 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fk9bz\" (UniqueName: \"kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-kube-api-access-fk9bz\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:12:20.969038 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.968766 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d4a8759-3f47-48af-89d3-b901ffdf09f9-registry-certificates\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:12:20.969038 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.968779 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d4a8759-3f47-48af-89d3-b901ffdf09f9-trusted-ca\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:12:20.969038 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.968792 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5d4a8759-3f47-48af-89d3-b901ffdf09f9-image-registry-private-configuration\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:12:20.969038 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:20.968806 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d4a8759-3f47-48af-89d3-b901ffdf09f9-bound-sa-token\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:12:21.804834 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:21.804805 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cdd5d6498-5qhzb"
Apr 23 01:12:21.804834 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:21.804817 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kz8x2" event={"ID":"e96552f1-4f6d-44b8-9597-3fe4faed4577","Type":"ContainerStarted","Data":"4e1f2de9c553c3019c0b592b5e722ac4fe96e1218250f7beb72b39a1a9983035"}
Apr 23 01:12:21.805326 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:21.804857 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kz8x2" event={"ID":"e96552f1-4f6d-44b8-9597-3fe4faed4577","Type":"ContainerStarted","Data":"64b784957f33f795471a8f8c8ab4198e14d394661fd7bcf35836bd9deb28263a"}
Apr 23 01:12:21.805326 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:21.804870 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kz8x2" event={"ID":"e96552f1-4f6d-44b8-9597-3fe4faed4577","Type":"ContainerStarted","Data":"7667259b208abd2a6eba671df00d8e8dc8026b98f28ef9a57537549a42bd0c35"}
Apr 23 01:12:21.833417 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:21.833381 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-cdd5d6498-5qhzb"]
Apr 23 01:12:21.836443 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:21.836418 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-cdd5d6498-5qhzb"]
Apr 23 01:12:23.384126 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:23.384082 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d4a8759-3f47-48af-89d3-b901ffdf09f9" path="/var/lib/kubelet/pods/5d4a8759-3f47-48af-89d3-b901ffdf09f9/volumes"
Apr 23 01:12:23.813308 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:23.813275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kz8x2" event={"ID":"e96552f1-4f6d-44b8-9597-3fe4faed4577","Type":"ContainerStarted","Data":"a9d2036f7696ed258838b22e5fa635f4e9bda43083f6fee8a21d1e545b215305"}
Apr 23 01:12:23.831354 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:23.831302 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-kz8x2" podStartSLOduration=1.210032019 podStartE2EDuration="3.831286633s" podCreationTimestamp="2026-04-23 01:12:20 +0000 UTC" firstStartedPulling="2026-04-23 01:12:20.886181592 +0000 UTC m=+140.060977203" lastFinishedPulling="2026-04-23 01:12:23.507436204 +0000 UTC m=+142.682231817" observedRunningTime="2026-04-23 01:12:23.830513615 +0000 UTC m=+143.005309247" watchObservedRunningTime="2026-04-23 01:12:23.831286633 +0000 UTC m=+143.006082264"
Apr 23 01:12:28.357238 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.357205 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d"]
Apr 23 01:12:28.360539 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.360518 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d"
Apr 23 01:12:28.363831 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.363795 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 01:12:28.363831 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.363809 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 23 01:12:28.363831 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.363818 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 01:12:28.364088 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.363837 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-67mrb\""
Apr 23 01:12:28.364088 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.363862 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 01:12:28.364088 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.363940 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 23 01:12:28.375555 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.375529 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d"]
Apr 23 01:12:28.376712 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.376690 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-55jcs"]
Apr 23 01:12:28.380215 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.380197 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs"
Apr 23 01:12:28.382257 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.382225 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 23 01:12:28.382257 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.382236 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 23 01:12:28.382672 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.382650 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 23 01:12:28.382767 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.382712 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-rwfq4\""
Apr 23 01:12:28.393529 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.392139 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lxvg2"]
Apr 23 01:12:28.397205 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.397182 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-55jcs"]
Apr 23 01:12:28.397335 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.397290 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.399740 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.399717 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 01:12:28.399856 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.399773 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-85prc\""
Apr 23 01:12:28.400052 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.400030 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 01:12:28.400143 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.400087 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 01:12:28.417148 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417112 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/03acfc21-c63d-4a37-b1b3-f487cc972611-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs"
Apr 23 01:12:28.417148 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417153 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-xl42d\" (UID: \"0a34f6c3-dd01-4b57-8b91-d896ec90dd46\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d"
Apr 23 01:12:28.417351 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-756dq\" (UniqueName: \"kubernetes.io/projected/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-kube-api-access-756dq\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.417351 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417223 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-sys\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.417351 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcpvr\" (UniqueName: \"kubernetes.io/projected/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-kube-api-access-kcpvr\") pod \"openshift-state-metrics-9d44df66c-xl42d\" (UID: \"0a34f6c3-dd01-4b57-8b91-d896ec90dd46\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d"
Apr 23 01:12:28.417351 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/03acfc21-c63d-4a37-b1b3-f487cc972611-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs"
Apr 23 01:12:28.417351 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417318 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-wtmp\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.417494 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-accelerators-collector-config\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.417494 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/03acfc21-c63d-4a37-b1b3-f487cc972611-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs"
Apr 23 01:12:28.417494 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417421 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03acfc21-c63d-4a37-b1b3-f487cc972611-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs"
Apr 23 01:12:28.417494 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-xl42d\" (UID: \"0a34f6c3-dd01-4b57-8b91-d896ec90dd46\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d"
Apr 23 01:12:28.417605 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417502 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-metrics-client-ca\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.417605 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-xl42d\" (UID: \"0a34f6c3-dd01-4b57-8b91-d896ec90dd46\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d"
Apr 23 01:12:28.417605 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417541 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.417605 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417566 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03acfc21-c63d-4a37-b1b3-f487cc972611-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs"
Apr 23 01:12:28.417605 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417589 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-root\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.417745 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417607 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk4kb\" (UniqueName: \"kubernetes.io/projected/03acfc21-c63d-4a37-b1b3-f487cc972611-kube-api-access-qk4kb\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs"
Apr 23 01:12:28.417745 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-textfile\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.417745 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.417641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-tls\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.518594 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcpvr\" (UniqueName: \"kubernetes.io/projected/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-kube-api-access-kcpvr\") pod \"openshift-state-metrics-9d44df66c-xl42d\" (UID: \"0a34f6c3-dd01-4b57-8b91-d896ec90dd46\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d"
Apr 23 01:12:28.518594 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518597 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/03acfc21-c63d-4a37-b1b3-f487cc972611-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs"
Apr 23 01:12:28.518844 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-wtmp\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.518844 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518655 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-accelerators-collector-config\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.518844 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518683 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/03acfc21-c63d-4a37-b1b3-f487cc972611-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs"
Apr 23 01:12:28.518844 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518706 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03acfc21-c63d-4a37-b1b3-f487cc972611-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs"
Apr 23 01:12:28.518844 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518733 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-xl42d\" (UID: \"0a34f6c3-dd01-4b57-8b91-d896ec90dd46\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d"
Apr 23 01:12:28.518844 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-metrics-client-ca\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.518844 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-xl42d\" (UID: \"0a34f6c3-dd01-4b57-8b91-d896ec90dd46\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d"
Apr 23 01:12:28.518844 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.518844 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03acfc21-c63d-4a37-b1b3-f487cc972611-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs"
Apr 23 01:12:28.519364 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518846 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-wtmp\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.519364 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518858 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-root\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.519364 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-root\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.519364 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qk4kb\" (UniqueName: \"kubernetes.io/projected/03acfc21-c63d-4a37-b1b3-f487cc972611-kube-api-access-qk4kb\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs"
Apr 23 01:12:28.519364 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-textfile\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.519364 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.518973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-tls\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2"
Apr 23 01:12:28.519364 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.519047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/03acfc21-c63d-4a37-b1b3-f487cc972611-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs"
Apr 23 01:12:28.519364 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.519087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName:
\"kubernetes.io/secret/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-xl42d\" (UID: \"0a34f6c3-dd01-4b57-8b91-d896ec90dd46\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d" Apr 23 01:12:28.519364 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.519132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-756dq\" (UniqueName: \"kubernetes.io/projected/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-kube-api-access-756dq\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2" Apr 23 01:12:28.519364 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.519180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-sys\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2" Apr 23 01:12:28.519364 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.519282 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/03acfc21-c63d-4a37-b1b3-f487cc972611-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs" Apr 23 01:12:28.519894 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.519390 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-sys\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2" Apr 23 01:12:28.519894 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.519531 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-metrics-client-ca\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2" Apr 23 01:12:28.519894 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.519614 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-xl42d\" (UID: \"0a34f6c3-dd01-4b57-8b91-d896ec90dd46\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d" Apr 23 01:12:28.519894 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:12:28.519820 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 23 01:12:28.519894 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:12:28.519881 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-openshift-state-metrics-tls podName:0a34f6c3-dd01-4b57-8b91-d896ec90dd46 nodeName:}" failed. No retries permitted until 2026-04-23 01:12:29.019863154 +0000 UTC m=+148.194658764 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-xl42d" (UID: "0a34f6c3-dd01-4b57-8b91-d896ec90dd46") : secret "openshift-state-metrics-tls" not found Apr 23 01:12:28.520227 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.520203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/03acfc21-c63d-4a37-b1b3-f487cc972611-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs" Apr 23 01:12:28.520285 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.520260 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03acfc21-c63d-4a37-b1b3-f487cc972611-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs" Apr 23 01:12:28.520375 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:12:28.520358 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 01:12:28.520455 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.520435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-textfile\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2" Apr 23 01:12:28.520508 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:12:28.520455 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-tls podName:fb4cc752-0a5d-474a-a7a1-1d98c0428cd9 nodeName:}" failed. No retries permitted until 2026-04-23 01:12:29.020435499 +0000 UTC m=+148.195231116 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-tls") pod "node-exporter-lxvg2" (UID: "fb4cc752-0a5d-474a-a7a1-1d98c0428cd9") : secret "node-exporter-tls" not found Apr 23 01:12:28.520691 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.520672 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-accelerators-collector-config\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2" Apr 23 01:12:28.521714 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.521691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/03acfc21-c63d-4a37-b1b3-f487cc972611-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs" Apr 23 01:12:28.521835 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.521723 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03acfc21-c63d-4a37-b1b3-f487cc972611-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs" Apr 23 01:12:28.522064 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.522046 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2" Apr 23 01:12:28.522129 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.522117 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-xl42d\" (UID: \"0a34f6c3-dd01-4b57-8b91-d896ec90dd46\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d" Apr 23 01:12:28.525786 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.525761 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcpvr\" (UniqueName: \"kubernetes.io/projected/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-kube-api-access-kcpvr\") pod \"openshift-state-metrics-9d44df66c-xl42d\" (UID: \"0a34f6c3-dd01-4b57-8b91-d896ec90dd46\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d" Apr 23 01:12:28.529745 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.529723 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk4kb\" (UniqueName: \"kubernetes.io/projected/03acfc21-c63d-4a37-b1b3-f487cc972611-kube-api-access-qk4kb\") pod \"kube-state-metrics-69db897b98-55jcs\" (UID: \"03acfc21-c63d-4a37-b1b3-f487cc972611\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs" Apr 23 01:12:28.529879 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.529858 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-756dq\" (UniqueName: 
\"kubernetes.io/projected/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-kube-api-access-756dq\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2" Apr 23 01:12:28.690256 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.690156 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs" Apr 23 01:12:28.810117 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.810078 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-55jcs"] Apr 23 01:12:28.814256 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:12:28.814225 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03acfc21_c63d_4a37_b1b3_f487cc972611.slice/crio-bbe4cb23240474264f22d2a3ed9481c2da3fc6d6e913f9febb393c8207cc6bb9 WatchSource:0}: Error finding container bbe4cb23240474264f22d2a3ed9481c2da3fc6d6e913f9febb393c8207cc6bb9: Status 404 returned error can't find the container with id bbe4cb23240474264f22d2a3ed9481c2da3fc6d6e913f9febb393c8207cc6bb9 Apr 23 01:12:28.827093 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:28.827064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs" event={"ID":"03acfc21-c63d-4a37-b1b3-f487cc972611","Type":"ContainerStarted","Data":"bbe4cb23240474264f22d2a3ed9481c2da3fc6d6e913f9febb393c8207cc6bb9"} Apr 23 01:12:29.022894 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:29.022852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-tls\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2" Apr 23 01:12:29.023090 ip-10-0-137-21 
kubenswrapper[2576]: I0423 01:12:29.022913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-xl42d\" (UID: \"0a34f6c3-dd01-4b57-8b91-d896ec90dd46\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d" Apr 23 01:12:29.023090 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:12:29.023019 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 23 01:12:29.023090 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:12:29.023071 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-openshift-state-metrics-tls podName:0a34f6c3-dd01-4b57-8b91-d896ec90dd46 nodeName:}" failed. No retries permitted until 2026-04-23 01:12:30.023055278 +0000 UTC m=+149.197850893 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-xl42d" (UID: "0a34f6c3-dd01-4b57-8b91-d896ec90dd46") : secret "openshift-state-metrics-tls" not found Apr 23 01:12:29.025206 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:29.025184 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fb4cc752-0a5d-474a-a7a1-1d98c0428cd9-node-exporter-tls\") pod \"node-exporter-lxvg2\" (UID: \"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9\") " pod="openshift-monitoring/node-exporter-lxvg2" Apr 23 01:12:29.306416 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:29.306333 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-lxvg2" Apr 23 01:12:29.314457 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:12:29.314425 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb4cc752_0a5d_474a_a7a1_1d98c0428cd9.slice/crio-f76943bf087675bbc511feed27c8382b84e3ebc5cbd3a713c3546ec31d6908a2 WatchSource:0}: Error finding container f76943bf087675bbc511feed27c8382b84e3ebc5cbd3a713c3546ec31d6908a2: Status 404 returned error can't find the container with id f76943bf087675bbc511feed27c8382b84e3ebc5cbd3a713c3546ec31d6908a2 Apr 23 01:12:29.831306 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:29.831264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lxvg2" event={"ID":"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9","Type":"ContainerStarted","Data":"f76943bf087675bbc511feed27c8382b84e3ebc5cbd3a713c3546ec31d6908a2"} Apr 23 01:12:30.033030 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.032978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-xl42d\" (UID: \"0a34f6c3-dd01-4b57-8b91-d896ec90dd46\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d" Apr 23 01:12:30.035760 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.035729 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a34f6c3-dd01-4b57-8b91-d896ec90dd46-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-xl42d\" (UID: \"0a34f6c3-dd01-4b57-8b91-d896ec90dd46\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d" Apr 23 01:12:30.170320 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.170242 2576 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d" Apr 23 01:12:30.439995 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.439919 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp"] Apr 23 01:12:30.444994 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.444951 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.448516 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.448245 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-cq1t1iclhcjms\"" Apr 23 01:12:30.448516 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.448373 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-jnfw2\"" Apr 23 01:12:30.448746 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.448672 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 23 01:12:30.448928 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.448911 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 23 01:12:30.450241 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.449304 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 23 01:12:30.450241 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.449537 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 23 01:12:30.452070 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.451800 2576 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 23 01:12:30.453224 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.453198 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp"] Apr 23 01:12:30.517266 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.517231 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d"] Apr 23 01:12:30.517871 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:12:30.517836 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a34f6c3_dd01_4b57_8b91_d896ec90dd46.slice/crio-cbf735b78938a4a861b59054a9cfbdf9b5f609624b5d97e7247ae4033c12aabf WatchSource:0}: Error finding container cbf735b78938a4a861b59054a9cfbdf9b5f609624b5d97e7247ae4033c12aabf: Status 404 returned error can't find the container with id cbf735b78938a4a861b59054a9cfbdf9b5f609624b5d97e7247ae4033c12aabf Apr 23 01:12:30.538484 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.538454 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.538594 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.538504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " 
pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.538594 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.538555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-grpc-tls\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.538704 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.538598 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdxpn\" (UniqueName: \"kubernetes.io/projected/6110d3d3-583d-4603-9c3a-98306774315e-kube-api-access-kdxpn\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.538752 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.538683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.538800 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.538764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-thanos-querier-tls\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.538852 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.538816 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6110d3d3-583d-4603-9c3a-98306774315e-metrics-client-ca\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.538906 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.538867 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.641294 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.639606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6110d3d3-583d-4603-9c3a-98306774315e-metrics-client-ca\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.641294 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.639692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.641294 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.639730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.641294 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.639769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.641294 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.639834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-grpc-tls\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.641294 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.639880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxpn\" (UniqueName: \"kubernetes.io/projected/6110d3d3-583d-4603-9c3a-98306774315e-kube-api-access-kdxpn\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.641294 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.639931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") 
" pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.641294 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.639969 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-thanos-querier-tls\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.641294 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.640834 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6110d3d3-583d-4603-9c3a-98306774315e-metrics-client-ca\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.644249 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.644218 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.645472 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.645387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-thanos-querier-tls\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.645622 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.645531 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.646232 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.646197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.646789 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.646763 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-grpc-tls\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.647355 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.647313 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6110d3d3-583d-4603-9c3a-98306774315e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: \"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.649083 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.649040 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdxpn\" (UniqueName: \"kubernetes.io/projected/6110d3d3-583d-4603-9c3a-98306774315e-kube-api-access-kdxpn\") pod \"thanos-querier-57bf8c9cc7-h64hp\" (UID: 
\"6110d3d3-583d-4603-9c3a-98306774315e\") " pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.760677 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.760646 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:30.836222 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.836049 2576 generic.go:358] "Generic (PLEG): container finished" podID="fb4cc752-0a5d-474a-a7a1-1d98c0428cd9" containerID="898a4489dce18ee43d0d4f71b52a557e8f73e99e374b3fbdac0fda49c3a3f4fb" exitCode=0 Apr 23 01:12:30.836222 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.836169 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lxvg2" event={"ID":"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9","Type":"ContainerDied","Data":"898a4489dce18ee43d0d4f71b52a557e8f73e99e374b3fbdac0fda49c3a3f4fb"} Apr 23 01:12:30.839784 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.839102 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs" event={"ID":"03acfc21-c63d-4a37-b1b3-f487cc972611","Type":"ContainerStarted","Data":"0aa0e9ba876108871c7f768ae879cd2f8b541ddbd8cc6ad0dcc70105a324502c"} Apr 23 01:12:30.839784 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.839145 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs" event={"ID":"03acfc21-c63d-4a37-b1b3-f487cc972611","Type":"ContainerStarted","Data":"56a3e211e107b42142d64fb8f0094f5926ccf89ec5a691e6392fac80455edcb6"} Apr 23 01:12:30.839784 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.839161 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs" event={"ID":"03acfc21-c63d-4a37-b1b3-f487cc972611","Type":"ContainerStarted","Data":"3693efd38432a2e85871ca2c3b6bbd7c22030159b364cfc28104d78592699cbb"} Apr 23 
01:12:30.841380 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.841353 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d" event={"ID":"0a34f6c3-dd01-4b57-8b91-d896ec90dd46","Type":"ContainerStarted","Data":"d83a70e837f967a9262f1a9cada4f51026c88659a0dfc5e3ade3997dcaba0a65"} Apr 23 01:12:30.841539 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.841387 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d" event={"ID":"0a34f6c3-dd01-4b57-8b91-d896ec90dd46","Type":"ContainerStarted","Data":"834544406baad5356c185e2cf82fe5dffe58ee0cc86c443c948368c0640fb9b9"} Apr 23 01:12:30.841539 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.841401 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d" event={"ID":"0a34f6c3-dd01-4b57-8b91-d896ec90dd46","Type":"ContainerStarted","Data":"cbf735b78938a4a861b59054a9cfbdf9b5f609624b5d97e7247ae4033c12aabf"} Apr 23 01:12:30.871973 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.871913 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-55jcs" podStartSLOduration=1.302674894 podStartE2EDuration="2.871891548s" podCreationTimestamp="2026-04-23 01:12:28 +0000 UTC" firstStartedPulling="2026-04-23 01:12:28.81607162 +0000 UTC m=+147.990867230" lastFinishedPulling="2026-04-23 01:12:30.385288275 +0000 UTC m=+149.560083884" observedRunningTime="2026-04-23 01:12:30.870179251 +0000 UTC m=+150.044974883" watchObservedRunningTime="2026-04-23 01:12:30.871891548 +0000 UTC m=+150.046687240" Apr 23 01:12:30.895542 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:30.895517 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp"] Apr 23 01:12:30.900224 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:12:30.900197 2576 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6110d3d3_583d_4603_9c3a_98306774315e.slice/crio-e20dabeb69652a5cf7ee2376a97fb6328d9005300af0d6636aa190f2adc50e19 WatchSource:0}: Error finding container e20dabeb69652a5cf7ee2376a97fb6328d9005300af0d6636aa190f2adc50e19: Status 404 returned error can't find the container with id e20dabeb69652a5cf7ee2376a97fb6328d9005300af0d6636aa190f2adc50e19 Apr 23 01:12:31.850103 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:31.850069 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lxvg2" event={"ID":"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9","Type":"ContainerStarted","Data":"fa90c58656f4f539a25fc1cb778fe32aee267056fbbd05dfc5c57f5f790ddb6a"} Apr 23 01:12:31.850430 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:31.850118 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lxvg2" event={"ID":"fb4cc752-0a5d-474a-a7a1-1d98c0428cd9","Type":"ContainerStarted","Data":"b61201b2197c2e19b2b26a937b6b87d35bde1b126dc6d5e63364f1156da45f79"} Apr 23 01:12:31.851855 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:31.851826 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" event={"ID":"6110d3d3-583d-4603-9c3a-98306774315e","Type":"ContainerStarted","Data":"e20dabeb69652a5cf7ee2376a97fb6328d9005300af0d6636aa190f2adc50e19"} Apr 23 01:12:31.866139 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:31.866056 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lxvg2" podStartSLOduration=2.7955358219999997 podStartE2EDuration="3.866040541s" podCreationTimestamp="2026-04-23 01:12:28 +0000 UTC" firstStartedPulling="2026-04-23 01:12:29.316254392 +0000 UTC m=+148.491050001" lastFinishedPulling="2026-04-23 01:12:30.386759095 +0000 UTC m=+149.561554720" 
observedRunningTime="2026-04-23 01:12:31.865677442 +0000 UTC m=+151.040473075" watchObservedRunningTime="2026-04-23 01:12:31.866040541 +0000 UTC m=+151.040836174" Apr 23 01:12:32.765586 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.765547 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-99b8b9c45-v9cx5"] Apr 23 01:12:32.769254 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.769233 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.772516 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.772481 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 01:12:32.772654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.772548 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 23 01:12:32.772654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.772562 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 23 01:12:32.772654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.772580 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-f9mh0qlgbkeri\"" Apr 23 01:12:32.772654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.772587 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-sddvs\"" Apr 23 01:12:32.772654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.772590 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 23 01:12:32.778926 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.778903 2576 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/metrics-server-99b8b9c45-v9cx5"] Apr 23 01:12:32.856178 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.856145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f682a177-73b0-458a-b735-eb6e0efa3d72-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.856657 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.856263 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f682a177-73b0-458a-b735-eb6e0efa3d72-secret-metrics-server-tls\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.856657 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.856308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f682a177-73b0-458a-b735-eb6e0efa3d72-secret-metrics-server-client-certs\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.856657 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.856407 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f682a177-73b0-458a-b735-eb6e0efa3d72-metrics-server-audit-profiles\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.856657 
ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.856442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f682a177-73b0-458a-b735-eb6e0efa3d72-audit-log\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.856657 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.856466 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f682a177-73b0-458a-b735-eb6e0efa3d72-client-ca-bundle\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.856657 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.856514 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6cgs\" (UniqueName: \"kubernetes.io/projected/f682a177-73b0-458a-b735-eb6e0efa3d72-kube-api-access-q6cgs\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.856657 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.856591 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d" event={"ID":"0a34f6c3-dd01-4b57-8b91-d896ec90dd46","Type":"ContainerStarted","Data":"9fd5e3f614651bb95506c9779176defe63763af7c7777844f2e7f300e0fe1acd"} Apr 23 01:12:32.872851 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.872800 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xl42d" podStartSLOduration=3.826904374 podStartE2EDuration="4.872783155s" podCreationTimestamp="2026-04-23 01:12:28 +0000 
UTC" firstStartedPulling="2026-04-23 01:12:30.740253775 +0000 UTC m=+149.915049384" lastFinishedPulling="2026-04-23 01:12:31.786132541 +0000 UTC m=+150.960928165" observedRunningTime="2026-04-23 01:12:32.871164924 +0000 UTC m=+152.045960548" watchObservedRunningTime="2026-04-23 01:12:32.872783155 +0000 UTC m=+152.047578787" Apr 23 01:12:32.957576 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.957531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f682a177-73b0-458a-b735-eb6e0efa3d72-secret-metrics-server-tls\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.957772 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.957600 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f682a177-73b0-458a-b735-eb6e0efa3d72-secret-metrics-server-client-certs\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.958654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.957831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f682a177-73b0-458a-b735-eb6e0efa3d72-metrics-server-audit-profiles\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.958654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.957906 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f682a177-73b0-458a-b735-eb6e0efa3d72-audit-log\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: 
\"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.958654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.957931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f682a177-73b0-458a-b735-eb6e0efa3d72-client-ca-bundle\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.958654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.958027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6cgs\" (UniqueName: \"kubernetes.io/projected/f682a177-73b0-458a-b735-eb6e0efa3d72-kube-api-access-q6cgs\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.958654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.958134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f682a177-73b0-458a-b735-eb6e0efa3d72-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.958654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.958431 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f682a177-73b0-458a-b735-eb6e0efa3d72-audit-log\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.959112 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.959042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f682a177-73b0-458a-b735-eb6e0efa3d72-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.959112 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.959089 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f682a177-73b0-458a-b735-eb6e0efa3d72-metrics-server-audit-profiles\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.960699 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.960672 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f682a177-73b0-458a-b735-eb6e0efa3d72-secret-metrics-server-tls\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.960975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.960954 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f682a177-73b0-458a-b735-eb6e0efa3d72-client-ca-bundle\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.961189 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.961171 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f682a177-73b0-458a-b735-eb6e0efa3d72-secret-metrics-server-client-certs\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " 
pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:32.965902 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:32.965882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6cgs\" (UniqueName: \"kubernetes.io/projected/f682a177-73b0-458a-b735-eb6e0efa3d72-kube-api-access-q6cgs\") pod \"metrics-server-99b8b9c45-v9cx5\" (UID: \"f682a177-73b0-458a-b735-eb6e0efa3d72\") " pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:33.078915 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.078823 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:33.146193 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.146157 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-c72ld"] Apr 23 01:12:33.151539 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.151511 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-c72ld" Apr 23 01:12:33.153632 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.153605 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 23 01:12:33.154571 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.154166 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-sdgmj\"" Apr 23 01:12:33.156137 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.156082 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-c72ld"] Apr 23 01:12:33.261315 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.261282 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e2094d4c-7dd1-451a-9a12-cc12a7b7d147-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-c72ld\" (UID: \"e2094d4c-7dd1-451a-9a12-cc12a7b7d147\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-c72ld" Apr 23 01:12:33.362199 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.362110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e2094d4c-7dd1-451a-9a12-cc12a7b7d147-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-c72ld\" (UID: \"e2094d4c-7dd1-451a-9a12-cc12a7b7d147\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-c72ld" Apr 23 01:12:33.365337 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.365308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e2094d4c-7dd1-451a-9a12-cc12a7b7d147-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-c72ld\" (UID: \"e2094d4c-7dd1-451a-9a12-cc12a7b7d147\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-c72ld" Apr 23 01:12:33.464348 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.464231 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-c72ld" Apr 23 01:12:33.512824 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.512760 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-99b8b9c45-v9cx5"] Apr 23 01:12:33.515510 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:12:33.515474 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf682a177_73b0_458a_b735_eb6e0efa3d72.slice/crio-15c001f63543c8fad946ab56e1976dbf60d48e2578888dcdabe70758dcaae065 WatchSource:0}: Error finding container 15c001f63543c8fad946ab56e1976dbf60d48e2578888dcdabe70758dcaae065: Status 404 returned error can't find the container with id 15c001f63543c8fad946ab56e1976dbf60d48e2578888dcdabe70758dcaae065 Apr 23 01:12:33.625088 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.625055 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-c72ld"] Apr 23 01:12:33.628780 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:12:33.628749 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2094d4c_7dd1_451a_9a12_cc12a7b7d147.slice/crio-70363dc248d7fc257f0cd2654120abc61bda285f6fbbf1668d18d273656d3ce2 WatchSource:0}: Error finding container 70363dc248d7fc257f0cd2654120abc61bda285f6fbbf1668d18d273656d3ce2: Status 404 returned error can't find the container with id 70363dc248d7fc257f0cd2654120abc61bda285f6fbbf1668d18d273656d3ce2 Apr 23 01:12:33.860514 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.860477 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" 
event={"ID":"f682a177-73b0-458a-b735-eb6e0efa3d72","Type":"ContainerStarted","Data":"15c001f63543c8fad946ab56e1976dbf60d48e2578888dcdabe70758dcaae065"} Apr 23 01:12:33.861462 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.861437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-c72ld" event={"ID":"e2094d4c-7dd1-451a-9a12-cc12a7b7d147","Type":"ContainerStarted","Data":"70363dc248d7fc257f0cd2654120abc61bda285f6fbbf1668d18d273656d3ce2"} Apr 23 01:12:33.863085 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.863064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" event={"ID":"6110d3d3-583d-4603-9c3a-98306774315e","Type":"ContainerStarted","Data":"5f6a9434e9a778c943f471c4708de279c0d7f4d3d3745af49b1681f01e66f5a1"} Apr 23 01:12:33.863209 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.863089 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" event={"ID":"6110d3d3-583d-4603-9c3a-98306774315e","Type":"ContainerStarted","Data":"ee47b08d6e8723eb95d1e3f3f7450f3fb3438857bddf3e0be85475294c65972e"} Apr 23 01:12:33.863209 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:33.863101 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" event={"ID":"6110d3d3-583d-4603-9c3a-98306774315e","Type":"ContainerStarted","Data":"6dd8d5aadd99839838c99af785e7f722a466e9e6f7f25a0ebc025551423d53c9"} Apr 23 01:12:34.632976 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:34.632918 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" podUID="6f507ae4-dd6e-4916-a07e-ec5ea46ca270" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 01:12:35.869684 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:35.869640 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" event={"ID":"f682a177-73b0-458a-b735-eb6e0efa3d72","Type":"ContainerStarted","Data":"e5224e9d931167937d937d0468f98f846ab637e7b0d1f22f9abad6a90b0a78c9"} Apr 23 01:12:35.870961 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:35.870933 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-c72ld" event={"ID":"e2094d4c-7dd1-451a-9a12-cc12a7b7d147","Type":"ContainerStarted","Data":"0020103cc5492a77585d83a9dc7a511bb1f602438a41ca67827aedf03e4cdbbc"} Apr 23 01:12:35.871113 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:35.871096 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-c72ld" Apr 23 01:12:35.873871 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:35.873845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" event={"ID":"6110d3d3-583d-4603-9c3a-98306774315e","Type":"ContainerStarted","Data":"87fc2dfd1176d3e5249af4c239d1da4d4f634892f1dbb4cceda1266356026941"} Apr 23 01:12:35.873871 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:35.873871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" event={"ID":"6110d3d3-583d-4603-9c3a-98306774315e","Type":"ContainerStarted","Data":"96178438f71ec225fdf4852bde71ec7b450fd1e651538b847a31587787dff233"} Apr 23 01:12:35.874068 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:35.873880 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" event={"ID":"6110d3d3-583d-4603-9c3a-98306774315e","Type":"ContainerStarted","Data":"a190cec3ac29c5646864a805e4ae7443c31725338c4922def22f4eb55d51979a"} Apr 23 01:12:35.874068 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:35.874033 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:35.876310 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:35.876291 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-c72ld" Apr 23 01:12:35.885359 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:35.885300 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" podStartSLOduration=2.13553325 podStartE2EDuration="3.885282525s" podCreationTimestamp="2026-04-23 01:12:32 +0000 UTC" firstStartedPulling="2026-04-23 01:12:33.520168192 +0000 UTC m=+152.694963817" lastFinishedPulling="2026-04-23 01:12:35.269917476 +0000 UTC m=+154.444713092" observedRunningTime="2026-04-23 01:12:35.883650114 +0000 UTC m=+155.058445746" watchObservedRunningTime="2026-04-23 01:12:35.885282525 +0000 UTC m=+155.060078159" Apr 23 01:12:35.896975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:35.896919 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-c72ld" podStartSLOduration=1.254925597 podStartE2EDuration="2.896902568s" podCreationTimestamp="2026-04-23 01:12:33 +0000 UTC" firstStartedPulling="2026-04-23 01:12:33.630551549 +0000 UTC m=+152.805347158" lastFinishedPulling="2026-04-23 01:12:35.272528516 +0000 UTC m=+154.447324129" observedRunningTime="2026-04-23 01:12:35.896070958 +0000 UTC m=+155.070866601" watchObservedRunningTime="2026-04-23 01:12:35.896902568 +0000 UTC m=+155.071698237" Apr 23 01:12:35.916115 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:35.916046 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" podStartSLOduration=1.548446371 podStartE2EDuration="5.916021904s" podCreationTimestamp="2026-04-23 01:12:30 +0000 UTC" firstStartedPulling="2026-04-23 
01:12:30.902338613 +0000 UTC m=+150.077134226" lastFinishedPulling="2026-04-23 01:12:35.269914148 +0000 UTC m=+154.444709759" observedRunningTime="2026-04-23 01:12:35.914087038 +0000 UTC m=+155.088882672" watchObservedRunningTime="2026-04-23 01:12:35.916021904 +0000 UTC m=+155.090817538" Apr 23 01:12:38.213539 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:12:38.213478 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-kdm6b" podUID="ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d" Apr 23 01:12:38.225765 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:12:38.225722 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-kgfvc" podUID="16cbb3a1-0ddd-4793-8d37-07bfa2a0568d" Apr 23 01:12:38.881530 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:38.881501 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kgfvc" Apr 23 01:12:38.881683 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:38.881506 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-kdm6b" Apr 23 01:12:41.881913 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:41.881832 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-57bf8c9cc7-h64hp" Apr 23 01:12:43.147973 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.147935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert\") pod \"ingress-canary-kgfvc\" (UID: \"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d\") " pod="openshift-ingress-canary/ingress-canary-kgfvc" Apr 23 01:12:43.147973 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.147978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b" Apr 23 01:12:43.150411 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.150382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d-metrics-tls\") pod \"dns-default-kdm6b\" (UID: \"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d\") " pod="openshift-dns/dns-default-kdm6b" Apr 23 01:12:43.150508 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.150472 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16cbb3a1-0ddd-4793-8d37-07bfa2a0568d-cert\") pod \"ingress-canary-kgfvc\" (UID: \"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d\") " pod="openshift-ingress-canary/ingress-canary-kgfvc" Apr 23 01:12:43.384053 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.384025 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-h4q4b\"" Apr 23 
01:12:43.384735 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.384719 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jjv8h\"" Apr 23 01:12:43.392366 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.392341 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kgfvc" Apr 23 01:12:43.392474 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.392437 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kdm6b" Apr 23 01:12:43.470308 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.468303 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-54457478b5-wlhvn"] Apr 23 01:12:43.478022 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.477970 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.478841 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.478679 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54457478b5-wlhvn"] Apr 23 01:12:43.480654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.480540 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 01:12:43.481388 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.480921 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-cm99s\"" Apr 23 01:12:43.481388 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.481041 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 01:12:43.481388 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.481110 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 01:12:43.481388 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.481301 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 01:12:43.481388 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.481380 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 01:12:43.482158 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.482023 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 01:12:43.482158 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.482057 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 01:12:43.489122 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.489095 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 01:12:43.534032 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.534000 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kdm6b"] Apr 23 01:12:43.538322 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:12:43.538294 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac6cb0d5_6b9b_4b9e_8aa0_141ff0a55a4d.slice/crio-8aa23d6f0bb5df6a11d389eef9530a5d795aa64a6822a42910c2759b925d5cc0 WatchSource:0}: Error finding container 8aa23d6f0bb5df6a11d389eef9530a5d795aa64a6822a42910c2759b925d5cc0: Status 404 returned error can't find the container with id 8aa23d6f0bb5df6a11d389eef9530a5d795aa64a6822a42910c2759b925d5cc0 Apr 23 01:12:43.550748 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.550715 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-kgfvc"] Apr 23 01:12:43.551053 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.551031 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-oauth-config\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.551194 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.551067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-oauth-serving-cert\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.551194 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.551103 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-serving-cert\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.551194 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.551121 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-config\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.551194 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.551169 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-service-ca\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.551405 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.551239 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-trusted-ca-bundle\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.551405 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.551261 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgzft\" (UniqueName: \"kubernetes.io/projected/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-kube-api-access-rgzft\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.553884 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:12:43.553858 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16cbb3a1_0ddd_4793_8d37_07bfa2a0568d.slice/crio-d6b5671739894fb348214eab6a092213ff72017b447940aec88ce59306251adb WatchSource:0}: Error finding container d6b5671739894fb348214eab6a092213ff72017b447940aec88ce59306251adb: Status 404 returned error can't find the container with id d6b5671739894fb348214eab6a092213ff72017b447940aec88ce59306251adb Apr 23 01:12:43.652616 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.652580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-oauth-serving-cert\") pod \"console-54457478b5-wlhvn\" (UID: 
\"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.652800 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.652637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-serving-cert\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.652800 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.652666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-config\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.652800 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.652690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-service-ca\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.652800 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.652723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-trusted-ca-bundle\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.653022 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.652840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgzft\" (UniqueName: 
\"kubernetes.io/projected/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-kube-api-access-rgzft\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.653134 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.653112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-oauth-config\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.653457 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.653430 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-oauth-serving-cert\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.653556 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.653477 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-service-ca\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.653556 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.653489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-config\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.653856 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.653835 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-trusted-ca-bundle\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.655464 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.655416 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-serving-cert\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.655464 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.655448 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-oauth-config\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.659367 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.659347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgzft\" (UniqueName: \"kubernetes.io/projected/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-kube-api-access-rgzft\") pod \"console-54457478b5-wlhvn\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.790652 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.790614 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:43.897184 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.897143 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kgfvc" event={"ID":"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d","Type":"ContainerStarted","Data":"d6b5671739894fb348214eab6a092213ff72017b447940aec88ce59306251adb"} Apr 23 01:12:43.898180 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.898153 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kdm6b" event={"ID":"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d","Type":"ContainerStarted","Data":"8aa23d6f0bb5df6a11d389eef9530a5d795aa64a6822a42910c2759b925d5cc0"} Apr 23 01:12:43.918244 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:43.918160 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54457478b5-wlhvn"] Apr 23 01:12:43.921790 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:12:43.921756 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8581fe56_f00c_42f6_9f50_f4dd4255cb7f.slice/crio-de86f4748734b02307c63ce6005d0a72394f4a5829639ef555710e323dd40d50 WatchSource:0}: Error finding container de86f4748734b02307c63ce6005d0a72394f4a5829639ef555710e323dd40d50: Status 404 returned error can't find the container with id de86f4748734b02307c63ce6005d0a72394f4a5829639ef555710e323dd40d50 Apr 23 01:12:44.633949 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:44.633905 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" podUID="6f507ae4-dd6e-4916-a07e-ec5ea46ca270" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 01:12:44.903551 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:44.903476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-54457478b5-wlhvn" event={"ID":"8581fe56-f00c-42f6-9f50-f4dd4255cb7f","Type":"ContainerStarted","Data":"de86f4748734b02307c63ce6005d0a72394f4a5829639ef555710e323dd40d50"} Apr 23 01:12:45.909165 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:45.909117 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kgfvc" event={"ID":"16cbb3a1-0ddd-4793-8d37-07bfa2a0568d","Type":"ContainerStarted","Data":"3b1b06f859c4d34da6f7a33bd1ea505d9a6e2ea85ba1b644d415b9dd499a6c73"} Apr 23 01:12:45.911006 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:45.910959 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kdm6b" event={"ID":"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d","Type":"ContainerStarted","Data":"e9a8b28b461d013738184c45c089af651cbf8aeb153182c4b8e913c75652a7e0"} Apr 23 01:12:45.911146 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:45.911013 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kdm6b" event={"ID":"ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d","Type":"ContainerStarted","Data":"4c9f99490b96d0e24374bb5683d5bb7344c9ac0bda2aff1283ea4469c1aafeb0"} Apr 23 01:12:45.911146 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:45.911135 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-kdm6b" Apr 23 01:12:45.922694 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:45.922636 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kgfvc" podStartSLOduration=128.996215864 podStartE2EDuration="2m10.922616668s" podCreationTimestamp="2026-04-23 01:10:35 +0000 UTC" firstStartedPulling="2026-04-23 01:12:43.555720193 +0000 UTC m=+162.730515803" lastFinishedPulling="2026-04-23 01:12:45.482120986 +0000 UTC m=+164.656916607" observedRunningTime="2026-04-23 01:12:45.921753113 +0000 UTC m=+165.096548746" watchObservedRunningTime="2026-04-23 
01:12:45.922616668 +0000 UTC m=+165.097412301" Apr 23 01:12:45.936485 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:45.936362 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kdm6b" podStartSLOduration=128.998813894 podStartE2EDuration="2m10.936337436s" podCreationTimestamp="2026-04-23 01:10:35 +0000 UTC" firstStartedPulling="2026-04-23 01:12:43.54029314 +0000 UTC m=+162.715088750" lastFinishedPulling="2026-04-23 01:12:45.477816679 +0000 UTC m=+164.652612292" observedRunningTime="2026-04-23 01:12:45.93540076 +0000 UTC m=+165.110196398" watchObservedRunningTime="2026-04-23 01:12:45.936337436 +0000 UTC m=+165.111133069" Apr 23 01:12:46.915558 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:46.915514 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54457478b5-wlhvn" event={"ID":"8581fe56-f00c-42f6-9f50-f4dd4255cb7f","Type":"ContainerStarted","Data":"31187c34176e093e69df0bb7649657d167e35a61d4874db081381afe7f9aedd0"} Apr 23 01:12:46.936517 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:46.936465 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54457478b5-wlhvn" podStartSLOduration=1.082670064 podStartE2EDuration="3.936447514s" podCreationTimestamp="2026-04-23 01:12:43 +0000 UTC" firstStartedPulling="2026-04-23 01:12:43.923891746 +0000 UTC m=+163.098687375" lastFinishedPulling="2026-04-23 01:12:46.777669198 +0000 UTC m=+165.952464825" observedRunningTime="2026-04-23 01:12:46.935471338 +0000 UTC m=+166.110266970" watchObservedRunningTime="2026-04-23 01:12:46.936447514 +0000 UTC m=+166.111243146" Apr 23 01:12:53.079734 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:53.079687 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:53.080373 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:53.079778 2576 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:12:53.790830 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:53.790792 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:53.790830 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:53.790840 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:12:53.792287 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:53.792261 2576 patch_prober.go:28] interesting pod/console-54457478b5-wlhvn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.17:8443/health\": dial tcp 10.133.0.17:8443: connect: connection refused" start-of-body= Apr 23 01:12:53.792406 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:53.792308 2576 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-54457478b5-wlhvn" podUID="8581fe56-f00c-42f6-9f50-f4dd4255cb7f" containerName="console" probeResult="failure" output="Get \"https://10.133.0.17:8443/health\": dial tcp 10.133.0.17:8443: connect: connection refused" Apr 23 01:12:54.632859 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:54.632822 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" podUID="6f507ae4-dd6e-4916-a07e-ec5ea46ca270" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 01:12:54.633222 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:54.632887 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" Apr 23 01:12:54.633502 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:54.633469 2576 kuberuntime_manager.go:1107] "Message for Container of pod" 
containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"5e6a0d5665295384fa5f088c6fb3e674b7cf5f8d829cfadacd6c922b851d940d"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 23 01:12:54.633558 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:54.633542 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" podUID="6f507ae4-dd6e-4916-a07e-ec5ea46ca270" containerName="service-proxy" containerID="cri-o://5e6a0d5665295384fa5f088c6fb3e674b7cf5f8d829cfadacd6c922b851d940d" gracePeriod=30 Apr 23 01:12:54.939847 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:54.939763 2576 generic.go:358] "Generic (PLEG): container finished" podID="6f507ae4-dd6e-4916-a07e-ec5ea46ca270" containerID="5e6a0d5665295384fa5f088c6fb3e674b7cf5f8d829cfadacd6c922b851d940d" exitCode=2 Apr 23 01:12:54.939847 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:54.939822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" event={"ID":"6f507ae4-dd6e-4916-a07e-ec5ea46ca270","Type":"ContainerDied","Data":"5e6a0d5665295384fa5f088c6fb3e674b7cf5f8d829cfadacd6c922b851d940d"} Apr 23 01:12:54.940054 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:54.939851 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56ccc7b55f-st6dj" event={"ID":"6f507ae4-dd6e-4916-a07e-ec5ea46ca270","Type":"ContainerStarted","Data":"a21eccd3eb6a13488bbca36a1fed3f0884be9ce8983a0e8a464f7fa797feaf71"} Apr 23 01:12:55.917641 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:12:55.917607 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kdm6b" Apr 23 01:13:03.795850 ip-10-0-137-21 kubenswrapper[2576]: I0423 
01:13:03.795813 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:13:03.799554 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:13:03.799534 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:13:13.084432 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:13:13.084401 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:13:13.088196 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:13:13.088171 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-99b8b9c45-v9cx5" Apr 23 01:13:59.333925 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:13:59.333893 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54457478b5-wlhvn"] Apr 23 01:14:24.353289 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.353201 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-54457478b5-wlhvn" podUID="8581fe56-f00c-42f6-9f50-f4dd4255cb7f" containerName="console" containerID="cri-o://31187c34176e093e69df0bb7649657d167e35a61d4874db081381afe7f9aedd0" gracePeriod=15 Apr 23 01:14:24.591198 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.591174 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54457478b5-wlhvn_8581fe56-f00c-42f6-9f50-f4dd4255cb7f/console/0.log" Apr 23 01:14:24.591350 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.591245 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54457478b5-wlhvn" Apr 23 01:14:24.676651 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.676560 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-config\") pod \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " Apr 23 01:14:24.676651 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.676618 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgzft\" (UniqueName: \"kubernetes.io/projected/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-kube-api-access-rgzft\") pod \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " Apr 23 01:14:24.676651 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.676644 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-trusted-ca-bundle\") pod \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " Apr 23 01:14:24.676879 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.676819 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-oauth-config\") pod \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " Apr 23 01:14:24.676879 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.676870 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-service-ca\") pod \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " Apr 23 01:14:24.676952 ip-10-0-137-21 
kubenswrapper[2576]: I0423 01:14:24.676906 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-serving-cert\") pod \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " Apr 23 01:14:24.676952 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.676918 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-config" (OuterVolumeSpecName: "console-config") pod "8581fe56-f00c-42f6-9f50-f4dd4255cb7f" (UID: "8581fe56-f00c-42f6-9f50-f4dd4255cb7f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:14:24.677080 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.676958 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-oauth-serving-cert\") pod \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\" (UID: \"8581fe56-f00c-42f6-9f50-f4dd4255cb7f\") " Apr 23 01:14:24.677080 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.677046 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8581fe56-f00c-42f6-9f50-f4dd4255cb7f" (UID: "8581fe56-f00c-42f6-9f50-f4dd4255cb7f"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 01:14:24.677364 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.677339 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-service-ca" (OuterVolumeSpecName: "service-ca") pod "8581fe56-f00c-42f6-9f50-f4dd4255cb7f" (UID: "8581fe56-f00c-42f6-9f50-f4dd4255cb7f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 01:14:24.677364 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.677342 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-config\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:14:24.677517 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.677370 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8581fe56-f00c-42f6-9f50-f4dd4255cb7f" (UID: "8581fe56-f00c-42f6-9f50-f4dd4255cb7f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 01:14:24.677517 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.677388 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-trusted-ca-bundle\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:14:24.679114 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.679088 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8581fe56-f00c-42f6-9f50-f4dd4255cb7f" (UID: "8581fe56-f00c-42f6-9f50-f4dd4255cb7f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 01:14:24.679225 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.679118 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8581fe56-f00c-42f6-9f50-f4dd4255cb7f" (UID: "8581fe56-f00c-42f6-9f50-f4dd4255cb7f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 01:14:24.679225 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.679120 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-kube-api-access-rgzft" (OuterVolumeSpecName: "kube-api-access-rgzft") pod "8581fe56-f00c-42f6-9f50-f4dd4255cb7f" (UID: "8581fe56-f00c-42f6-9f50-f4dd4255cb7f"). InnerVolumeSpecName "kube-api-access-rgzft". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:14:24.778121 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.778073 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-oauth-config\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:14:24.778121 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.778114 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-service-ca\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:14:24.778121 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.778125 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-console-serving-cert\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:14:24.778121 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.778135 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-oauth-serving-cert\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:14:24.778121 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:24.778144 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rgzft\" (UniqueName: \"kubernetes.io/projected/8581fe56-f00c-42f6-9f50-f4dd4255cb7f-kube-api-access-rgzft\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:14:25.191713 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:25.191687 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54457478b5-wlhvn_8581fe56-f00c-42f6-9f50-f4dd4255cb7f/console/0.log"
Apr 23 01:14:25.191883 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:25.191727 2576 generic.go:358] "Generic (PLEG): container finished" podID="8581fe56-f00c-42f6-9f50-f4dd4255cb7f" containerID="31187c34176e093e69df0bb7649657d167e35a61d4874db081381afe7f9aedd0" exitCode=2
Apr 23 01:14:25.191883 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:25.191760 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54457478b5-wlhvn" event={"ID":"8581fe56-f00c-42f6-9f50-f4dd4255cb7f","Type":"ContainerDied","Data":"31187c34176e093e69df0bb7649657d167e35a61d4874db081381afe7f9aedd0"}
Apr 23 01:14:25.191883 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:25.191800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54457478b5-wlhvn" event={"ID":"8581fe56-f00c-42f6-9f50-f4dd4255cb7f","Type":"ContainerDied","Data":"de86f4748734b02307c63ce6005d0a72394f4a5829639ef555710e323dd40d50"}
Apr 23 01:14:25.191883 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:25.191807 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54457478b5-wlhvn"
Apr 23 01:14:25.191883 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:25.191818 2576 scope.go:117] "RemoveContainer" containerID="31187c34176e093e69df0bb7649657d167e35a61d4874db081381afe7f9aedd0"
Apr 23 01:14:25.203817 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:25.203797 2576 scope.go:117] "RemoveContainer" containerID="31187c34176e093e69df0bb7649657d167e35a61d4874db081381afe7f9aedd0"
Apr 23 01:14:25.204139 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:14:25.204118 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31187c34176e093e69df0bb7649657d167e35a61d4874db081381afe7f9aedd0\": container with ID starting with 31187c34176e093e69df0bb7649657d167e35a61d4874db081381afe7f9aedd0 not found: ID does not exist" containerID="31187c34176e093e69df0bb7649657d167e35a61d4874db081381afe7f9aedd0"
Apr 23 01:14:25.204210 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:25.204147 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31187c34176e093e69df0bb7649657d167e35a61d4874db081381afe7f9aedd0"} err="failed to get container status \"31187c34176e093e69df0bb7649657d167e35a61d4874db081381afe7f9aedd0\": rpc error: code = NotFound desc = could not find container \"31187c34176e093e69df0bb7649657d167e35a61d4874db081381afe7f9aedd0\": container with ID starting with 31187c34176e093e69df0bb7649657d167e35a61d4874db081381afe7f9aedd0 not found: ID does not exist"
Apr 23 01:14:25.214500 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:25.214475 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54457478b5-wlhvn"]
Apr 23 01:14:25.217752 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:25.217728 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54457478b5-wlhvn"]
Apr 23 01:14:25.384354 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:14:25.384321 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8581fe56-f00c-42f6-9f50-f4dd4255cb7f" path="/var/lib/kubelet/pods/8581fe56-f00c-42f6-9f50-f4dd4255cb7f/volumes"
Apr 23 01:15:01.274570 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:01.274542 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log"
Apr 23 01:15:01.275128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:01.274638 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log"
Apr 23 01:15:01.277509 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:01.277489 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 01:15:16.754189 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:16.754151 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4d9jp"]
Apr 23 01:15:16.756427 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:16.754446 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8581fe56-f00c-42f6-9f50-f4dd4255cb7f" containerName="console"
Apr 23 01:15:16.756427 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:16.754456 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8581fe56-f00c-42f6-9f50-f4dd4255cb7f" containerName="console"
Apr 23 01:15:16.756427 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:16.754513 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8581fe56-f00c-42f6-9f50-f4dd4255cb7f" containerName="console"
Apr 23 01:15:16.757192 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:16.757175 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4d9jp"
Apr 23 01:15:16.759429 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:16.759405 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 23 01:15:16.759429 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:16.759422 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-fx7lj\""
Apr 23 01:15:16.759594 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:16.759423 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 23 01:15:16.767492 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:16.767458 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4d9jp"]
Apr 23 01:15:16.875775 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:16.875735 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8nvn\" (UniqueName: \"kubernetes.io/projected/a7718294-cf04-4a03-906d-1d282f658bfa-kube-api-access-k8nvn\") pod \"cert-manager-operator-controller-manager-54b9655956-4d9jp\" (UID: \"a7718294-cf04-4a03-906d-1d282f658bfa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4d9jp"
Apr 23 01:15:16.875967 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:16.875800 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7718294-cf04-4a03-906d-1d282f658bfa-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-4d9jp\" (UID: \"a7718294-cf04-4a03-906d-1d282f658bfa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4d9jp"
Apr 23 01:15:16.976724 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:16.976681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8nvn\" (UniqueName: \"kubernetes.io/projected/a7718294-cf04-4a03-906d-1d282f658bfa-kube-api-access-k8nvn\") pod \"cert-manager-operator-controller-manager-54b9655956-4d9jp\" (UID: \"a7718294-cf04-4a03-906d-1d282f658bfa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4d9jp"
Apr 23 01:15:16.976909 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:16.976757 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7718294-cf04-4a03-906d-1d282f658bfa-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-4d9jp\" (UID: \"a7718294-cf04-4a03-906d-1d282f658bfa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4d9jp"
Apr 23 01:15:16.977216 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:16.977196 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7718294-cf04-4a03-906d-1d282f658bfa-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-4d9jp\" (UID: \"a7718294-cf04-4a03-906d-1d282f658bfa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4d9jp"
Apr 23 01:15:17.004052 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:17.004024 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8nvn\" (UniqueName: \"kubernetes.io/projected/a7718294-cf04-4a03-906d-1d282f658bfa-kube-api-access-k8nvn\") pod \"cert-manager-operator-controller-manager-54b9655956-4d9jp\" (UID: \"a7718294-cf04-4a03-906d-1d282f658bfa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4d9jp"
Apr 23 01:15:17.066092 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:17.066008 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4d9jp"
Apr 23 01:15:17.192005 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:17.191956 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4d9jp"]
Apr 23 01:15:17.195875 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:15:17.195845 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7718294_cf04_4a03_906d_1d282f658bfa.slice/crio-743456be2198c7422b43fe5eab46235b8d7d784845f226b4a0384ff60174da12 WatchSource:0}: Error finding container 743456be2198c7422b43fe5eab46235b8d7d784845f226b4a0384ff60174da12: Status 404 returned error can't find the container with id 743456be2198c7422b43fe5eab46235b8d7d784845f226b4a0384ff60174da12
Apr 23 01:15:17.198921 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:17.198899 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 01:15:17.343771 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:17.343685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4d9jp" event={"ID":"a7718294-cf04-4a03-906d-1d282f658bfa","Type":"ContainerStarted","Data":"743456be2198c7422b43fe5eab46235b8d7d784845f226b4a0384ff60174da12"}
Apr 23 01:15:20.356290 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:20.356250 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4d9jp" event={"ID":"a7718294-cf04-4a03-906d-1d282f658bfa","Type":"ContainerStarted","Data":"1af0cc39ec8c8dfc16f3f9b9c7cc4353c45daf246af7d2d9e282083b01448c41"}
Apr 23 01:15:20.375106 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:20.375014 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4d9jp" podStartSLOduration=1.8871812000000001 podStartE2EDuration="4.374974434s" podCreationTimestamp="2026-04-23 01:15:16 +0000 UTC" firstStartedPulling="2026-04-23 01:15:17.199045234 +0000 UTC m=+316.373840843" lastFinishedPulling="2026-04-23 01:15:19.686838467 +0000 UTC m=+318.861634077" observedRunningTime="2026-04-23 01:15:20.373210111 +0000 UTC m=+319.548005734" watchObservedRunningTime="2026-04-23 01:15:20.374974434 +0000 UTC m=+319.549770067"
Apr 23 01:15:28.735857 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:28.735817 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-pg6tl"]
Apr 23 01:15:28.739115 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:28.739095 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-pg6tl"
Apr 23 01:15:28.741988 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:28.741955 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 23 01:15:28.742099 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:28.741960 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 23 01:15:28.742099 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:28.742023 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-5hwhl\""
Apr 23 01:15:28.744397 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:28.744371 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-pg6tl"]
Apr 23 01:15:28.875295 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:28.875255 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03d2ebde-2a3d-4da9-b403-0ce83b04265b-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-pg6tl\" (UID: \"03d2ebde-2a3d-4da9-b403-0ce83b04265b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-pg6tl"
Apr 23 01:15:28.875480 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:28.875316 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqlvf\" (UniqueName: \"kubernetes.io/projected/03d2ebde-2a3d-4da9-b403-0ce83b04265b-kube-api-access-cqlvf\") pod \"cert-manager-cainjector-68b757865b-pg6tl\" (UID: \"03d2ebde-2a3d-4da9-b403-0ce83b04265b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-pg6tl"
Apr 23 01:15:28.976052 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:28.975998 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03d2ebde-2a3d-4da9-b403-0ce83b04265b-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-pg6tl\" (UID: \"03d2ebde-2a3d-4da9-b403-0ce83b04265b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-pg6tl"
Apr 23 01:15:28.976253 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:28.976094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqlvf\" (UniqueName: \"kubernetes.io/projected/03d2ebde-2a3d-4da9-b403-0ce83b04265b-kube-api-access-cqlvf\") pod \"cert-manager-cainjector-68b757865b-pg6tl\" (UID: \"03d2ebde-2a3d-4da9-b403-0ce83b04265b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-pg6tl"
Apr 23 01:15:28.983853 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:28.983826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03d2ebde-2a3d-4da9-b403-0ce83b04265b-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-pg6tl\" (UID: \"03d2ebde-2a3d-4da9-b403-0ce83b04265b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-pg6tl"
Apr 23 01:15:28.984061 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:28.984037 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqlvf\" (UniqueName: \"kubernetes.io/projected/03d2ebde-2a3d-4da9-b403-0ce83b04265b-kube-api-access-cqlvf\") pod \"cert-manager-cainjector-68b757865b-pg6tl\" (UID: \"03d2ebde-2a3d-4da9-b403-0ce83b04265b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-pg6tl"
Apr 23 01:15:29.049912 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:29.049802 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-pg6tl"
Apr 23 01:15:29.172836 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:29.172671 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-pg6tl"]
Apr 23 01:15:29.175666 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:15:29.175634 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03d2ebde_2a3d_4da9_b403_0ce83b04265b.slice/crio-76f47bc8210f8a141749383702519fbf1b9237ad4d0706499c651f9d16ff0cf5 WatchSource:0}: Error finding container 76f47bc8210f8a141749383702519fbf1b9237ad4d0706499c651f9d16ff0cf5: Status 404 returned error can't find the container with id 76f47bc8210f8a141749383702519fbf1b9237ad4d0706499c651f9d16ff0cf5
Apr 23 01:15:29.384882 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:29.384800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-pg6tl" event={"ID":"03d2ebde-2a3d-4da9-b403-0ce83b04265b","Type":"ContainerStarted","Data":"76f47bc8210f8a141749383702519fbf1b9237ad4d0706499c651f9d16ff0cf5"}
Apr 23 01:15:32.400256 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:32.400215 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-pg6tl" event={"ID":"03d2ebde-2a3d-4da9-b403-0ce83b04265b","Type":"ContainerStarted","Data":"2fe37da0f0624d7f3c518848ed941d4558552c1035a5efc3b5c54453140f54b8"}
Apr 23 01:15:32.419020 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:32.418946 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-pg6tl" podStartSLOduration=1.648489527 podStartE2EDuration="4.418930772s" podCreationTimestamp="2026-04-23 01:15:28 +0000 UTC" firstStartedPulling="2026-04-23 01:15:29.177530582 +0000 UTC m=+328.352326196" lastFinishedPulling="2026-04-23 01:15:31.947971828 +0000 UTC m=+331.122767441" observedRunningTime="2026-04-23 01:15:32.418138012 +0000 UTC m=+331.592933645" watchObservedRunningTime="2026-04-23 01:15:32.418930772 +0000 UTC m=+331.593726416"
Apr 23 01:15:41.232004 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:41.231904 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-tmlz7"]
Apr 23 01:15:41.235262 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:41.235242 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-tmlz7"
Apr 23 01:15:41.237564 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:41.237540 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-pph5w\""
Apr 23 01:15:41.241752 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:41.241723 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-tmlz7"]
Apr 23 01:15:41.273019 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:41.272963 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a35d71c-b1ec-459c-a43a-f6873858eeba-bound-sa-token\") pod \"cert-manager-79c8d999ff-tmlz7\" (UID: \"6a35d71c-b1ec-459c-a43a-f6873858eeba\") " pod="cert-manager/cert-manager-79c8d999ff-tmlz7"
Apr 23 01:15:41.273019 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:41.273020 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6svcm\" (UniqueName: \"kubernetes.io/projected/6a35d71c-b1ec-459c-a43a-f6873858eeba-kube-api-access-6svcm\") pod \"cert-manager-79c8d999ff-tmlz7\" (UID: \"6a35d71c-b1ec-459c-a43a-f6873858eeba\") " pod="cert-manager/cert-manager-79c8d999ff-tmlz7"
Apr 23 01:15:41.373435 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:41.373403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a35d71c-b1ec-459c-a43a-f6873858eeba-bound-sa-token\") pod \"cert-manager-79c8d999ff-tmlz7\" (UID: \"6a35d71c-b1ec-459c-a43a-f6873858eeba\") " pod="cert-manager/cert-manager-79c8d999ff-tmlz7"
Apr 23 01:15:41.373435 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:41.373440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6svcm\" (UniqueName: \"kubernetes.io/projected/6a35d71c-b1ec-459c-a43a-f6873858eeba-kube-api-access-6svcm\") pod \"cert-manager-79c8d999ff-tmlz7\" (UID: \"6a35d71c-b1ec-459c-a43a-f6873858eeba\") " pod="cert-manager/cert-manager-79c8d999ff-tmlz7"
Apr 23 01:15:41.381495 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:41.381461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a35d71c-b1ec-459c-a43a-f6873858eeba-bound-sa-token\") pod \"cert-manager-79c8d999ff-tmlz7\" (UID: \"6a35d71c-b1ec-459c-a43a-f6873858eeba\") " pod="cert-manager/cert-manager-79c8d999ff-tmlz7"
Apr 23 01:15:41.381661 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:41.381622 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6svcm\" (UniqueName: \"kubernetes.io/projected/6a35d71c-b1ec-459c-a43a-f6873858eeba-kube-api-access-6svcm\") pod \"cert-manager-79c8d999ff-tmlz7\" (UID: \"6a35d71c-b1ec-459c-a43a-f6873858eeba\") " pod="cert-manager/cert-manager-79c8d999ff-tmlz7"
Apr 23 01:15:41.545350 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:41.545254 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-tmlz7"
Apr 23 01:15:41.669251 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:41.669220 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-tmlz7"]
Apr 23 01:15:41.672448 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:15:41.672421 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a35d71c_b1ec_459c_a43a_f6873858eeba.slice/crio-44c97fc2a8e953c6c2357d4b4fd89f0e3d4e3be03f97e878aebd5bfa7de5af5e WatchSource:0}: Error finding container 44c97fc2a8e953c6c2357d4b4fd89f0e3d4e3be03f97e878aebd5bfa7de5af5e: Status 404 returned error can't find the container with id 44c97fc2a8e953c6c2357d4b4fd89f0e3d4e3be03f97e878aebd5bfa7de5af5e
Apr 23 01:15:42.430265 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:42.430231 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-tmlz7" event={"ID":"6a35d71c-b1ec-459c-a43a-f6873858eeba","Type":"ContainerStarted","Data":"f5830c34ffe1af9145a3524651749ecc80992237652c46876344ee09af4ef5ba"}
Apr 23 01:15:42.430265 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:42.430269 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-tmlz7" event={"ID":"6a35d71c-b1ec-459c-a43a-f6873858eeba","Type":"ContainerStarted","Data":"44c97fc2a8e953c6c2357d4b4fd89f0e3d4e3be03f97e878aebd5bfa7de5af5e"}
Apr 23 01:15:42.446821 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:42.446768 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-tmlz7" podStartSLOduration=1.446753695 podStartE2EDuration="1.446753695s" podCreationTimestamp="2026-04-23 01:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:15:42.445474112 +0000 UTC m=+341.620269744" watchObservedRunningTime="2026-04-23 01:15:42.446753695 +0000 UTC m=+341.621549327"
Apr 23 01:15:53.681645 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.681610 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"]
Apr 23 01:15:53.688947 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.688921 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"
Apr 23 01:15:53.693143 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.693114 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 23 01:15:53.693143 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.693132 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 23 01:15:53.693328 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.693262 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 23 01:15:53.693391 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.693369 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 23 01:15:53.693900 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.693880 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-hmfjw\""
Apr 23 01:15:53.708351 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.708322 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"]
Apr 23 01:15:53.765221 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.765188 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/11222ec8-cf6a-464c-933d-8706b7de04c5-webhook-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-sk657\" (UID: \"11222ec8-cf6a-464c-933d-8706b7de04c5\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"
Apr 23 01:15:53.765380 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.765226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9dkc\" (UniqueName: \"kubernetes.io/projected/11222ec8-cf6a-464c-933d-8706b7de04c5-kube-api-access-x9dkc\") pod \"opendatahub-operator-controller-manager-5fb5768b86-sk657\" (UID: \"11222ec8-cf6a-464c-933d-8706b7de04c5\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"
Apr 23 01:15:53.765380 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.765312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/11222ec8-cf6a-464c-933d-8706b7de04c5-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-sk657\" (UID: \"11222ec8-cf6a-464c-933d-8706b7de04c5\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"
Apr 23 01:15:53.866036 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.865973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/11222ec8-cf6a-464c-933d-8706b7de04c5-webhook-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-sk657\" (UID: \"11222ec8-cf6a-464c-933d-8706b7de04c5\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"
Apr 23 01:15:53.866223 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.866045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9dkc\" (UniqueName: \"kubernetes.io/projected/11222ec8-cf6a-464c-933d-8706b7de04c5-kube-api-access-x9dkc\") pod \"opendatahub-operator-controller-manager-5fb5768b86-sk657\" (UID: \"11222ec8-cf6a-464c-933d-8706b7de04c5\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"
Apr 23 01:15:53.866223 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.866083 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/11222ec8-cf6a-464c-933d-8706b7de04c5-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-sk657\" (UID: \"11222ec8-cf6a-464c-933d-8706b7de04c5\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"
Apr 23 01:15:53.868503 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.868470 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/11222ec8-cf6a-464c-933d-8706b7de04c5-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-sk657\" (UID: \"11222ec8-cf6a-464c-933d-8706b7de04c5\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"
Apr 23 01:15:53.868622 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.868582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/11222ec8-cf6a-464c-933d-8706b7de04c5-webhook-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-sk657\" (UID: \"11222ec8-cf6a-464c-933d-8706b7de04c5\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"
Apr 23 01:15:53.880237 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.880214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9dkc\" (UniqueName: \"kubernetes.io/projected/11222ec8-cf6a-464c-933d-8706b7de04c5-kube-api-access-x9dkc\") pod \"opendatahub-operator-controller-manager-5fb5768b86-sk657\" (UID: \"11222ec8-cf6a-464c-933d-8706b7de04c5\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"
Apr 23 01:15:54.000032 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:53.999974 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"
Apr 23 01:15:54.125761 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:54.125725 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"]
Apr 23 01:15:54.130017 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:15:54.129969 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11222ec8_cf6a_464c_933d_8706b7de04c5.slice/crio-c6745dcffcf36f1d9828b96d3a6086cb3c447f33a2fdb187596235c94ebcda6b WatchSource:0}: Error finding container c6745dcffcf36f1d9828b96d3a6086cb3c447f33a2fdb187596235c94ebcda6b: Status 404 returned error can't find the container with id c6745dcffcf36f1d9828b96d3a6086cb3c447f33a2fdb187596235c94ebcda6b
Apr 23 01:15:54.470654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:54.470604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657" event={"ID":"11222ec8-cf6a-464c-933d-8706b7de04c5","Type":"ContainerStarted","Data":"c6745dcffcf36f1d9828b96d3a6086cb3c447f33a2fdb187596235c94ebcda6b"}
Apr 23 01:15:57.483896 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:57.483859 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657" event={"ID":"11222ec8-cf6a-464c-933d-8706b7de04c5","Type":"ContainerStarted","Data":"476cbd02ac9eec4d376688ad853eb1afd4cc24af8c9c54d36b9ee62cad2177b0"}
Apr 23 01:15:57.484317 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:57.484028 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"
Apr 23 01:15:57.505881 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:15:57.505830 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657" podStartSLOduration=1.890061814 podStartE2EDuration="4.505814434s" podCreationTimestamp="2026-04-23 01:15:53 +0000 UTC" firstStartedPulling="2026-04-23 01:15:54.131783987 +0000 UTC m=+353.306579601" lastFinishedPulling="2026-04-23 01:15:56.747536609 +0000 UTC m=+355.922332221" observedRunningTime="2026-04-23 01:15:57.504143118 +0000 UTC m=+356.678938750" watchObservedRunningTime="2026-04-23 01:15:57.505814434 +0000 UTC m=+356.680610067"
Apr 23 01:16:08.493930 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:08.493899 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-sk657"
Apr 23 01:16:12.012221 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.012186 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-8596599875-qbwwc"]
Apr 23 01:16:12.015246 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.015227 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-8596599875-qbwwc"
Apr 23 01:16:12.018818 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.018794 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 23 01:16:12.018818 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.018794 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-nmrtz\""
Apr 23 01:16:12.019024 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.018820 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 23 01:16:12.019024 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.018824 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 23 01:16:12.019024 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.018796 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 23 01:16:12.025361 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.025336 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-8596599875-qbwwc"]
Apr 23 01:16:12.123641 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.123611 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e3a1be14-d458-4b2d-ab99-162e9f69c4ad-tmp\") pod \"kube-auth-proxy-8596599875-qbwwc\" (UID: \"e3a1be14-d458-4b2d-ab99-162e9f69c4ad\") " pod="openshift-ingress/kube-auth-proxy-8596599875-qbwwc"
Apr 23 01:16:12.123830 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.123649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dp8b\" (UniqueName: \"kubernetes.io/projected/e3a1be14-d458-4b2d-ab99-162e9f69c4ad-kube-api-access-2dp8b\") pod \"kube-auth-proxy-8596599875-qbwwc\" (UID: \"e3a1be14-d458-4b2d-ab99-162e9f69c4ad\") " pod="openshift-ingress/kube-auth-proxy-8596599875-qbwwc"
Apr 23 01:16:12.123830 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.123727 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e3a1be14-d458-4b2d-ab99-162e9f69c4ad-tls-certs\") pod \"kube-auth-proxy-8596599875-qbwwc\" (UID: \"e3a1be14-d458-4b2d-ab99-162e9f69c4ad\") " pod="openshift-ingress/kube-auth-proxy-8596599875-qbwwc"
Apr 23 01:16:12.225161 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.225110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e3a1be14-d458-4b2d-ab99-162e9f69c4ad-tmp\") pod \"kube-auth-proxy-8596599875-qbwwc\" (UID: \"e3a1be14-d458-4b2d-ab99-162e9f69c4ad\") " pod="openshift-ingress/kube-auth-proxy-8596599875-qbwwc"
Apr 23 01:16:12.225161 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.225169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dp8b\" (UniqueName: \"kubernetes.io/projected/e3a1be14-d458-4b2d-ab99-162e9f69c4ad-kube-api-access-2dp8b\") pod \"kube-auth-proxy-8596599875-qbwwc\" (UID: \"e3a1be14-d458-4b2d-ab99-162e9f69c4ad\") " pod="openshift-ingress/kube-auth-proxy-8596599875-qbwwc"
Apr 23 01:16:12.225439 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.225218 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e3a1be14-d458-4b2d-ab99-162e9f69c4ad-tls-certs\") pod \"kube-auth-proxy-8596599875-qbwwc\" (UID: \"e3a1be14-d458-4b2d-ab99-162e9f69c4ad\") " pod="openshift-ingress/kube-auth-proxy-8596599875-qbwwc"
Apr 23 01:16:12.227535 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.227511 2576
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e3a1be14-d458-4b2d-ab99-162e9f69c4ad-tmp\") pod \"kube-auth-proxy-8596599875-qbwwc\" (UID: \"e3a1be14-d458-4b2d-ab99-162e9f69c4ad\") " pod="openshift-ingress/kube-auth-proxy-8596599875-qbwwc" Apr 23 01:16:12.227743 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.227724 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e3a1be14-d458-4b2d-ab99-162e9f69c4ad-tls-certs\") pod \"kube-auth-proxy-8596599875-qbwwc\" (UID: \"e3a1be14-d458-4b2d-ab99-162e9f69c4ad\") " pod="openshift-ingress/kube-auth-proxy-8596599875-qbwwc" Apr 23 01:16:12.232438 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.232412 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dp8b\" (UniqueName: \"kubernetes.io/projected/e3a1be14-d458-4b2d-ab99-162e9f69c4ad-kube-api-access-2dp8b\") pod \"kube-auth-proxy-8596599875-qbwwc\" (UID: \"e3a1be14-d458-4b2d-ab99-162e9f69c4ad\") " pod="openshift-ingress/kube-auth-proxy-8596599875-qbwwc" Apr 23 01:16:12.326518 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.326423 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-8596599875-qbwwc" Apr 23 01:16:12.448975 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.448935 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-8596599875-qbwwc"] Apr 23 01:16:12.452315 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:16:12.452279 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3a1be14_d458_4b2d_ab99_162e9f69c4ad.slice/crio-4e991d85b7fea64d7ad1bdd5ed1b5d3ce63f9ae4919d36b2d2e3a2bee74ec169 WatchSource:0}: Error finding container 4e991d85b7fea64d7ad1bdd5ed1b5d3ce63f9ae4919d36b2d2e3a2bee74ec169: Status 404 returned error can't find the container with id 4e991d85b7fea64d7ad1bdd5ed1b5d3ce63f9ae4919d36b2d2e3a2bee74ec169 Apr 23 01:16:12.539292 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:12.539254 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-8596599875-qbwwc" event={"ID":"e3a1be14-d458-4b2d-ab99-162e9f69c4ad","Type":"ContainerStarted","Data":"4e991d85b7fea64d7ad1bdd5ed1b5d3ce63f9ae4919d36b2d2e3a2bee74ec169"} Apr 23 01:16:14.916485 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:14.916442 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-pmk8s"] Apr 23 01:16:14.919924 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:14.919901 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" Apr 23 01:16:14.922317 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:14.922294 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 23 01:16:14.922317 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:14.922306 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-fbsj9\"" Apr 23 01:16:14.931278 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:14.931252 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-pmk8s"] Apr 23 01:16:15.048296 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:15.048262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86503cb2-857e-4924-af95-6f47288ea75d-cert\") pod \"odh-model-controller-858dbf95b8-pmk8s\" (UID: \"86503cb2-857e-4924-af95-6f47288ea75d\") " pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" Apr 23 01:16:15.048463 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:15.048342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw4m2\" (UniqueName: \"kubernetes.io/projected/86503cb2-857e-4924-af95-6f47288ea75d-kube-api-access-cw4m2\") pod \"odh-model-controller-858dbf95b8-pmk8s\" (UID: \"86503cb2-857e-4924-af95-6f47288ea75d\") " pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" Apr 23 01:16:15.148945 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:15.148906 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cw4m2\" (UniqueName: \"kubernetes.io/projected/86503cb2-857e-4924-af95-6f47288ea75d-kube-api-access-cw4m2\") pod \"odh-model-controller-858dbf95b8-pmk8s\" (UID: \"86503cb2-857e-4924-af95-6f47288ea75d\") " 
pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" Apr 23 01:16:15.149138 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:15.148958 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86503cb2-857e-4924-af95-6f47288ea75d-cert\") pod \"odh-model-controller-858dbf95b8-pmk8s\" (UID: \"86503cb2-857e-4924-af95-6f47288ea75d\") " pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" Apr 23 01:16:15.149138 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:16:15.149096 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 23 01:16:15.149214 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:16:15.149186 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86503cb2-857e-4924-af95-6f47288ea75d-cert podName:86503cb2-857e-4924-af95-6f47288ea75d nodeName:}" failed. No retries permitted until 2026-04-23 01:16:15.649164214 +0000 UTC m=+374.823959824 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/86503cb2-857e-4924-af95-6f47288ea75d-cert") pod "odh-model-controller-858dbf95b8-pmk8s" (UID: "86503cb2-857e-4924-af95-6f47288ea75d") : secret "odh-model-controller-webhook-cert" not found Apr 23 01:16:15.157544 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:15.157520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw4m2\" (UniqueName: \"kubernetes.io/projected/86503cb2-857e-4924-af95-6f47288ea75d-kube-api-access-cw4m2\") pod \"odh-model-controller-858dbf95b8-pmk8s\" (UID: \"86503cb2-857e-4924-af95-6f47288ea75d\") " pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" Apr 23 01:16:15.652711 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:15.652669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86503cb2-857e-4924-af95-6f47288ea75d-cert\") pod \"odh-model-controller-858dbf95b8-pmk8s\" (UID: \"86503cb2-857e-4924-af95-6f47288ea75d\") " pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" Apr 23 01:16:15.652893 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:16:15.652834 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 23 01:16:15.652944 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:16:15.652929 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86503cb2-857e-4924-af95-6f47288ea75d-cert podName:86503cb2-857e-4924-af95-6f47288ea75d nodeName:}" failed. No retries permitted until 2026-04-23 01:16:16.652908495 +0000 UTC m=+375.827704105 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/86503cb2-857e-4924-af95-6f47288ea75d-cert") pod "odh-model-controller-858dbf95b8-pmk8s" (UID: "86503cb2-857e-4924-af95-6f47288ea75d") : secret "odh-model-controller-webhook-cert" not found Apr 23 01:16:16.552661 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:16.552623 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-8596599875-qbwwc" event={"ID":"e3a1be14-d458-4b2d-ab99-162e9f69c4ad","Type":"ContainerStarted","Data":"012a617b29faed0dce8a45ec05134c7286ecd5739d3d4f077a6672edb87095f9"} Apr 23 01:16:16.569220 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:16.569122 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-8596599875-qbwwc" podStartSLOduration=1.71168466 podStartE2EDuration="5.569105957s" podCreationTimestamp="2026-04-23 01:16:11 +0000 UTC" firstStartedPulling="2026-04-23 01:16:12.454006859 +0000 UTC m=+371.628802469" lastFinishedPulling="2026-04-23 01:16:16.311428153 +0000 UTC m=+375.486223766" observedRunningTime="2026-04-23 01:16:16.566949236 +0000 UTC m=+375.741744868" watchObservedRunningTime="2026-04-23 01:16:16.569105957 +0000 UTC m=+375.743901588" Apr 23 01:16:16.662789 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:16.662743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86503cb2-857e-4924-af95-6f47288ea75d-cert\") pod \"odh-model-controller-858dbf95b8-pmk8s\" (UID: \"86503cb2-857e-4924-af95-6f47288ea75d\") " pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" Apr 23 01:16:16.665161 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:16.665142 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86503cb2-857e-4924-af95-6f47288ea75d-cert\") pod \"odh-model-controller-858dbf95b8-pmk8s\" (UID: 
\"86503cb2-857e-4924-af95-6f47288ea75d\") " pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" Apr 23 01:16:16.731586 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:16.731546 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" Apr 23 01:16:16.873491 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:16.873465 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-pmk8s"] Apr 23 01:16:16.875673 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:16:16.875649 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86503cb2_857e_4924_af95_6f47288ea75d.slice/crio-3bfd5bb11aba6225c9bf38d138bab28ec4934fbec6f64fe7ad46cae6306b925b WatchSource:0}: Error finding container 3bfd5bb11aba6225c9bf38d138bab28ec4934fbec6f64fe7ad46cae6306b925b: Status 404 returned error can't find the container with id 3bfd5bb11aba6225c9bf38d138bab28ec4934fbec6f64fe7ad46cae6306b925b Apr 23 01:16:17.557783 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:17.557743 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" event={"ID":"86503cb2-857e-4924-af95-6f47288ea75d","Type":"ContainerStarted","Data":"3bfd5bb11aba6225c9bf38d138bab28ec4934fbec6f64fe7ad46cae6306b925b"} Apr 23 01:16:20.521049 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:20.521007 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-ffwjb"] Apr 23 01:16:20.524236 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:20.524217 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-ffwjb" Apr 23 01:16:20.526536 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:20.526505 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 23 01:16:20.526671 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:20.526624 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-wbf85\"" Apr 23 01:16:20.532565 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:20.532531 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-ffwjb"] Apr 23 01:16:20.570509 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:20.570474 2576 generic.go:358] "Generic (PLEG): container finished" podID="86503cb2-857e-4924-af95-6f47288ea75d" containerID="d479500d14625a8f8b55eb84205930ff915ac21163f983fb53762ad1fcafcb72" exitCode=1 Apr 23 01:16:20.570712 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:20.570572 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" event={"ID":"86503cb2-857e-4924-af95-6f47288ea75d","Type":"ContainerDied","Data":"d479500d14625a8f8b55eb84205930ff915ac21163f983fb53762ad1fcafcb72"} Apr 23 01:16:20.570854 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:20.570838 2576 scope.go:117] "RemoveContainer" containerID="d479500d14625a8f8b55eb84205930ff915ac21163f983fb53762ad1fcafcb72" Apr 23 01:16:20.601166 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:20.601124 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgn2x\" (UniqueName: \"kubernetes.io/projected/601a9f2f-90f1-4817-93d6-7215a72f670e-kube-api-access-rgn2x\") pod \"kserve-controller-manager-856948b99f-ffwjb\" (UID: \"601a9f2f-90f1-4817-93d6-7215a72f670e\") " pod="opendatahub/kserve-controller-manager-856948b99f-ffwjb" Apr 
23 01:16:20.601166 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:20.601171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/601a9f2f-90f1-4817-93d6-7215a72f670e-cert\") pod \"kserve-controller-manager-856948b99f-ffwjb\" (UID: \"601a9f2f-90f1-4817-93d6-7215a72f670e\") " pod="opendatahub/kserve-controller-manager-856948b99f-ffwjb" Apr 23 01:16:20.702153 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:20.702103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/601a9f2f-90f1-4817-93d6-7215a72f670e-cert\") pod \"kserve-controller-manager-856948b99f-ffwjb\" (UID: \"601a9f2f-90f1-4817-93d6-7215a72f670e\") " pod="opendatahub/kserve-controller-manager-856948b99f-ffwjb" Apr 23 01:16:20.702368 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:20.702253 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgn2x\" (UniqueName: \"kubernetes.io/projected/601a9f2f-90f1-4817-93d6-7215a72f670e-kube-api-access-rgn2x\") pod \"kserve-controller-manager-856948b99f-ffwjb\" (UID: \"601a9f2f-90f1-4817-93d6-7215a72f670e\") " pod="opendatahub/kserve-controller-manager-856948b99f-ffwjb" Apr 23 01:16:20.702368 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:16:20.702295 2576 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 23 01:16:20.702490 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:16:20.702374 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/601a9f2f-90f1-4817-93d6-7215a72f670e-cert podName:601a9f2f-90f1-4817-93d6-7215a72f670e nodeName:}" failed. No retries permitted until 2026-04-23 01:16:21.202350186 +0000 UTC m=+380.377145802 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/601a9f2f-90f1-4817-93d6-7215a72f670e-cert") pod "kserve-controller-manager-856948b99f-ffwjb" (UID: "601a9f2f-90f1-4817-93d6-7215a72f670e") : secret "kserve-webhook-server-cert" not found Apr 23 01:16:20.713204 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:20.713171 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgn2x\" (UniqueName: \"kubernetes.io/projected/601a9f2f-90f1-4817-93d6-7215a72f670e-kube-api-access-rgn2x\") pod \"kserve-controller-manager-856948b99f-ffwjb\" (UID: \"601a9f2f-90f1-4817-93d6-7215a72f670e\") " pod="opendatahub/kserve-controller-manager-856948b99f-ffwjb" Apr 23 01:16:21.207034 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:21.206965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/601a9f2f-90f1-4817-93d6-7215a72f670e-cert\") pod \"kserve-controller-manager-856948b99f-ffwjb\" (UID: \"601a9f2f-90f1-4817-93d6-7215a72f670e\") " pod="opendatahub/kserve-controller-manager-856948b99f-ffwjb" Apr 23 01:16:21.209407 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:21.209382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/601a9f2f-90f1-4817-93d6-7215a72f670e-cert\") pod \"kserve-controller-manager-856948b99f-ffwjb\" (UID: \"601a9f2f-90f1-4817-93d6-7215a72f670e\") " pod="opendatahub/kserve-controller-manager-856948b99f-ffwjb" Apr 23 01:16:21.436600 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:21.436560 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-ffwjb" Apr 23 01:16:21.573017 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:21.572942 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-ffwjb"] Apr 23 01:16:21.576781 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:16:21.576734 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod601a9f2f_90f1_4817_93d6_7215a72f670e.slice/crio-abc87c63123b156224b0bcd48952e1188a1ad55f781c35ed2835d4493f264711 WatchSource:0}: Error finding container abc87c63123b156224b0bcd48952e1188a1ad55f781c35ed2835d4493f264711: Status 404 returned error can't find the container with id abc87c63123b156224b0bcd48952e1188a1ad55f781c35ed2835d4493f264711 Apr 23 01:16:21.577041 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:21.577011 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" event={"ID":"86503cb2-857e-4924-af95-6f47288ea75d","Type":"ContainerStarted","Data":"aa851deeaa85a413b94cc9ace9db6ca73783b5738c6769bd4fcf6b3663a0397a"} Apr 23 01:16:21.577164 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:21.577148 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" Apr 23 01:16:21.593487 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:21.593430 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" podStartSLOduration=3.5963392819999997 podStartE2EDuration="7.593406982s" podCreationTimestamp="2026-04-23 01:16:14 +0000 UTC" firstStartedPulling="2026-04-23 01:16:16.877133773 +0000 UTC m=+376.051929382" lastFinishedPulling="2026-04-23 01:16:20.874201472 +0000 UTC m=+380.048997082" observedRunningTime="2026-04-23 01:16:21.592519684 +0000 UTC m=+380.767315316" 
watchObservedRunningTime="2026-04-23 01:16:21.593406982 +0000 UTC m=+380.768202627" Apr 23 01:16:22.581689 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:22.581651 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-ffwjb" event={"ID":"601a9f2f-90f1-4817-93d6-7215a72f670e","Type":"ContainerStarted","Data":"abc87c63123b156224b0bcd48952e1188a1ad55f781c35ed2835d4493f264711"} Apr 23 01:16:24.589488 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:24.589396 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-ffwjb" event={"ID":"601a9f2f-90f1-4817-93d6-7215a72f670e","Type":"ContainerStarted","Data":"c78054494558c77801bb88efbaa77115fc8a32ecc5b0ad6610fdbc80c65cf2eb"} Apr 23 01:16:24.589874 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:24.589536 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-ffwjb" Apr 23 01:16:24.612089 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:24.612025 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-ffwjb" podStartSLOduration=1.9614625970000001 podStartE2EDuration="4.611999389s" podCreationTimestamp="2026-04-23 01:16:20 +0000 UTC" firstStartedPulling="2026-04-23 01:16:21.578389748 +0000 UTC m=+380.753185360" lastFinishedPulling="2026-04-23 01:16:24.228926531 +0000 UTC m=+383.403722152" observedRunningTime="2026-04-23 01:16:24.610500746 +0000 UTC m=+383.785296378" watchObservedRunningTime="2026-04-23 01:16:24.611999389 +0000 UTC m=+383.786795017" Apr 23 01:16:25.855745 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:25.855705 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p"] Apr 23 01:16:25.859010 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:25.858978 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p" Apr 23 01:16:25.861501 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:25.861477 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 23 01:16:25.861814 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:25.861792 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 23 01:16:25.861910 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:25.861890 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-5jpp9\"" Apr 23 01:16:25.870460 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:25.870432 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p"] Apr 23 01:16:25.946202 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:25.946167 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/06dd7460-5595-4289-ab0b-9551d9e1d0d2-operator-config\") pod \"servicemesh-operator3-55f49c5f94-f7n8p\" (UID: \"06dd7460-5595-4289-ab0b-9551d9e1d0d2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p" Apr 23 01:16:25.946202 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:25.946206 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzhdv\" (UniqueName: \"kubernetes.io/projected/06dd7460-5595-4289-ab0b-9551d9e1d0d2-kube-api-access-dzhdv\") pod \"servicemesh-operator3-55f49c5f94-f7n8p\" (UID: \"06dd7460-5595-4289-ab0b-9551d9e1d0d2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p" Apr 23 01:16:26.046818 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:26.046783 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dzhdv\" (UniqueName: \"kubernetes.io/projected/06dd7460-5595-4289-ab0b-9551d9e1d0d2-kube-api-access-dzhdv\") pod \"servicemesh-operator3-55f49c5f94-f7n8p\" (UID: \"06dd7460-5595-4289-ab0b-9551d9e1d0d2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p" Apr 23 01:16:26.046961 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:26.046899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/06dd7460-5595-4289-ab0b-9551d9e1d0d2-operator-config\") pod \"servicemesh-operator3-55f49c5f94-f7n8p\" (UID: \"06dd7460-5595-4289-ab0b-9551d9e1d0d2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p" Apr 23 01:16:26.049451 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:26.049417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/06dd7460-5595-4289-ab0b-9551d9e1d0d2-operator-config\") pod \"servicemesh-operator3-55f49c5f94-f7n8p\" (UID: \"06dd7460-5595-4289-ab0b-9551d9e1d0d2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p" Apr 23 01:16:26.055770 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:26.055746 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzhdv\" (UniqueName: \"kubernetes.io/projected/06dd7460-5595-4289-ab0b-9551d9e1d0d2-kube-api-access-dzhdv\") pod \"servicemesh-operator3-55f49c5f94-f7n8p\" (UID: \"06dd7460-5595-4289-ab0b-9551d9e1d0d2\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p" Apr 23 01:16:26.168689 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:26.168589 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p" Apr 23 01:16:26.300823 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:26.300797 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p"] Apr 23 01:16:26.303648 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:16:26.303615 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06dd7460_5595_4289_ab0b_9551d9e1d0d2.slice/crio-0d77528ce0aeddb520ef29efc9f97c3cc90150ecb57134fc585d9837d21a246c WatchSource:0}: Error finding container 0d77528ce0aeddb520ef29efc9f97c3cc90150ecb57134fc585d9837d21a246c: Status 404 returned error can't find the container with id 0d77528ce0aeddb520ef29efc9f97c3cc90150ecb57134fc585d9837d21a246c Apr 23 01:16:26.596708 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:26.596668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p" event={"ID":"06dd7460-5595-4289-ab0b-9551d9e1d0d2","Type":"ContainerStarted","Data":"0d77528ce0aeddb520ef29efc9f97c3cc90150ecb57134fc585d9837d21a246c"} Apr 23 01:16:29.609459 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:29.609371 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p" event={"ID":"06dd7460-5595-4289-ab0b-9551d9e1d0d2","Type":"ContainerStarted","Data":"ace8cc15dc7f788c68e72a56cd1edf06094021556de592d7cd980230cf89f62f"} Apr 23 01:16:29.609825 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:29.609509 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p" Apr 23 01:16:29.629446 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:29.629396 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p" 
podStartSLOduration=1.711673931 podStartE2EDuration="4.629380761s" podCreationTimestamp="2026-04-23 01:16:25 +0000 UTC" firstStartedPulling="2026-04-23 01:16:26.306383233 +0000 UTC m=+385.481178844" lastFinishedPulling="2026-04-23 01:16:29.224090065 +0000 UTC m=+388.398885674" observedRunningTime="2026-04-23 01:16:29.627186888 +0000 UTC m=+388.801982501" watchObservedRunningTime="2026-04-23 01:16:29.629380761 +0000 UTC m=+388.804176393" Apr 23 01:16:32.584671 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:32.584636 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-pmk8s" Apr 23 01:16:37.619266 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.619223 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp"] Apr 23 01:16:37.625311 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.625283 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.627450 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.627424 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 23 01:16:37.627604 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.627585 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 23 01:16:37.627677 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.627618 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 23 01:16:37.627677 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.627637 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 23 01:16:37.627946 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.627925 2576 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-5tktg\"" Apr 23 01:16:37.634478 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.634446 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp"] Apr 23 01:16:37.755161 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.755122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2612f981-16aa-4f30-9b65-9ccad11b8949-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.755336 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.755165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6dt6\" (UniqueName: \"kubernetes.io/projected/2612f981-16aa-4f30-9b65-9ccad11b8949-kube-api-access-g6dt6\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.755336 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.755200 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/2612f981-16aa-4f30-9b65-9ccad11b8949-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.755336 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.755248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/2612f981-16aa-4f30-9b65-9ccad11b8949-istio-csr-dns-cert\") pod 
\"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.755336 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.755265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/2612f981-16aa-4f30-9b65-9ccad11b8949-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.755336 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.755333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/2612f981-16aa-4f30-9b65-9ccad11b8949-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.755494 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.755371 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2612f981-16aa-4f30-9b65-9ccad11b8949-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.856414 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.856377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2612f981-16aa-4f30-9b65-9ccad11b8949-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.856414 
ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.856414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6dt6\" (UniqueName: \"kubernetes.io/projected/2612f981-16aa-4f30-9b65-9ccad11b8949-kube-api-access-g6dt6\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.856654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.856434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/2612f981-16aa-4f30-9b65-9ccad11b8949-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.856654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.856457 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/2612f981-16aa-4f30-9b65-9ccad11b8949-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.856654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.856476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/2612f981-16aa-4f30-9b65-9ccad11b8949-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.856654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.856549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/2612f981-16aa-4f30-9b65-9ccad11b8949-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.856654 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.856575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2612f981-16aa-4f30-9b65-9ccad11b8949-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.857523 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.857490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/2612f981-16aa-4f30-9b65-9ccad11b8949-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.858930 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.858898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/2612f981-16aa-4f30-9b65-9ccad11b8949-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.859098 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.859073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/2612f981-16aa-4f30-9b65-9ccad11b8949-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.859289 
ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.859273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2612f981-16aa-4f30-9b65-9ccad11b8949-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.859328 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.859310 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/2612f981-16aa-4f30-9b65-9ccad11b8949-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.864599 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.864574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2612f981-16aa-4f30-9b65-9ccad11b8949-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.864599 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.864596 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6dt6\" (UniqueName: \"kubernetes.io/projected/2612f981-16aa-4f30-9b65-9ccad11b8949-kube-api-access-g6dt6\") pod \"istiod-openshift-gateway-55ff986f96-fjhtp\" (UID: \"2612f981-16aa-4f30-9b65-9ccad11b8949\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:37.936569 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:37.936473 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:38.068064 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:38.067875 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp"] Apr 23 01:16:38.070282 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:16:38.070252 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2612f981_16aa_4f30_9b65_9ccad11b8949.slice/crio-0962672fb39bd9d8260f28e8111bbbe41788a55b2c9bbd5d8c37df793d9cd51b WatchSource:0}: Error finding container 0962672fb39bd9d8260f28e8111bbbe41788a55b2c9bbd5d8c37df793d9cd51b: Status 404 returned error can't find the container with id 0962672fb39bd9d8260f28e8111bbbe41788a55b2c9bbd5d8c37df793d9cd51b Apr 23 01:16:38.643703 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:38.643662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" event={"ID":"2612f981-16aa-4f30-9b65-9ccad11b8949","Type":"ContainerStarted","Data":"0962672fb39bd9d8260f28e8111bbbe41788a55b2c9bbd5d8c37df793d9cd51b"} Apr 23 01:16:40.615803 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:40.615772 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7n8p" Apr 23 01:16:40.688125 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:40.688070 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 23 01:16:40.688244 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:40.688159 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 23 01:16:41.659100 ip-10-0-137-21 
kubenswrapper[2576]: I0423 01:16:41.659050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" event={"ID":"2612f981-16aa-4f30-9b65-9ccad11b8949","Type":"ContainerStarted","Data":"bd05165fd3a38539371026bb5d79ff5ccf1aa4a844c8fd4970b2e7d0632479e7"} Apr 23 01:16:41.659575 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:41.659255 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:41.660903 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:41.660878 2576 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-fjhtp container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 23 01:16:41.661062 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:41.660928 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" podUID="2612f981-16aa-4f30-9b65-9ccad11b8949" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 01:16:41.687293 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:41.687232 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" podStartSLOduration=2.071683617 podStartE2EDuration="4.687214361s" podCreationTimestamp="2026-04-23 01:16:37 +0000 UTC" firstStartedPulling="2026-04-23 01:16:38.072242268 +0000 UTC m=+397.247037878" lastFinishedPulling="2026-04-23 01:16:40.687772998 +0000 UTC m=+399.862568622" observedRunningTime="2026-04-23 01:16:41.686430727 +0000 UTC m=+400.861226361" watchObservedRunningTime="2026-04-23 01:16:41.687214361 +0000 UTC m=+400.862009996" Apr 23 01:16:42.663659 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:42.663625 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjhtp" Apr 23 01:16:55.597614 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:16:55.597573 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-ffwjb" Apr 23 01:17:46.925450 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:17:46.925412 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv"] Apr 23 01:17:46.928737 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:17:46.928720 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" Apr 23 01:17:46.931168 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:17:46.931145 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 23 01:17:46.931289 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:17:46.931184 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 23 01:17:46.931925 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:17:46.931908 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-jqf9s\"" Apr 23 01:17:46.938371 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:17:46.938346 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv"] Apr 23 01:17:47.052515 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:17:47.052479 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcbkm\" (UniqueName: \"kubernetes.io/projected/d20f6dd9-77af-4551-b287-58486f3b54b8-kube-api-access-fcbkm\") pod \"limitador-operator-controller-manager-85c4996f8c-zdjmv\" (UID: 
\"d20f6dd9-77af-4551-b287-58486f3b54b8\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" Apr 23 01:17:47.153367 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:17:47.153317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcbkm\" (UniqueName: \"kubernetes.io/projected/d20f6dd9-77af-4551-b287-58486f3b54b8-kube-api-access-fcbkm\") pod \"limitador-operator-controller-manager-85c4996f8c-zdjmv\" (UID: \"d20f6dd9-77af-4551-b287-58486f3b54b8\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" Apr 23 01:17:47.169843 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:17:47.169800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcbkm\" (UniqueName: \"kubernetes.io/projected/d20f6dd9-77af-4551-b287-58486f3b54b8-kube-api-access-fcbkm\") pod \"limitador-operator-controller-manager-85c4996f8c-zdjmv\" (UID: \"d20f6dd9-77af-4551-b287-58486f3b54b8\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" Apr 23 01:17:47.240852 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:17:47.240816 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" Apr 23 01:17:47.369091 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:17:47.369060 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv"] Apr 23 01:17:47.371535 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:17:47.371509 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd20f6dd9_77af_4551_b287_58486f3b54b8.slice/crio-ee1c6136629eb96a1d9a4c6e1669b6f91fa3d79ead46dc5f23dbe8934020b126 WatchSource:0}: Error finding container ee1c6136629eb96a1d9a4c6e1669b6f91fa3d79ead46dc5f23dbe8934020b126: Status 404 returned error can't find the container with id ee1c6136629eb96a1d9a4c6e1669b6f91fa3d79ead46dc5f23dbe8934020b126 Apr 23 01:17:47.887743 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:17:47.887703 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" event={"ID":"d20f6dd9-77af-4551-b287-58486f3b54b8","Type":"ContainerStarted","Data":"ee1c6136629eb96a1d9a4c6e1669b6f91fa3d79ead46dc5f23dbe8934020b126"} Apr 23 01:17:49.896763 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:17:49.896669 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" event={"ID":"d20f6dd9-77af-4551-b287-58486f3b54b8","Type":"ContainerStarted","Data":"f719290b9e2952d1c7b459b6a6d9d7d3f87dc42734b30c4f90092f29d6d38f3d"} Apr 23 01:17:49.897172 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:17:49.896815 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" Apr 23 01:17:49.911448 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:17:49.911388 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" podStartSLOduration=1.7180791530000001 podStartE2EDuration="3.911370151s" podCreationTimestamp="2026-04-23 01:17:46 +0000 UTC" firstStartedPulling="2026-04-23 01:17:47.373479863 +0000 UTC m=+466.548275472" lastFinishedPulling="2026-04-23 01:17:49.566770847 +0000 UTC m=+468.741566470" observedRunningTime="2026-04-23 01:17:49.910605356 +0000 UTC m=+469.085401011" watchObservedRunningTime="2026-04-23 01:17:49.911370151 +0000 UTC m=+469.086165782" Apr 23 01:18:00.902114 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:00.902080 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" Apr 23 01:18:13.996865 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:13.996832 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv"] Apr 23 01:18:13.997425 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:13.997125 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" podUID="d20f6dd9-77af-4551-b287-58486f3b54b8" containerName="manager" containerID="cri-o://f719290b9e2952d1c7b459b6a6d9d7d3f87dc42734b30c4f90092f29d6d38f3d" gracePeriod=2 Apr 23 01:18:14.022959 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.022928 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv"] Apr 23 01:18:14.029109 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.029076 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-skwpw"] Apr 23 01:18:14.029596 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.029572 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="d20f6dd9-77af-4551-b287-58486f3b54b8" containerName="manager" Apr 23 01:18:14.029596 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.029598 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d20f6dd9-77af-4551-b287-58486f3b54b8" containerName="manager" Apr 23 01:18:14.029791 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.029707 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d20f6dd9-77af-4551-b287-58486f3b54b8" containerName="manager" Apr 23 01:18:14.032789 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.032770 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-skwpw" Apr 23 01:18:14.044126 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.044096 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-skwpw"] Apr 23 01:18:14.190745 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.190704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqbr7\" (UniqueName: \"kubernetes.io/projected/e6d04b86-cd57-409b-a905-d5596b672fde-kube-api-access-cqbr7\") pod \"limitador-operator-controller-manager-85c4996f8c-skwpw\" (UID: \"e6d04b86-cd57-409b-a905-d5596b672fde\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-skwpw" Apr 23 01:18:14.228128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.228102 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" Apr 23 01:18:14.230245 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.230220 2576 status_manager.go:895] "Failed to get status for pod" podUID="d20f6dd9-77af-4551-b287-58486f3b54b8" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" err="pods \"limitador-operator-controller-manager-85c4996f8c-zdjmv\" is forbidden: User \"system:node:ip-10-0-137-21.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-21.ec2.internal' and this object" Apr 23 01:18:14.292041 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.291952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqbr7\" (UniqueName: \"kubernetes.io/projected/e6d04b86-cd57-409b-a905-d5596b672fde-kube-api-access-cqbr7\") pod \"limitador-operator-controller-manager-85c4996f8c-skwpw\" (UID: \"e6d04b86-cd57-409b-a905-d5596b672fde\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-skwpw" Apr 23 01:18:14.303494 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.303460 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqbr7\" (UniqueName: \"kubernetes.io/projected/e6d04b86-cd57-409b-a905-d5596b672fde-kube-api-access-cqbr7\") pod \"limitador-operator-controller-manager-85c4996f8c-skwpw\" (UID: \"e6d04b86-cd57-409b-a905-d5596b672fde\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-skwpw" Apr 23 01:18:14.373475 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.373437 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-skwpw" Apr 23 01:18:14.392613 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.392584 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcbkm\" (UniqueName: \"kubernetes.io/projected/d20f6dd9-77af-4551-b287-58486f3b54b8-kube-api-access-fcbkm\") pod \"d20f6dd9-77af-4551-b287-58486f3b54b8\" (UID: \"d20f6dd9-77af-4551-b287-58486f3b54b8\") " Apr 23 01:18:14.394807 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.394781 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d20f6dd9-77af-4551-b287-58486f3b54b8-kube-api-access-fcbkm" (OuterVolumeSpecName: "kube-api-access-fcbkm") pod "d20f6dd9-77af-4551-b287-58486f3b54b8" (UID: "d20f6dd9-77af-4551-b287-58486f3b54b8"). InnerVolumeSpecName "kube-api-access-fcbkm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:18:14.493210 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.493182 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fcbkm\" (UniqueName: \"kubernetes.io/projected/d20f6dd9-77af-4551-b287-58486f3b54b8-kube-api-access-fcbkm\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\"" Apr 23 01:18:14.493895 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.493873 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-skwpw"] Apr 23 01:18:14.496809 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:18:14.496783 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d04b86_cd57_409b_a905_d5596b672fde.slice/crio-384ef492a07b5ace6c1389a1970f38daf2356c6485b4188828cb91a811145e51 WatchSource:0}: Error finding container 384ef492a07b5ace6c1389a1970f38daf2356c6485b4188828cb91a811145e51: Status 404 returned error can't find the container with id 
384ef492a07b5ace6c1389a1970f38daf2356c6485b4188828cb91a811145e51 Apr 23 01:18:14.981678 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.981642 2576 generic.go:358] "Generic (PLEG): container finished" podID="d20f6dd9-77af-4551-b287-58486f3b54b8" containerID="f719290b9e2952d1c7b459b6a6d9d7d3f87dc42734b30c4f90092f29d6d38f3d" exitCode=0 Apr 23 01:18:14.981873 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.981698 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" Apr 23 01:18:14.981873 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.981750 2576 scope.go:117] "RemoveContainer" containerID="f719290b9e2952d1c7b459b6a6d9d7d3f87dc42734b30c4f90092f29d6d38f3d" Apr 23 01:18:14.983333 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.983304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-skwpw" event={"ID":"e6d04b86-cd57-409b-a905-d5596b672fde","Type":"ContainerStarted","Data":"9997133fa940a5d12caf6b6ff507850e1f53007c3a64afbfc47f4a7441fb5b7b"} Apr 23 01:18:14.983442 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.983347 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-skwpw" event={"ID":"e6d04b86-cd57-409b-a905-d5596b672fde","Type":"ContainerStarted","Data":"384ef492a07b5ace6c1389a1970f38daf2356c6485b4188828cb91a811145e51"} Apr 23 01:18:14.983497 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.983450 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-skwpw" Apr 23 01:18:14.984258 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.984234 2576 status_manager.go:895] "Failed to get status for pod" podUID="d20f6dd9-77af-4551-b287-58486f3b54b8" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" err="pods \"limitador-operator-controller-manager-85c4996f8c-zdjmv\" is forbidden: User \"system:node:ip-10-0-137-21.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-21.ec2.internal' and this object"
Apr 23 01:18:14.990172 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.990148 2576 scope.go:117] "RemoveContainer" containerID="f719290b9e2952d1c7b459b6a6d9d7d3f87dc42734b30c4f90092f29d6d38f3d"
Apr 23 01:18:14.990436 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:18:14.990417 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f719290b9e2952d1c7b459b6a6d9d7d3f87dc42734b30c4f90092f29d6d38f3d\": container with ID starting with f719290b9e2952d1c7b459b6a6d9d7d3f87dc42734b30c4f90092f29d6d38f3d not found: ID does not exist" containerID="f719290b9e2952d1c7b459b6a6d9d7d3f87dc42734b30c4f90092f29d6d38f3d"
Apr 23 01:18:14.990485 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:14.990446 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f719290b9e2952d1c7b459b6a6d9d7d3f87dc42734b30c4f90092f29d6d38f3d"} err="failed to get container status \"f719290b9e2952d1c7b459b6a6d9d7d3f87dc42734b30c4f90092f29d6d38f3d\": rpc error: code = NotFound desc = could not find container \"f719290b9e2952d1c7b459b6a6d9d7d3f87dc42734b30c4f90092f29d6d38f3d\": container with ID starting with f719290b9e2952d1c7b459b6a6d9d7d3f87dc42734b30c4f90092f29d6d38f3d not found: ID does not exist"
Apr 23 01:18:15.004439 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:15.004392 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-skwpw" podStartSLOduration=1.004377123 podStartE2EDuration="1.004377123s" podCreationTimestamp="2026-04-23 01:18:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:18:15.002900288 +0000 UTC m=+494.177695923" watchObservedRunningTime="2026-04-23 01:18:15.004377123 +0000 UTC m=+494.179172754"
Apr 23 01:18:15.005075 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:15.005045 2576 status_manager.go:895] "Failed to get status for pod" podUID="d20f6dd9-77af-4551-b287-58486f3b54b8" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" err="pods \"limitador-operator-controller-manager-85c4996f8c-zdjmv\" is forbidden: User \"system:node:ip-10-0-137-21.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-21.ec2.internal' and this object"
Apr 23 01:18:15.007055 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:15.007028 2576 status_manager.go:895] "Failed to get status for pod" podUID="d20f6dd9-77af-4551-b287-58486f3b54b8" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zdjmv" err="pods \"limitador-operator-controller-manager-85c4996f8c-zdjmv\" is forbidden: User \"system:node:ip-10-0-137-21.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-21.ec2.internal' and this object"
Apr 23 01:18:15.385213 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:15.385135 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d20f6dd9-77af-4551-b287-58486f3b54b8" path="/var/lib/kubelet/pods/d20f6dd9-77af-4551-b287-58486f3b54b8/volumes"
Apr 23 01:18:25.992491 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:25.992461 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-skwpw"
Apr 23 01:18:56.220080 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:56.220041 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8r5h9"]
Apr 23 01:18:56.223659 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:56.223640 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9"
Apr 23 01:18:56.225859 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:56.225831 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 23 01:18:56.225999 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:56.225919 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-4dlc4\""
Apr 23 01:18:56.231283 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:56.231206 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8r5h9"]
Apr 23 01:18:56.319108 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:56.319067 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8r5h9"]
Apr 23 01:18:56.341152 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:56.341113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cccfb1b3-68e0-4330-a83d-48dc25de471c-config-file\") pod \"limitador-limitador-7d549b5b-8r5h9\" (UID: \"cccfb1b3-68e0-4330-a83d-48dc25de471c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9"
Apr 23 01:18:56.341304 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:56.341198 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqgk9\" (UniqueName: \"kubernetes.io/projected/cccfb1b3-68e0-4330-a83d-48dc25de471c-kube-api-access-wqgk9\") pod \"limitador-limitador-7d549b5b-8r5h9\" (UID: \"cccfb1b3-68e0-4330-a83d-48dc25de471c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9"
Apr 23 01:18:56.442193 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:56.442155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cccfb1b3-68e0-4330-a83d-48dc25de471c-config-file\") pod \"limitador-limitador-7d549b5b-8r5h9\" (UID: \"cccfb1b3-68e0-4330-a83d-48dc25de471c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9"
Apr 23 01:18:56.442361 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:56.442235 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqgk9\" (UniqueName: \"kubernetes.io/projected/cccfb1b3-68e0-4330-a83d-48dc25de471c-kube-api-access-wqgk9\") pod \"limitador-limitador-7d549b5b-8r5h9\" (UID: \"cccfb1b3-68e0-4330-a83d-48dc25de471c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9"
Apr 23 01:18:56.442833 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:56.442812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cccfb1b3-68e0-4330-a83d-48dc25de471c-config-file\") pod \"limitador-limitador-7d549b5b-8r5h9\" (UID: \"cccfb1b3-68e0-4330-a83d-48dc25de471c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9"
Apr 23 01:18:56.449771 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:56.449746 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqgk9\" (UniqueName: \"kubernetes.io/projected/cccfb1b3-68e0-4330-a83d-48dc25de471c-kube-api-access-wqgk9\") pod \"limitador-limitador-7d549b5b-8r5h9\" (UID: \"cccfb1b3-68e0-4330-a83d-48dc25de471c\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9"
Apr 23 01:18:56.535747 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:56.535643 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9"
Apr 23 01:18:56.656573 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:56.656546 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8r5h9"]
Apr 23 01:18:56.659942 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:18:56.659908 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcccfb1b3_68e0_4330_a83d_48dc25de471c.slice/crio-d4c93781fafa91d689dac120669937b2c0d14668ac5a5aed87f79c8897b64c5b WatchSource:0}: Error finding container d4c93781fafa91d689dac120669937b2c0d14668ac5a5aed87f79c8897b64c5b: Status 404 returned error can't find the container with id d4c93781fafa91d689dac120669937b2c0d14668ac5a5aed87f79c8897b64c5b
Apr 23 01:18:57.140577 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:18:57.140538 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9" event={"ID":"cccfb1b3-68e0-4330-a83d-48dc25de471c","Type":"ContainerStarted","Data":"d4c93781fafa91d689dac120669937b2c0d14668ac5a5aed87f79c8897b64c5b"}
Apr 23 01:19:00.154264 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:00.154229 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9" event={"ID":"cccfb1b3-68e0-4330-a83d-48dc25de471c","Type":"ContainerStarted","Data":"d156de4f657c77d84dd3a0f33e39c9e2811328305bc962cf7c39393d28f4ad2a"}
Apr 23 01:19:00.154649 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:00.154347 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9"
Apr 23 01:19:00.169836 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:00.169780 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9" podStartSLOduration=1.461461995 podStartE2EDuration="4.169762542s" podCreationTimestamp="2026-04-23 01:18:56 +0000 UTC" firstStartedPulling="2026-04-23 01:18:56.66218762 +0000 UTC m=+535.836983237" lastFinishedPulling="2026-04-23 01:18:59.370488161 +0000 UTC m=+538.545283784" observedRunningTime="2026-04-23 01:19:00.168183593 +0000 UTC m=+539.342979226" watchObservedRunningTime="2026-04-23 01:19:00.169762542 +0000 UTC m=+539.344558173"
Apr 23 01:19:11.158487 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:11.158455 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9"
Apr 23 01:19:11.668743 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:11.668705 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8r5h9"]
Apr 23 01:19:11.668964 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:11.668937 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9" podUID="cccfb1b3-68e0-4330-a83d-48dc25de471c" containerName="limitador" containerID="cri-o://d156de4f657c77d84dd3a0f33e39c9e2811328305bc962cf7c39393d28f4ad2a" gracePeriod=30
Apr 23 01:19:12.195308 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.195276 2576 generic.go:358] "Generic (PLEG): container finished" podID="cccfb1b3-68e0-4330-a83d-48dc25de471c" containerID="d156de4f657c77d84dd3a0f33e39c9e2811328305bc962cf7c39393d28f4ad2a" exitCode=0
Apr 23 01:19:12.195639 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.195351 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9" event={"ID":"cccfb1b3-68e0-4330-a83d-48dc25de471c","Type":"ContainerDied","Data":"d156de4f657c77d84dd3a0f33e39c9e2811328305bc962cf7c39393d28f4ad2a"}
Apr 23 01:19:12.195639 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.195393 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9" event={"ID":"cccfb1b3-68e0-4330-a83d-48dc25de471c","Type":"ContainerDied","Data":"d4c93781fafa91d689dac120669937b2c0d14668ac5a5aed87f79c8897b64c5b"}
Apr 23 01:19:12.195639 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.195404 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4c93781fafa91d689dac120669937b2c0d14668ac5a5aed87f79c8897b64c5b"
Apr 23 01:19:12.205664 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.205644 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9"
Apr 23 01:19:12.284905 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.284813 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqgk9\" (UniqueName: \"kubernetes.io/projected/cccfb1b3-68e0-4330-a83d-48dc25de471c-kube-api-access-wqgk9\") pod \"cccfb1b3-68e0-4330-a83d-48dc25de471c\" (UID: \"cccfb1b3-68e0-4330-a83d-48dc25de471c\") "
Apr 23 01:19:12.284905 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.284877 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cccfb1b3-68e0-4330-a83d-48dc25de471c-config-file\") pod \"cccfb1b3-68e0-4330-a83d-48dc25de471c\" (UID: \"cccfb1b3-68e0-4330-a83d-48dc25de471c\") "
Apr 23 01:19:12.285311 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.285284 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cccfb1b3-68e0-4330-a83d-48dc25de471c-config-file" (OuterVolumeSpecName: "config-file") pod "cccfb1b3-68e0-4330-a83d-48dc25de471c" (UID: "cccfb1b3-68e0-4330-a83d-48dc25de471c"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 01:19:12.286877 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.286856 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cccfb1b3-68e0-4330-a83d-48dc25de471c-kube-api-access-wqgk9" (OuterVolumeSpecName: "kube-api-access-wqgk9") pod "cccfb1b3-68e0-4330-a83d-48dc25de471c" (UID: "cccfb1b3-68e0-4330-a83d-48dc25de471c"). InnerVolumeSpecName "kube-api-access-wqgk9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:19:12.385860 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.385824 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wqgk9\" (UniqueName: \"kubernetes.io/projected/cccfb1b3-68e0-4330-a83d-48dc25de471c-kube-api-access-wqgk9\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:19:12.385860 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.385857 2576 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cccfb1b3-68e0-4330-a83d-48dc25de471c-config-file\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:19:12.873903 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.873866 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-gh2kz"]
Apr 23 01:19:12.874314 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.874301 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cccfb1b3-68e0-4330-a83d-48dc25de471c" containerName="limitador"
Apr 23 01:19:12.874363 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.874316 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cccfb1b3-68e0-4330-a83d-48dc25de471c" containerName="limitador"
Apr 23 01:19:12.874404 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.874375 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cccfb1b3-68e0-4330-a83d-48dc25de471c" containerName="limitador"
Apr 23 01:19:12.877939 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.877908 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-gh2kz"
Apr 23 01:19:12.880178 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.880153 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 23 01:19:12.880324 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.880241 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-t5hdc\""
Apr 23 01:19:12.887001 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.886591 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-gh2kz"]
Apr 23 01:19:12.991648 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.991608 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6881d7a2-e97b-4c63-af58-69de860a39b6-data\") pod \"postgres-868db5846d-gh2kz\" (UID: \"6881d7a2-e97b-4c63-af58-69de860a39b6\") " pod="opendatahub/postgres-868db5846d-gh2kz"
Apr 23 01:19:12.991834 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:12.991762 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pljrt\" (UniqueName: \"kubernetes.io/projected/6881d7a2-e97b-4c63-af58-69de860a39b6-kube-api-access-pljrt\") pod \"postgres-868db5846d-gh2kz\" (UID: \"6881d7a2-e97b-4c63-af58-69de860a39b6\") " pod="opendatahub/postgres-868db5846d-gh2kz"
Apr 23 01:19:13.092916 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:13.092881 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pljrt\" (UniqueName: \"kubernetes.io/projected/6881d7a2-e97b-4c63-af58-69de860a39b6-kube-api-access-pljrt\") pod \"postgres-868db5846d-gh2kz\" (UID: \"6881d7a2-e97b-4c63-af58-69de860a39b6\") " pod="opendatahub/postgres-868db5846d-gh2kz"
Apr 23 01:19:13.093123 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:13.092943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6881d7a2-e97b-4c63-af58-69de860a39b6-data\") pod \"postgres-868db5846d-gh2kz\" (UID: \"6881d7a2-e97b-4c63-af58-69de860a39b6\") " pod="opendatahub/postgres-868db5846d-gh2kz"
Apr 23 01:19:13.093345 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:13.093322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6881d7a2-e97b-4c63-af58-69de860a39b6-data\") pod \"postgres-868db5846d-gh2kz\" (UID: \"6881d7a2-e97b-4c63-af58-69de860a39b6\") " pod="opendatahub/postgres-868db5846d-gh2kz"
Apr 23 01:19:13.100156 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:13.100124 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pljrt\" (UniqueName: \"kubernetes.io/projected/6881d7a2-e97b-4c63-af58-69de860a39b6-kube-api-access-pljrt\") pod \"postgres-868db5846d-gh2kz\" (UID: \"6881d7a2-e97b-4c63-af58-69de860a39b6\") " pod="opendatahub/postgres-868db5846d-gh2kz"
Apr 23 01:19:13.192382 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:13.192293 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-gh2kz"
Apr 23 01:19:13.198681 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:13.198652 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8r5h9"
Apr 23 01:19:13.226152 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:13.226117 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8r5h9"]
Apr 23 01:19:13.229167 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:13.229141 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8r5h9"]
Apr 23 01:19:13.311893 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:13.311867 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-gh2kz"]
Apr 23 01:19:13.314697 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:19:13.314672 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6881d7a2_e97b_4c63_af58_69de860a39b6.slice/crio-79df2fbe1ca7f3df260ebdff95051b1890c83cb31f045ccacd953b216861adc5 WatchSource:0}: Error finding container 79df2fbe1ca7f3df260ebdff95051b1890c83cb31f045ccacd953b216861adc5: Status 404 returned error can't find the container with id 79df2fbe1ca7f3df260ebdff95051b1890c83cb31f045ccacd953b216861adc5
Apr 23 01:19:13.385520 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:13.385478 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cccfb1b3-68e0-4330-a83d-48dc25de471c" path="/var/lib/kubelet/pods/cccfb1b3-68e0-4330-a83d-48dc25de471c/volumes"
Apr 23 01:19:14.202809 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:14.202776 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-gh2kz" event={"ID":"6881d7a2-e97b-4c63-af58-69de860a39b6","Type":"ContainerStarted","Data":"79df2fbe1ca7f3df260ebdff95051b1890c83cb31f045ccacd953b216861adc5"}
Apr 23 01:19:19.225601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:19.225559 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-gh2kz" event={"ID":"6881d7a2-e97b-4c63-af58-69de860a39b6","Type":"ContainerStarted","Data":"bbda00f6e1d7ab71d9050299a99ed36f50bc8917bdefa8182fa9d9d77266855a"}
Apr 23 01:19:19.226144 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:19.225652 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-gh2kz"
Apr 23 01:19:19.240645 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:19.240590 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-gh2kz" podStartSLOduration=1.7469159300000001 podStartE2EDuration="7.240572618s" podCreationTimestamp="2026-04-23 01:19:12 +0000 UTC" firstStartedPulling="2026-04-23 01:19:13.31642522 +0000 UTC m=+552.491220830" lastFinishedPulling="2026-04-23 01:19:18.810081908 +0000 UTC m=+557.984877518" observedRunningTime="2026-04-23 01:19:19.239932575 +0000 UTC m=+558.414728207" watchObservedRunningTime="2026-04-23 01:19:19.240572618 +0000 UTC m=+558.415368254"
Apr 23 01:19:25.259329 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:25.259291 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-gh2kz"
Apr 23 01:19:28.300422 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.300386 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-vr74l"]
Apr 23 01:19:28.373290 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.373250 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-vr74l"]
Apr 23 01:19:28.373450 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.373269 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-vr74l"
Apr 23 01:19:28.375671 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.375641 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-kcbm4\""
Apr 23 01:19:28.431357 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.431322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nltzz\" (UniqueName: \"kubernetes.io/projected/8a6ad8b1-a11d-4f44-840e-4e2379818f0c-kube-api-access-nltzz\") pod \"maas-controller-6d4c8f55f9-vr74l\" (UID: \"8a6ad8b1-a11d-4f44-840e-4e2379818f0c\") " pod="opendatahub/maas-controller-6d4c8f55f9-vr74l"
Apr 23 01:19:28.462010 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.461946 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d9c69bf64-kbztj"]
Apr 23 01:19:28.482085 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.482047 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d9c69bf64-kbztj"]
Apr 23 01:19:28.482236 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.482168 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d9c69bf64-kbztj"
Apr 23 01:19:28.532291 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.532255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nltzz\" (UniqueName: \"kubernetes.io/projected/8a6ad8b1-a11d-4f44-840e-4e2379818f0c-kube-api-access-nltzz\") pod \"maas-controller-6d4c8f55f9-vr74l\" (UID: \"8a6ad8b1-a11d-4f44-840e-4e2379818f0c\") " pod="opendatahub/maas-controller-6d4c8f55f9-vr74l"
Apr 23 01:19:28.532291 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.532296 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plfms\" (UniqueName: \"kubernetes.io/projected/18d32dc7-57e5-452e-8fae-1479b68fd470-kube-api-access-plfms\") pod \"maas-controller-6d9c69bf64-kbztj\" (UID: \"18d32dc7-57e5-452e-8fae-1479b68fd470\") " pod="opendatahub/maas-controller-6d9c69bf64-kbztj"
Apr 23 01:19:28.541879 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.541841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nltzz\" (UniqueName: \"kubernetes.io/projected/8a6ad8b1-a11d-4f44-840e-4e2379818f0c-kube-api-access-nltzz\") pod \"maas-controller-6d4c8f55f9-vr74l\" (UID: \"8a6ad8b1-a11d-4f44-840e-4e2379818f0c\") " pod="opendatahub/maas-controller-6d4c8f55f9-vr74l"
Apr 23 01:19:28.575253 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.575167 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-vr74l"]
Apr 23 01:19:28.575461 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.575448 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-vr74l"
Apr 23 01:19:28.600335 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.600279 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-8d66d4b94-dd444"]
Apr 23 01:19:28.619946 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.619896 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-8d66d4b94-dd444"]
Apr 23 01:19:28.620120 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.620087 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-8d66d4b94-dd444"
Apr 23 01:19:28.633744 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.633711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kvn5\" (UniqueName: \"kubernetes.io/projected/9e56a94e-1c91-42e0-8788-73b97908d9f0-kube-api-access-9kvn5\") pod \"maas-controller-8d66d4b94-dd444\" (UID: \"9e56a94e-1c91-42e0-8788-73b97908d9f0\") " pod="opendatahub/maas-controller-8d66d4b94-dd444"
Apr 23 01:19:28.633867 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.633783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plfms\" (UniqueName: \"kubernetes.io/projected/18d32dc7-57e5-452e-8fae-1479b68fd470-kube-api-access-plfms\") pod \"maas-controller-6d9c69bf64-kbztj\" (UID: \"18d32dc7-57e5-452e-8fae-1479b68fd470\") " pod="opendatahub/maas-controller-6d9c69bf64-kbztj"
Apr 23 01:19:28.641527 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.641490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plfms\" (UniqueName: \"kubernetes.io/projected/18d32dc7-57e5-452e-8fae-1479b68fd470-kube-api-access-plfms\") pod \"maas-controller-6d9c69bf64-kbztj\" (UID: \"18d32dc7-57e5-452e-8fae-1479b68fd470\") " pod="opendatahub/maas-controller-6d9c69bf64-kbztj"
Apr 23 01:19:28.704138 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.704100 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-vr74l"]
Apr 23 01:19:28.735276 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.735241 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kvn5\" (UniqueName: \"kubernetes.io/projected/9e56a94e-1c91-42e0-8788-73b97908d9f0-kube-api-access-9kvn5\") pod \"maas-controller-8d66d4b94-dd444\" (UID: \"9e56a94e-1c91-42e0-8788-73b97908d9f0\") " pod="opendatahub/maas-controller-8d66d4b94-dd444"
Apr 23 01:19:28.743096 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.743065 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kvn5\" (UniqueName: \"kubernetes.io/projected/9e56a94e-1c91-42e0-8788-73b97908d9f0-kube-api-access-9kvn5\") pod \"maas-controller-8d66d4b94-dd444\" (UID: \"9e56a94e-1c91-42e0-8788-73b97908d9f0\") " pod="opendatahub/maas-controller-8d66d4b94-dd444"
Apr 23 01:19:28.793405 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.793366 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d9c69bf64-kbztj"
Apr 23 01:19:28.916943 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.916911 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d9c69bf64-kbztj"]
Apr 23 01:19:28.918679 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:19:28.918648 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18d32dc7_57e5_452e_8fae_1479b68fd470.slice/crio-341f9f079fa20c71bd3ae0035487e723fc87a36de8f65db26c782135133a21d2 WatchSource:0}: Error finding container 341f9f079fa20c71bd3ae0035487e723fc87a36de8f65db26c782135133a21d2: Status 404 returned error can't find the container with id 341f9f079fa20c71bd3ae0035487e723fc87a36de8f65db26c782135133a21d2
Apr 23 01:19:28.935390 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:28.935354 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-8d66d4b94-dd444"
Apr 23 01:19:29.063187 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:29.063151 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-8d66d4b94-dd444"]
Apr 23 01:19:29.066011 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:19:29.065953 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e56a94e_1c91_42e0_8788_73b97908d9f0.slice/crio-2c856d6148eb60431a431264e31f28dd2c001b50a4542893491c5f8ee3ae349e WatchSource:0}: Error finding container 2c856d6148eb60431a431264e31f28dd2c001b50a4542893491c5f8ee3ae349e: Status 404 returned error can't find the container with id 2c856d6148eb60431a431264e31f28dd2c001b50a4542893491c5f8ee3ae349e
Apr 23 01:19:29.266726 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:29.266678 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8d66d4b94-dd444" event={"ID":"9e56a94e-1c91-42e0-8788-73b97908d9f0","Type":"ContainerStarted","Data":"2c856d6148eb60431a431264e31f28dd2c001b50a4542893491c5f8ee3ae349e"}
Apr 23 01:19:29.268029 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:29.267961 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-vr74l" event={"ID":"8a6ad8b1-a11d-4f44-840e-4e2379818f0c","Type":"ContainerStarted","Data":"55f753f4912b9aa70dce8045149186f23b4e8aa655c92448f0578e36db51d418"}
Apr 23 01:19:29.269158 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:29.269131 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d9c69bf64-kbztj" event={"ID":"18d32dc7-57e5-452e-8fae-1479b68fd470","Type":"ContainerStarted","Data":"341f9f079fa20c71bd3ae0035487e723fc87a36de8f65db26c782135133a21d2"}
Apr 23 01:19:33.288190 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:33.288149 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d9c69bf64-kbztj" event={"ID":"18d32dc7-57e5-452e-8fae-1479b68fd470","Type":"ContainerStarted","Data":"08e495e6ecafa6d5ed1512f9f81d60db5dd297415c445c775b9c79f2ad772ac1"}
Apr 23 01:19:33.288687 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:33.288269 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d9c69bf64-kbztj"
Apr 23 01:19:33.289524 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:33.289503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8d66d4b94-dd444" event={"ID":"9e56a94e-1c91-42e0-8788-73b97908d9f0","Type":"ContainerStarted","Data":"e9dd94008c8f1cfc250243816728a2f290fb10594426ddee12e509f5bc9ceb0a"}
Apr 23 01:19:33.289652 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:33.289630 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-8d66d4b94-dd444"
Apr 23 01:19:33.290853 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:33.290835 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-vr74l" event={"ID":"8a6ad8b1-a11d-4f44-840e-4e2379818f0c","Type":"ContainerStarted","Data":"f60772e493e85fdcb012ee400ed475851a6b3c11ca700e82ec7178e57b974de6"}
Apr 23 01:19:33.290935 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:33.290876 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-vr74l"
Apr 23 01:19:33.290935 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:33.290872 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-vr74l" podUID="8a6ad8b1-a11d-4f44-840e-4e2379818f0c" containerName="manager" containerID="cri-o://f60772e493e85fdcb012ee400ed475851a6b3c11ca700e82ec7178e57b974de6" gracePeriod=10
Apr 23 01:19:33.304269 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:33.304230 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d9c69bf64-kbztj" podStartSLOduration=1.959421946 podStartE2EDuration="5.304215677s" podCreationTimestamp="2026-04-23 01:19:28 +0000 UTC" firstStartedPulling="2026-04-23 01:19:28.920337514 +0000 UTC m=+568.095133124" lastFinishedPulling="2026-04-23 01:19:32.265131243 +0000 UTC m=+571.439926855" observedRunningTime="2026-04-23 01:19:33.301816715 +0000 UTC m=+572.476612349" watchObservedRunningTime="2026-04-23 01:19:33.304215677 +0000 UTC m=+572.479011309"
Apr 23 01:19:33.316251 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:33.316204 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-8d66d4b94-dd444" podStartSLOduration=2.107923021 podStartE2EDuration="5.316192369s" podCreationTimestamp="2026-04-23 01:19:28 +0000 UTC" firstStartedPulling="2026-04-23 01:19:29.067363542 +0000 UTC m=+568.242159164" lastFinishedPulling="2026-04-23 01:19:32.275632891 +0000 UTC m=+571.450428512" observedRunningTime="2026-04-23 01:19:33.314190069 +0000 UTC m=+572.488985715" watchObservedRunningTime="2026-04-23 01:19:33.316192369 +0000 UTC m=+572.490988001"
Apr 23 01:19:33.329170 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:33.329116 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-vr74l" podStartSLOduration=1.7741919099999999 podStartE2EDuration="5.329100323s" podCreationTimestamp="2026-04-23 01:19:28 +0000 UTC" firstStartedPulling="2026-04-23 01:19:28.709863289 +0000 UTC m=+567.884658905" lastFinishedPulling="2026-04-23 01:19:32.2647717 +0000 UTC m=+571.439567318" observedRunningTime="2026-04-23 01:19:33.327487262 +0000 UTC m=+572.502282895" watchObservedRunningTime="2026-04-23 01:19:33.329100323 +0000 UTC m=+572.503895954"
Apr 23 01:19:33.630651 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:33.630627 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-vr74l"
Apr 23 01:19:33.682011 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:33.681953 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nltzz\" (UniqueName: \"kubernetes.io/projected/8a6ad8b1-a11d-4f44-840e-4e2379818f0c-kube-api-access-nltzz\") pod \"8a6ad8b1-a11d-4f44-840e-4e2379818f0c\" (UID: \"8a6ad8b1-a11d-4f44-840e-4e2379818f0c\") "
Apr 23 01:19:33.684197 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:33.684162 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a6ad8b1-a11d-4f44-840e-4e2379818f0c-kube-api-access-nltzz" (OuterVolumeSpecName: "kube-api-access-nltzz") pod "8a6ad8b1-a11d-4f44-840e-4e2379818f0c" (UID: "8a6ad8b1-a11d-4f44-840e-4e2379818f0c"). InnerVolumeSpecName "kube-api-access-nltzz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:19:33.782935 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:33.782892 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nltzz\" (UniqueName: \"kubernetes.io/projected/8a6ad8b1-a11d-4f44-840e-4e2379818f0c-kube-api-access-nltzz\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:19:34.296170 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:34.296136 2576 generic.go:358] "Generic (PLEG): container finished" podID="8a6ad8b1-a11d-4f44-840e-4e2379818f0c" containerID="f60772e493e85fdcb012ee400ed475851a6b3c11ca700e82ec7178e57b974de6" exitCode=0
Apr 23 01:19:34.296627 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:34.296201 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-vr74l"
Apr 23 01:19:34.296627 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:34.296231 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-vr74l" event={"ID":"8a6ad8b1-a11d-4f44-840e-4e2379818f0c","Type":"ContainerDied","Data":"f60772e493e85fdcb012ee400ed475851a6b3c11ca700e82ec7178e57b974de6"}
Apr 23 01:19:34.296627 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:34.296271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-vr74l" event={"ID":"8a6ad8b1-a11d-4f44-840e-4e2379818f0c","Type":"ContainerDied","Data":"55f753f4912b9aa70dce8045149186f23b4e8aa655c92448f0578e36db51d418"}
Apr 23 01:19:34.296627 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:34.296292 2576 scope.go:117] "RemoveContainer" containerID="f60772e493e85fdcb012ee400ed475851a6b3c11ca700e82ec7178e57b974de6"
Apr 23 01:19:34.305245 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:34.305225 2576 scope.go:117] "RemoveContainer" containerID="f60772e493e85fdcb012ee400ed475851a6b3c11ca700e82ec7178e57b974de6"
Apr 23 01:19:34.305515 ip-10-0-137-21 kubenswrapper[2576]: E0423
01:19:34.305497 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f60772e493e85fdcb012ee400ed475851a6b3c11ca700e82ec7178e57b974de6\": container with ID starting with f60772e493e85fdcb012ee400ed475851a6b3c11ca700e82ec7178e57b974de6 not found: ID does not exist" containerID="f60772e493e85fdcb012ee400ed475851a6b3c11ca700e82ec7178e57b974de6" Apr 23 01:19:34.305566 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:34.305525 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60772e493e85fdcb012ee400ed475851a6b3c11ca700e82ec7178e57b974de6"} err="failed to get container status \"f60772e493e85fdcb012ee400ed475851a6b3c11ca700e82ec7178e57b974de6\": rpc error: code = NotFound desc = could not find container \"f60772e493e85fdcb012ee400ed475851a6b3c11ca700e82ec7178e57b974de6\": container with ID starting with f60772e493e85fdcb012ee400ed475851a6b3c11ca700e82ec7178e57b974de6 not found: ID does not exist" Apr 23 01:19:34.316637 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:34.316565 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-vr74l"] Apr 23 01:19:34.318335 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:34.318311 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-vr74l"] Apr 23 01:19:35.385653 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:35.385615 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a6ad8b1-a11d-4f44-840e-4e2379818f0c" path="/var/lib/kubelet/pods/8a6ad8b1-a11d-4f44-840e-4e2379818f0c/volumes" Apr 23 01:19:44.301701 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.301666 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-8d66d4b94-dd444" Apr 23 01:19:44.302327 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.302297 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="opendatahub/maas-controller-6d9c69bf64-kbztj" Apr 23 01:19:44.354369 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.354332 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d9c69bf64-kbztj"] Apr 23 01:19:44.354611 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.354586 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d9c69bf64-kbztj" podUID="18d32dc7-57e5-452e-8fae-1479b68fd470" containerName="manager" containerID="cri-o://08e495e6ecafa6d5ed1512f9f81d60db5dd297415c445c775b9c79f2ad772ac1" gracePeriod=10 Apr 23 01:19:44.620756 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.620729 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d9c69bf64-kbztj" Apr 23 01:19:44.634334 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.634301 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7ddc769ff4-xnjvf"] Apr 23 01:19:44.634698 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.634677 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a6ad8b1-a11d-4f44-840e-4e2379818f0c" containerName="manager" Apr 23 01:19:44.634698 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.634696 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6ad8b1-a11d-4f44-840e-4e2379818f0c" containerName="manager" Apr 23 01:19:44.634897 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.634726 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18d32dc7-57e5-452e-8fae-1479b68fd470" containerName="manager" Apr 23 01:19:44.634897 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.634733 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d32dc7-57e5-452e-8fae-1479b68fd470" containerName="manager" Apr 23 01:19:44.634897 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.634816 2576 
memory_manager.go:356] "RemoveStaleState removing state" podUID="8a6ad8b1-a11d-4f44-840e-4e2379818f0c" containerName="manager" Apr 23 01:19:44.634897 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.634831 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="18d32dc7-57e5-452e-8fae-1479b68fd470" containerName="manager" Apr 23 01:19:44.638165 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.638136 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7ddc769ff4-xnjvf" Apr 23 01:19:44.646302 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.646275 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7ddc769ff4-xnjvf"] Apr 23 01:19:44.687612 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.687574 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plfms\" (UniqueName: \"kubernetes.io/projected/18d32dc7-57e5-452e-8fae-1479b68fd470-kube-api-access-plfms\") pod \"18d32dc7-57e5-452e-8fae-1479b68fd470\" (UID: \"18d32dc7-57e5-452e-8fae-1479b68fd470\") " Apr 23 01:19:44.689662 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.689625 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d32dc7-57e5-452e-8fae-1479b68fd470-kube-api-access-plfms" (OuterVolumeSpecName: "kube-api-access-plfms") pod "18d32dc7-57e5-452e-8fae-1479b68fd470" (UID: "18d32dc7-57e5-452e-8fae-1479b68fd470"). InnerVolumeSpecName "kube-api-access-plfms". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:19:44.788348 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.788305 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfsrt\" (UniqueName: \"kubernetes.io/projected/34c57ec4-6b59-4ba3-be1c-52b32d6db63d-kube-api-access-dfsrt\") pod \"maas-controller-7ddc769ff4-xnjvf\" (UID: \"34c57ec4-6b59-4ba3-be1c-52b32d6db63d\") " pod="opendatahub/maas-controller-7ddc769ff4-xnjvf" Apr 23 01:19:44.788533 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.788424 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-plfms\" (UniqueName: \"kubernetes.io/projected/18d32dc7-57e5-452e-8fae-1479b68fd470-kube-api-access-plfms\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\"" Apr 23 01:19:44.889219 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.889126 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfsrt\" (UniqueName: \"kubernetes.io/projected/34c57ec4-6b59-4ba3-be1c-52b32d6db63d-kube-api-access-dfsrt\") pod \"maas-controller-7ddc769ff4-xnjvf\" (UID: \"34c57ec4-6b59-4ba3-be1c-52b32d6db63d\") " pod="opendatahub/maas-controller-7ddc769ff4-xnjvf" Apr 23 01:19:44.897349 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.897315 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfsrt\" (UniqueName: \"kubernetes.io/projected/34c57ec4-6b59-4ba3-be1c-52b32d6db63d-kube-api-access-dfsrt\") pod \"maas-controller-7ddc769ff4-xnjvf\" (UID: \"34c57ec4-6b59-4ba3-be1c-52b32d6db63d\") " pod="opendatahub/maas-controller-7ddc769ff4-xnjvf" Apr 23 01:19:44.951232 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:44.951185 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7ddc769ff4-xnjvf" Apr 23 01:19:45.078264 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:45.078238 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7ddc769ff4-xnjvf"] Apr 23 01:19:45.080469 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:19:45.080444 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34c57ec4_6b59_4ba3_be1c_52b32d6db63d.slice/crio-4595a3e1b2c7a8fd72a3729a6bcb4d6b3d799f9850366b251b89137d7315dfa6 WatchSource:0}: Error finding container 4595a3e1b2c7a8fd72a3729a6bcb4d6b3d799f9850366b251b89137d7315dfa6: Status 404 returned error can't find the container with id 4595a3e1b2c7a8fd72a3729a6bcb4d6b3d799f9850366b251b89137d7315dfa6 Apr 23 01:19:45.338717 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:45.338680 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7ddc769ff4-xnjvf" event={"ID":"34c57ec4-6b59-4ba3-be1c-52b32d6db63d","Type":"ContainerStarted","Data":"4595a3e1b2c7a8fd72a3729a6bcb4d6b3d799f9850366b251b89137d7315dfa6"} Apr 23 01:19:45.339893 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:45.339864 2576 generic.go:358] "Generic (PLEG): container finished" podID="18d32dc7-57e5-452e-8fae-1479b68fd470" containerID="08e495e6ecafa6d5ed1512f9f81d60db5dd297415c445c775b9c79f2ad772ac1" exitCode=0 Apr 23 01:19:45.340037 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:45.339920 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d9c69bf64-kbztj" event={"ID":"18d32dc7-57e5-452e-8fae-1479b68fd470","Type":"ContainerDied","Data":"08e495e6ecafa6d5ed1512f9f81d60db5dd297415c445c775b9c79f2ad772ac1"} Apr 23 01:19:45.340037 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:45.339941 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d9c69bf64-kbztj" 
event={"ID":"18d32dc7-57e5-452e-8fae-1479b68fd470","Type":"ContainerDied","Data":"341f9f079fa20c71bd3ae0035487e723fc87a36de8f65db26c782135133a21d2"} Apr 23 01:19:45.340037 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:45.339938 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d9c69bf64-kbztj" Apr 23 01:19:45.340037 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:45.339962 2576 scope.go:117] "RemoveContainer" containerID="08e495e6ecafa6d5ed1512f9f81d60db5dd297415c445c775b9c79f2ad772ac1" Apr 23 01:19:45.348494 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:45.348477 2576 scope.go:117] "RemoveContainer" containerID="08e495e6ecafa6d5ed1512f9f81d60db5dd297415c445c775b9c79f2ad772ac1" Apr 23 01:19:45.348730 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:19:45.348706 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08e495e6ecafa6d5ed1512f9f81d60db5dd297415c445c775b9c79f2ad772ac1\": container with ID starting with 08e495e6ecafa6d5ed1512f9f81d60db5dd297415c445c775b9c79f2ad772ac1 not found: ID does not exist" containerID="08e495e6ecafa6d5ed1512f9f81d60db5dd297415c445c775b9c79f2ad772ac1" Apr 23 01:19:45.348784 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:45.348739 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08e495e6ecafa6d5ed1512f9f81d60db5dd297415c445c775b9c79f2ad772ac1"} err="failed to get container status \"08e495e6ecafa6d5ed1512f9f81d60db5dd297415c445c775b9c79f2ad772ac1\": rpc error: code = NotFound desc = could not find container \"08e495e6ecafa6d5ed1512f9f81d60db5dd297415c445c775b9c79f2ad772ac1\": container with ID starting with 08e495e6ecafa6d5ed1512f9f81d60db5dd297415c445c775b9c79f2ad772ac1 not found: ID does not exist" Apr 23 01:19:45.360571 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:45.360544 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["opendatahub/maas-controller-6d9c69bf64-kbztj"] Apr 23 01:19:45.363362 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:45.363338 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d9c69bf64-kbztj"] Apr 23 01:19:45.386078 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:45.386049 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18d32dc7-57e5-452e-8fae-1479b68fd470" path="/var/lib/kubelet/pods/18d32dc7-57e5-452e-8fae-1479b68fd470/volumes" Apr 23 01:19:46.344443 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:46.344405 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7ddc769ff4-xnjvf" event={"ID":"34c57ec4-6b59-4ba3-be1c-52b32d6db63d","Type":"ContainerStarted","Data":"bc0ef47bcc0a10c80d562a2e205e627157c308d03a87d776a894bde3daa95668"} Apr 23 01:19:46.344912 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:46.344493 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7ddc769ff4-xnjvf" Apr 23 01:19:46.360003 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:46.359936 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7ddc769ff4-xnjvf" podStartSLOduration=1.987145501 podStartE2EDuration="2.359918433s" podCreationTimestamp="2026-04-23 01:19:44 +0000 UTC" firstStartedPulling="2026-04-23 01:19:45.081680588 +0000 UTC m=+584.256476199" lastFinishedPulling="2026-04-23 01:19:45.454453515 +0000 UTC m=+584.629249131" observedRunningTime="2026-04-23 01:19:46.357591798 +0000 UTC m=+585.532387430" watchObservedRunningTime="2026-04-23 01:19:46.359918433 +0000 UTC m=+585.534714064" Apr 23 01:19:50.088713 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:50.088672 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-84f96cb56b-6pqph"] Apr 23 01:19:50.092452 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:50.092429 2576 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="opendatahub/maas-api-84f96cb56b-6pqph" Apr 23 01:19:50.094715 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:50.094692 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 23 01:19:50.094715 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:50.094694 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-6vbgm\"" Apr 23 01:19:50.094891 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:50.094694 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 23 01:19:50.103251 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:50.103227 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-84f96cb56b-6pqph"] Apr 23 01:19:50.239250 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:50.239210 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7r65\" (UniqueName: \"kubernetes.io/projected/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394-kube-api-access-s7r65\") pod \"maas-api-84f96cb56b-6pqph\" (UID: \"8b7c2d5d-6aa0-441b-977f-5fb7cc48e394\") " pod="opendatahub/maas-api-84f96cb56b-6pqph" Apr 23 01:19:50.239414 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:50.239318 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394-maas-api-tls\") pod \"maas-api-84f96cb56b-6pqph\" (UID: \"8b7c2d5d-6aa0-441b-977f-5fb7cc48e394\") " pod="opendatahub/maas-api-84f96cb56b-6pqph" Apr 23 01:19:50.340599 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:50.340507 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394-maas-api-tls\") pod 
\"maas-api-84f96cb56b-6pqph\" (UID: \"8b7c2d5d-6aa0-441b-977f-5fb7cc48e394\") " pod="opendatahub/maas-api-84f96cb56b-6pqph" Apr 23 01:19:50.340599 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:50.340563 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7r65\" (UniqueName: \"kubernetes.io/projected/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394-kube-api-access-s7r65\") pod \"maas-api-84f96cb56b-6pqph\" (UID: \"8b7c2d5d-6aa0-441b-977f-5fb7cc48e394\") " pod="opendatahub/maas-api-84f96cb56b-6pqph" Apr 23 01:19:50.340838 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:19:50.340673 2576 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 23 01:19:50.340838 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:19:50.340752 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394-maas-api-tls podName:8b7c2d5d-6aa0-441b-977f-5fb7cc48e394 nodeName:}" failed. No retries permitted until 2026-04-23 01:19:50.840730044 +0000 UTC m=+590.015525660 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394-maas-api-tls") pod "maas-api-84f96cb56b-6pqph" (UID: "8b7c2d5d-6aa0-441b-977f-5fb7cc48e394") : secret "maas-api-serving-cert" not found Apr 23 01:19:50.348961 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:50.348929 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7r65\" (UniqueName: \"kubernetes.io/projected/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394-kube-api-access-s7r65\") pod \"maas-api-84f96cb56b-6pqph\" (UID: \"8b7c2d5d-6aa0-441b-977f-5fb7cc48e394\") " pod="opendatahub/maas-api-84f96cb56b-6pqph" Apr 23 01:19:50.845538 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:50.845484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394-maas-api-tls\") pod \"maas-api-84f96cb56b-6pqph\" (UID: \"8b7c2d5d-6aa0-441b-977f-5fb7cc48e394\") " pod="opendatahub/maas-api-84f96cb56b-6pqph" Apr 23 01:19:50.847934 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:50.847913 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394-maas-api-tls\") pod \"maas-api-84f96cb56b-6pqph\" (UID: \"8b7c2d5d-6aa0-441b-977f-5fb7cc48e394\") " pod="opendatahub/maas-api-84f96cb56b-6pqph" Apr 23 01:19:51.003968 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:51.003922 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-84f96cb56b-6pqph" Apr 23 01:19:51.129879 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:51.129853 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-84f96cb56b-6pqph"] Apr 23 01:19:51.131748 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:19:51.131719 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b7c2d5d_6aa0_441b_977f_5fb7cc48e394.slice/crio-2ca30c39bfd8168e24bec45625ea46f357e7154e63bcfa2c35b0ee51e6e85aa0 WatchSource:0}: Error finding container 2ca30c39bfd8168e24bec45625ea46f357e7154e63bcfa2c35b0ee51e6e85aa0: Status 404 returned error can't find the container with id 2ca30c39bfd8168e24bec45625ea46f357e7154e63bcfa2c35b0ee51e6e85aa0 Apr 23 01:19:51.363885 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:51.363846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-84f96cb56b-6pqph" event={"ID":"8b7c2d5d-6aa0-441b-977f-5fb7cc48e394","Type":"ContainerStarted","Data":"2ca30c39bfd8168e24bec45625ea46f357e7154e63bcfa2c35b0ee51e6e85aa0"} Apr 23 01:19:53.375519 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:53.375482 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-84f96cb56b-6pqph" event={"ID":"8b7c2d5d-6aa0-441b-977f-5fb7cc48e394","Type":"ContainerStarted","Data":"23cc302a8dccc00c091a1e73636625d872e70be96ad8fa7c3e8c67fac8fc85e2"} Apr 23 01:19:53.376037 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:53.375637 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-84f96cb56b-6pqph" Apr 23 01:19:53.390114 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:53.390052 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-84f96cb56b-6pqph" podStartSLOduration=2.121944175 podStartE2EDuration="3.390032585s" podCreationTimestamp="2026-04-23 01:19:50 +0000 UTC" 
firstStartedPulling="2026-04-23 01:19:51.133234003 +0000 UTC m=+590.308029615" lastFinishedPulling="2026-04-23 01:19:52.401322415 +0000 UTC m=+591.576118025" observedRunningTime="2026-04-23 01:19:53.388900637 +0000 UTC m=+592.563696271" watchObservedRunningTime="2026-04-23 01:19:53.390032585 +0000 UTC m=+592.564828218" Apr 23 01:19:57.354639 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:57.354608 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7ddc769ff4-xnjvf" Apr 23 01:19:57.391068 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:57.391031 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-8d66d4b94-dd444"] Apr 23 01:19:57.391292 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:57.391255 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-8d66d4b94-dd444" podUID="9e56a94e-1c91-42e0-8788-73b97908d9f0" containerName="manager" containerID="cri-o://e9dd94008c8f1cfc250243816728a2f290fb10594426ddee12e509f5bc9ceb0a" gracePeriod=10 Apr 23 01:19:57.625434 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:57.625411 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-8d66d4b94-dd444" Apr 23 01:19:57.811745 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:57.811691 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kvn5\" (UniqueName: \"kubernetes.io/projected/9e56a94e-1c91-42e0-8788-73b97908d9f0-kube-api-access-9kvn5\") pod \"9e56a94e-1c91-42e0-8788-73b97908d9f0\" (UID: \"9e56a94e-1c91-42e0-8788-73b97908d9f0\") " Apr 23 01:19:57.813861 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:57.813832 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e56a94e-1c91-42e0-8788-73b97908d9f0-kube-api-access-9kvn5" (OuterVolumeSpecName: "kube-api-access-9kvn5") pod "9e56a94e-1c91-42e0-8788-73b97908d9f0" (UID: "9e56a94e-1c91-42e0-8788-73b97908d9f0"). InnerVolumeSpecName "kube-api-access-9kvn5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:19:57.912697 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:57.912598 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9kvn5\" (UniqueName: \"kubernetes.io/projected/9e56a94e-1c91-42e0-8788-73b97908d9f0-kube-api-access-9kvn5\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\"" Apr 23 01:19:58.394791 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:58.394751 2576 generic.go:358] "Generic (PLEG): container finished" podID="9e56a94e-1c91-42e0-8788-73b97908d9f0" containerID="e9dd94008c8f1cfc250243816728a2f290fb10594426ddee12e509f5bc9ceb0a" exitCode=0 Apr 23 01:19:58.395231 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:58.394822 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-8d66d4b94-dd444" Apr 23 01:19:58.395231 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:58.394834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8d66d4b94-dd444" event={"ID":"9e56a94e-1c91-42e0-8788-73b97908d9f0","Type":"ContainerDied","Data":"e9dd94008c8f1cfc250243816728a2f290fb10594426ddee12e509f5bc9ceb0a"} Apr 23 01:19:58.395231 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:58.394878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8d66d4b94-dd444" event={"ID":"9e56a94e-1c91-42e0-8788-73b97908d9f0","Type":"ContainerDied","Data":"2c856d6148eb60431a431264e31f28dd2c001b50a4542893491c5f8ee3ae349e"} Apr 23 01:19:58.395231 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:58.394895 2576 scope.go:117] "RemoveContainer" containerID="e9dd94008c8f1cfc250243816728a2f290fb10594426ddee12e509f5bc9ceb0a" Apr 23 01:19:58.403108 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:58.403086 2576 scope.go:117] "RemoveContainer" containerID="e9dd94008c8f1cfc250243816728a2f290fb10594426ddee12e509f5bc9ceb0a" Apr 23 01:19:58.403394 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:19:58.403374 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9dd94008c8f1cfc250243816728a2f290fb10594426ddee12e509f5bc9ceb0a\": container with ID starting with e9dd94008c8f1cfc250243816728a2f290fb10594426ddee12e509f5bc9ceb0a not found: ID does not exist" containerID="e9dd94008c8f1cfc250243816728a2f290fb10594426ddee12e509f5bc9ceb0a" Apr 23 01:19:58.403443 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:58.403405 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9dd94008c8f1cfc250243816728a2f290fb10594426ddee12e509f5bc9ceb0a"} err="failed to get container status \"e9dd94008c8f1cfc250243816728a2f290fb10594426ddee12e509f5bc9ceb0a\": rpc error: code = 
NotFound desc = could not find container \"e9dd94008c8f1cfc250243816728a2f290fb10594426ddee12e509f5bc9ceb0a\": container with ID starting with e9dd94008c8f1cfc250243816728a2f290fb10594426ddee12e509f5bc9ceb0a not found: ID does not exist" Apr 23 01:19:58.414274 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:58.414245 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-8d66d4b94-dd444"] Apr 23 01:19:58.417743 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:58.417718 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-8d66d4b94-dd444"] Apr 23 01:19:59.385282 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:59.385245 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e56a94e-1c91-42e0-8788-73b97908d9f0" path="/var/lib/kubelet/pods/9e56a94e-1c91-42e0-8788-73b97908d9f0/volumes" Apr 23 01:19:59.385639 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:19:59.385621 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-84f96cb56b-6pqph" Apr 23 01:20:01.302060 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:01.302037 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log" Apr 23 01:20:01.303038 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:01.303016 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log" Apr 23 01:20:16.897973 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:16.897855 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6d754fdd4d-jv4nj"] Apr 23 01:20:16.900104 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:16.898255 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e56a94e-1c91-42e0-8788-73b97908d9f0" containerName="manager" Apr 23 
01:20:16.900104 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:16.898267 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e56a94e-1c91-42e0-8788-73b97908d9f0" containerName="manager"
Apr 23 01:20:16.900104 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:16.898331 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e56a94e-1c91-42e0-8788-73b97908d9f0" containerName="manager"
Apr 23 01:20:16.900243 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:16.900104 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6d754fdd4d-jv4nj"
Apr 23 01:20:16.908665 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:16.908639 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6d754fdd4d-jv4nj"]
Apr 23 01:20:16.962549 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:16.962506 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmv2x\" (UniqueName: \"kubernetes.io/projected/f03e77fa-3299-4f51-9eac-ef078bb6754e-kube-api-access-vmv2x\") pod \"maas-api-6d754fdd4d-jv4nj\" (UID: \"f03e77fa-3299-4f51-9eac-ef078bb6754e\") " pod="opendatahub/maas-api-6d754fdd4d-jv4nj"
Apr 23 01:20:16.962740 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:16.962686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f03e77fa-3299-4f51-9eac-ef078bb6754e-maas-api-tls\") pod \"maas-api-6d754fdd4d-jv4nj\" (UID: \"f03e77fa-3299-4f51-9eac-ef078bb6754e\") " pod="opendatahub/maas-api-6d754fdd4d-jv4nj"
Apr 23 01:20:17.063390 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:17.063354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f03e77fa-3299-4f51-9eac-ef078bb6754e-maas-api-tls\") pod \"maas-api-6d754fdd4d-jv4nj\" (UID: \"f03e77fa-3299-4f51-9eac-ef078bb6754e\") " pod="opendatahub/maas-api-6d754fdd4d-jv4nj"
Apr 23 01:20:17.063558 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:17.063401 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmv2x\" (UniqueName: \"kubernetes.io/projected/f03e77fa-3299-4f51-9eac-ef078bb6754e-kube-api-access-vmv2x\") pod \"maas-api-6d754fdd4d-jv4nj\" (UID: \"f03e77fa-3299-4f51-9eac-ef078bb6754e\") " pod="opendatahub/maas-api-6d754fdd4d-jv4nj"
Apr 23 01:20:17.065792 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:17.065760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f03e77fa-3299-4f51-9eac-ef078bb6754e-maas-api-tls\") pod \"maas-api-6d754fdd4d-jv4nj\" (UID: \"f03e77fa-3299-4f51-9eac-ef078bb6754e\") " pod="opendatahub/maas-api-6d754fdd4d-jv4nj"
Apr 23 01:20:17.071110 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:17.071084 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmv2x\" (UniqueName: \"kubernetes.io/projected/f03e77fa-3299-4f51-9eac-ef078bb6754e-kube-api-access-vmv2x\") pod \"maas-api-6d754fdd4d-jv4nj\" (UID: \"f03e77fa-3299-4f51-9eac-ef078bb6754e\") " pod="opendatahub/maas-api-6d754fdd4d-jv4nj"
Apr 23 01:20:17.212074 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:17.212037 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6d754fdd4d-jv4nj"
Apr 23 01:20:17.336435 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:17.336405 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6d754fdd4d-jv4nj"]
Apr 23 01:20:17.338414 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:20:17.338374 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf03e77fa_3299_4f51_9eac_ef078bb6754e.slice/crio-ddce0d16e1fcab2bb7919275becf49eb319c81d1a7a057fab3a85d7a446e8529 WatchSource:0}: Error finding container ddce0d16e1fcab2bb7919275becf49eb319c81d1a7a057fab3a85d7a446e8529: Status 404 returned error can't find the container with id ddce0d16e1fcab2bb7919275becf49eb319c81d1a7a057fab3a85d7a446e8529
Apr 23 01:20:17.339788 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:17.339768 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 01:20:17.485111 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:17.485022 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6d754fdd4d-jv4nj" event={"ID":"f03e77fa-3299-4f51-9eac-ef078bb6754e","Type":"ContainerStarted","Data":"ddce0d16e1fcab2bb7919275becf49eb319c81d1a7a057fab3a85d7a446e8529"}
Apr 23 01:20:19.492824 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:19.492784 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6d754fdd4d-jv4nj" event={"ID":"f03e77fa-3299-4f51-9eac-ef078bb6754e","Type":"ContainerStarted","Data":"f951d77c810e84b8e1524d549718bbabf2b8cbc9df77dc8c00f51e7b62dc779f"}
Apr 23 01:20:19.493255 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:19.492888 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6d754fdd4d-jv4nj"
Apr 23 01:20:19.508578 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:19.508521 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6d754fdd4d-jv4nj" podStartSLOduration=1.8254394299999999 podStartE2EDuration="3.508508184s" podCreationTimestamp="2026-04-23 01:20:16 +0000 UTC" firstStartedPulling="2026-04-23 01:20:17.339951747 +0000 UTC m=+616.514747358" lastFinishedPulling="2026-04-23 01:20:19.023020488 +0000 UTC m=+618.197816112" observedRunningTime="2026-04-23 01:20:19.507723392 +0000 UTC m=+618.682519047" watchObservedRunningTime="2026-04-23 01:20:19.508508184 +0000 UTC m=+618.683303814"
Apr 23 01:20:22.868419 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:22.868380 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"]
Apr 23 01:20:22.871321 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:22.871295 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:22.873497 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:22.873463 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 23 01:20:22.873497 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:22.873465 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-2r2t2\""
Apr 23 01:20:22.873739 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:22.873476 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 23 01:20:22.873739 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:22.873576 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\""
Apr 23 01:20:22.881668 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:22.881645 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"]
Apr 23 01:20:22.915066 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:22.915027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs4rf\" (UniqueName: \"kubernetes.io/projected/956ecec2-c7b5-4b58-9928-248140788534-kube-api-access-rs4rf\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:22.915233 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:22.915072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/956ecec2-c7b5-4b58-9928-248140788534-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:22.915233 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:22.915145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/956ecec2-c7b5-4b58-9928-248140788534-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:22.915233 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:22.915169 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/956ecec2-c7b5-4b58-9928-248140788534-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:22.915233 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:22.915196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/956ecec2-c7b5-4b58-9928-248140788534-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:22.915367 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:22.915304 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/956ecec2-c7b5-4b58-9928-248140788534-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:23.016372 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:23.016333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/956ecec2-c7b5-4b58-9928-248140788534-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:23.016573 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:23.016384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/956ecec2-c7b5-4b58-9928-248140788534-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:23.016573 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:23.016422 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rs4rf\" (UniqueName: \"kubernetes.io/projected/956ecec2-c7b5-4b58-9928-248140788534-kube-api-access-rs4rf\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:23.016573 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:23.016443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/956ecec2-c7b5-4b58-9928-248140788534-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:23.016573 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:23.016503 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/956ecec2-c7b5-4b58-9928-248140788534-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:23.016792 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:23.016621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/956ecec2-c7b5-4b58-9928-248140788534-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:23.016915 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:23.016889 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/956ecec2-c7b5-4b58-9928-248140788534-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:23.016969 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:23.016899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/956ecec2-c7b5-4b58-9928-248140788534-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:23.016969 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:23.016936 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/956ecec2-c7b5-4b58-9928-248140788534-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:23.018784 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:23.018759 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/956ecec2-c7b5-4b58-9928-248140788534-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:23.019043 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:23.019028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/956ecec2-c7b5-4b58-9928-248140788534-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:23.023851 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:23.023807 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs4rf\" (UniqueName: \"kubernetes.io/projected/956ecec2-c7b5-4b58-9928-248140788534-kube-api-access-rs4rf\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-47bcc\" (UID: \"956ecec2-c7b5-4b58-9928-248140788534\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:23.181105 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:23.181013 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"
Apr 23 01:20:23.312172 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:23.311933 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc"]
Apr 23 01:20:23.314644 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:20:23.314616 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod956ecec2_c7b5_4b58_9928_248140788534.slice/crio-2676ed830a0bfb5bcb5506aa5da69fd94050ad667b36afffe9777afd91e32ae0 WatchSource:0}: Error finding container 2676ed830a0bfb5bcb5506aa5da69fd94050ad667b36afffe9777afd91e32ae0: Status 404 returned error can't find the container with id 2676ed830a0bfb5bcb5506aa5da69fd94050ad667b36afffe9777afd91e32ae0
Apr 23 01:20:23.508401 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:23.508364 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc" event={"ID":"956ecec2-c7b5-4b58-9928-248140788534","Type":"ContainerStarted","Data":"2676ed830a0bfb5bcb5506aa5da69fd94050ad667b36afffe9777afd91e32ae0"}
Apr 23 01:20:25.503571 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:25.503538 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6d754fdd4d-jv4nj"
Apr 23 01:20:25.540770 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:25.540706 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-84f96cb56b-6pqph"]
Apr 23 01:20:25.541116 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:25.541083 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-84f96cb56b-6pqph" podUID="8b7c2d5d-6aa0-441b-977f-5fb7cc48e394" containerName="maas-api" containerID="cri-o://23cc302a8dccc00c091a1e73636625d872e70be96ad8fa7c3e8c67fac8fc85e2" gracePeriod=30
Apr 23 01:20:25.780802 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:25.780774 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-84f96cb56b-6pqph"
Apr 23 01:20:25.844893 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:25.844865 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7r65\" (UniqueName: \"kubernetes.io/projected/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394-kube-api-access-s7r65\") pod \"8b7c2d5d-6aa0-441b-977f-5fb7cc48e394\" (UID: \"8b7c2d5d-6aa0-441b-977f-5fb7cc48e394\") "
Apr 23 01:20:25.845094 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:25.844977 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394-maas-api-tls\") pod \"8b7c2d5d-6aa0-441b-977f-5fb7cc48e394\" (UID: \"8b7c2d5d-6aa0-441b-977f-5fb7cc48e394\") "
Apr 23 01:20:25.847347 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:25.847300 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394-kube-api-access-s7r65" (OuterVolumeSpecName: "kube-api-access-s7r65") pod "8b7c2d5d-6aa0-441b-977f-5fb7cc48e394" (UID: "8b7c2d5d-6aa0-441b-977f-5fb7cc48e394"). InnerVolumeSpecName "kube-api-access-s7r65". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:20:25.847450 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:25.847424 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "8b7c2d5d-6aa0-441b-977f-5fb7cc48e394" (UID: "8b7c2d5d-6aa0-441b-977f-5fb7cc48e394"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 01:20:25.946106 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:25.946051 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s7r65\" (UniqueName: \"kubernetes.io/projected/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394-kube-api-access-s7r65\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:20:25.946106 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:25.946093 2576 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394-maas-api-tls\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\""
Apr 23 01:20:26.522819 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:26.522782 2576 generic.go:358] "Generic (PLEG): container finished" podID="8b7c2d5d-6aa0-441b-977f-5fb7cc48e394" containerID="23cc302a8dccc00c091a1e73636625d872e70be96ad8fa7c3e8c67fac8fc85e2" exitCode=0
Apr 23 01:20:26.523292 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:26.522850 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-84f96cb56b-6pqph"
Apr 23 01:20:26.523292 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:26.522876 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-84f96cb56b-6pqph" event={"ID":"8b7c2d5d-6aa0-441b-977f-5fb7cc48e394","Type":"ContainerDied","Data":"23cc302a8dccc00c091a1e73636625d872e70be96ad8fa7c3e8c67fac8fc85e2"}
Apr 23 01:20:26.523292 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:26.522929 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-84f96cb56b-6pqph" event={"ID":"8b7c2d5d-6aa0-441b-977f-5fb7cc48e394","Type":"ContainerDied","Data":"2ca30c39bfd8168e24bec45625ea46f357e7154e63bcfa2c35b0ee51e6e85aa0"}
Apr 23 01:20:26.523292 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:26.522950 2576 scope.go:117] "RemoveContainer" containerID="23cc302a8dccc00c091a1e73636625d872e70be96ad8fa7c3e8c67fac8fc85e2"
Apr 23 01:20:26.537096 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:26.537074 2576 scope.go:117] "RemoveContainer" containerID="23cc302a8dccc00c091a1e73636625d872e70be96ad8fa7c3e8c67fac8fc85e2"
Apr 23 01:20:26.537480 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:20:26.537434 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23cc302a8dccc00c091a1e73636625d872e70be96ad8fa7c3e8c67fac8fc85e2\": container with ID starting with 23cc302a8dccc00c091a1e73636625d872e70be96ad8fa7c3e8c67fac8fc85e2 not found: ID does not exist" containerID="23cc302a8dccc00c091a1e73636625d872e70be96ad8fa7c3e8c67fac8fc85e2"
Apr 23 01:20:26.537595 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:26.537475 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23cc302a8dccc00c091a1e73636625d872e70be96ad8fa7c3e8c67fac8fc85e2"} err="failed to get container status \"23cc302a8dccc00c091a1e73636625d872e70be96ad8fa7c3e8c67fac8fc85e2\": rpc error: code = NotFound desc = could not find container \"23cc302a8dccc00c091a1e73636625d872e70be96ad8fa7c3e8c67fac8fc85e2\": container with ID starting with 23cc302a8dccc00c091a1e73636625d872e70be96ad8fa7c3e8c67fac8fc85e2 not found: ID does not exist"
Apr 23 01:20:26.548025 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:26.547952 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-84f96cb56b-6pqph"]
Apr 23 01:20:26.550724 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:26.550698 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-84f96cb56b-6pqph"]
Apr 23 01:20:27.386964 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:27.386763 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b7c2d5d-6aa0-441b-977f-5fb7cc48e394" path="/var/lib/kubelet/pods/8b7c2d5d-6aa0-441b-977f-5fb7cc48e394/volumes"
Apr 23 01:20:30.541255 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.541217 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc" event={"ID":"956ecec2-c7b5-4b58-9928-248140788534","Type":"ContainerStarted","Data":"2ed118e0b16903315ae6715ee3257331885edd4f674d9c2712fcbbfab3d14db8"}
Apr 23 01:20:30.567524 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.567486 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"]
Apr 23 01:20:30.568085 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.568063 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b7c2d5d-6aa0-441b-977f-5fb7cc48e394" containerName="maas-api"
Apr 23 01:20:30.568085 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.568086 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b7c2d5d-6aa0-441b-977f-5fb7cc48e394" containerName="maas-api"
Apr 23 01:20:30.568264 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.568184 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b7c2d5d-6aa0-441b-977f-5fb7cc48e394" containerName="maas-api"
Apr 23 01:20:30.570245 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.570224 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.573196 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.573170 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\""
Apr 23 01:20:30.579205 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.579176 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"]
Apr 23 01:20:30.690563 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.690527 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d286971-ed11-4ddf-bad8-1b5db9d62370-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.690746 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.690578 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d286971-ed11-4ddf-bad8-1b5db9d62370-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.690746 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.690606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d286971-ed11-4ddf-bad8-1b5db9d62370-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.690746 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.690630 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8p7t\" (UniqueName: \"kubernetes.io/projected/3d286971-ed11-4ddf-bad8-1b5db9d62370-kube-api-access-f8p7t\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.690746 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.690719 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d286971-ed11-4ddf-bad8-1b5db9d62370-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.690903 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.690754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d286971-ed11-4ddf-bad8-1b5db9d62370-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.792096 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.792007 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d286971-ed11-4ddf-bad8-1b5db9d62370-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.792096 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.792057 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d286971-ed11-4ddf-bad8-1b5db9d62370-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.792096 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.792100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d286971-ed11-4ddf-bad8-1b5db9d62370-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.792371 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.792120 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d286971-ed11-4ddf-bad8-1b5db9d62370-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.792371 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.792141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d286971-ed11-4ddf-bad8-1b5db9d62370-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.792371 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.792231 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8p7t\" (UniqueName: \"kubernetes.io/projected/3d286971-ed11-4ddf-bad8-1b5db9d62370-kube-api-access-f8p7t\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.792555 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.792530 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d286971-ed11-4ddf-bad8-1b5db9d62370-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.792624 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.792553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d286971-ed11-4ddf-bad8-1b5db9d62370-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.792729 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.792690 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d286971-ed11-4ddf-bad8-1b5db9d62370-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.794527 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.794500 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d286971-ed11-4ddf-bad8-1b5db9d62370-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.794825 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.794805 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d286971-ed11-4ddf-bad8-1b5db9d62370-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.799150 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.799131 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8p7t\" (UniqueName: \"kubernetes.io/projected/3d286971-ed11-4ddf-bad8-1b5db9d62370-kube-api-access-f8p7t\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd\" (UID: \"3d286971-ed11-4ddf-bad8-1b5db9d62370\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:30.885259 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:30.885203 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:31.046473 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:31.046307 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"]
Apr 23 01:20:31.049547 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:20:31.049492 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d286971_ed11_4ddf_bad8_1b5db9d62370.slice/crio-9b65e8b7cb883cddeac283e0d910590118b4aa2dc0974ce8752c09ffedfcd130 WatchSource:0}: Error finding container 9b65e8b7cb883cddeac283e0d910590118b4aa2dc0974ce8752c09ffedfcd130: Status 404 returned error can't find the container with id 9b65e8b7cb883cddeac283e0d910590118b4aa2dc0974ce8752c09ffedfcd130
Apr 23 01:20:31.547587 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:31.547548 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd" event={"ID":"3d286971-ed11-4ddf-bad8-1b5db9d62370","Type":"ContainerStarted","Data":"a36b79c6af6ac0c0014f8ebfba63b702deef35d3935a152f1d9bfe6685e89dec"}
Apr 23 01:20:31.547587 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:31.547589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd" event={"ID":"3d286971-ed11-4ddf-bad8-1b5db9d62370","Type":"ContainerStarted","Data":"9b65e8b7cb883cddeac283e0d910590118b4aa2dc0974ce8752c09ffedfcd130"}
Apr 23 01:20:37.573259 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:37.573216 2576 generic.go:358] "Generic (PLEG): container finished" podID="3d286971-ed11-4ddf-bad8-1b5db9d62370" containerID="a36b79c6af6ac0c0014f8ebfba63b702deef35d3935a152f1d9bfe6685e89dec" exitCode=0
Apr 23 01:20:37.573672 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:37.573286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd" event={"ID":"3d286971-ed11-4ddf-bad8-1b5db9d62370","Type":"ContainerDied","Data":"a36b79c6af6ac0c0014f8ebfba63b702deef35d3935a152f1d9bfe6685e89dec"}
Apr 23 01:20:39.583708 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:39.583672 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd" event={"ID":"3d286971-ed11-4ddf-bad8-1b5db9d62370","Type":"ContainerStarted","Data":"a28d5483e5f4437e7f81e0d403eae9b0c731228a2c61e077fa663830797df3b1"}
Apr 23 01:20:39.584176 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:39.583893 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd"
Apr 23 01:20:39.585004 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:39.584967 2576 generic.go:358] "Generic (PLEG): container finished" podID="956ecec2-c7b5-4b58-9928-248140788534" containerID="2ed118e0b16903315ae6715ee3257331885edd4f674d9c2712fcbbfab3d14db8" exitCode=0
Apr 23 01:20:39.585099 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:39.585021 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc" event={"ID":"956ecec2-c7b5-4b58-9928-248140788534","Type":"ContainerDied","Data":"2ed118e0b16903315ae6715ee3257331885edd4f674d9c2712fcbbfab3d14db8"}
Apr 23 01:20:39.600956 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:39.600895 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd" podStartSLOduration=8.480721395 podStartE2EDuration="9.600878066s" podCreationTimestamp="2026-04-23 01:20:30 +0000 UTC" firstStartedPulling="2026-04-23 01:20:37.574106777 +0000 UTC m=+636.748902387" lastFinishedPulling="2026-04-23 01:20:38.694263443 +0000 UTC m=+637.869059058" observedRunningTime="2026-04-23 01:20:39.599401029 +0000 UTC m=+638.774196658" watchObservedRunningTime="2026-04-23 01:20:39.600878066 +0000 UTC m=+638.775673699"
Apr 23 01:20:40.590645 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:40.590607 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc" event={"ID":"956ecec2-c7b5-4b58-9928-248140788534","Type":"ContainerStarted","Data":"327761dc2051309814c693db94f903ce8ae4ae3b31605d5c591c09cb08aee285"}
Apr 23 01:20:40.608051 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:40.607964 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc" podStartSLOduration=2.1559627949999998 podStartE2EDuration="18.607942965s" podCreationTimestamp="2026-04-23 01:20:22 +0000 UTC" firstStartedPulling="2026-04-23 01:20:23.316407867 +0000 UTC m=+622.491203478" lastFinishedPulling="2026-04-23 01:20:39.768388038 +0000 UTC m=+638.943183648" observedRunningTime="2026-04-23 01:20:40.60576577 +0000 UTC m=+639.780561417" watchObservedRunningTime="2026-04-23 01:20:40.607942965 +0000 UTC m=+639.782738598"
Apr 23 01:20:41.980222 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:41.980164 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4"]
Apr 23 01:20:41.983385 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:41.983367 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4"
Apr 23 01:20:41.985577 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:41.985554 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\""
Apr 23 01:20:41.991293 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:41.990939 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4"]
Apr 23 01:20:42.099426 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.099383 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/622c698c-073b-423c-a74d-f6e6d2f6a03c-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4"
Apr 23 01:20:42.099623 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.099443 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvhz6\" (UniqueName: \"kubernetes.io/projected/622c698c-073b-423c-a74d-f6e6d2f6a03c-kube-api-access-rvhz6\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4"
Apr 23 01:20:42.099623 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.099546 2576 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/622c698c-073b-423c-a74d-f6e6d2f6a03c-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.099623 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.099590 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/622c698c-073b-423c-a74d-f6e6d2f6a03c-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.099759 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.099658 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/622c698c-073b-423c-a74d-f6e6d2f6a03c-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.099759 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.099703 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/622c698c-073b-423c-a74d-f6e6d2f6a03c-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.200662 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.200621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"home\" (UniqueName: \"kubernetes.io/empty-dir/622c698c-073b-423c-a74d-f6e6d2f6a03c-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.200662 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.200671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/622c698c-073b-423c-a74d-f6e6d2f6a03c-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.200844 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.200702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/622c698c-073b-423c-a74d-f6e6d2f6a03c-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.200844 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.200740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/622c698c-073b-423c-a74d-f6e6d2f6a03c-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.200925 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.200883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvhz6\" (UniqueName: \"kubernetes.io/projected/622c698c-073b-423c-a74d-f6e6d2f6a03c-kube-api-access-rvhz6\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.200966 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.200951 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/622c698c-073b-423c-a74d-f6e6d2f6a03c-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.201153 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.201127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/622c698c-073b-423c-a74d-f6e6d2f6a03c-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.201153 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.201146 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/622c698c-073b-423c-a74d-f6e6d2f6a03c-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.201345 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.201322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/622c698c-073b-423c-a74d-f6e6d2f6a03c-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.203171 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.203142 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/622c698c-073b-423c-a74d-f6e6d2f6a03c-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.203258 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.203236 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/622c698c-073b-423c-a74d-f6e6d2f6a03c-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.208008 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.207967 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvhz6\" (UniqueName: \"kubernetes.io/projected/622c698c-073b-423c-a74d-f6e6d2f6a03c-kube-api-access-rvhz6\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4\" (UID: \"622c698c-073b-423c-a74d-f6e6d2f6a03c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.295810 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.295716 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:42.427711 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.423557 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4"] Apr 23 01:20:42.599601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.599513 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" event={"ID":"622c698c-073b-423c-a74d-f6e6d2f6a03c","Type":"ContainerStarted","Data":"988366e1193fb02fbe4b08ca1f7eef5e8918b899fc4f0814dd70616ec71bf86b"} Apr 23 01:20:42.599601 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:42.599559 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" event={"ID":"622c698c-073b-423c-a74d-f6e6d2f6a03c","Type":"ContainerStarted","Data":"cc8b2a77e99e97cffe1b021bbb5d9fd2897c3131c9c0025bf0ce2d3e8af03751"} Apr 23 01:20:48.621367 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:48.621332 2576 generic.go:358] "Generic (PLEG): container finished" podID="622c698c-073b-423c-a74d-f6e6d2f6a03c" containerID="988366e1193fb02fbe4b08ca1f7eef5e8918b899fc4f0814dd70616ec71bf86b" exitCode=0 Apr 23 01:20:48.621836 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:48.621384 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" event={"ID":"622c698c-073b-423c-a74d-f6e6d2f6a03c","Type":"ContainerDied","Data":"988366e1193fb02fbe4b08ca1f7eef5e8918b899fc4f0814dd70616ec71bf86b"} Apr 23 01:20:49.627790 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:49.627746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" 
event={"ID":"622c698c-073b-423c-a74d-f6e6d2f6a03c","Type":"ContainerStarted","Data":"04dd786aee8a7ec5e17ce372da214e725511fabef93fab804ae1c4bce47088c8"} Apr 23 01:20:49.628281 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:49.628015 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:20:49.646418 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:49.646352 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" podStartSLOduration=8.250447116 podStartE2EDuration="8.646333257s" podCreationTimestamp="2026-04-23 01:20:41 +0000 UTC" firstStartedPulling="2026-04-23 01:20:48.622028628 +0000 UTC m=+647.796824237" lastFinishedPulling="2026-04-23 01:20:49.017914764 +0000 UTC m=+648.192710378" observedRunningTime="2026-04-23 01:20:49.645217618 +0000 UTC m=+648.820013252" watchObservedRunningTime="2026-04-23 01:20:49.646333257 +0000 UTC m=+648.821128889" Apr 23 01:20:50.591849 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:50.591807 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc" Apr 23 01:20:50.611106 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:50.611073 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-47bcc" Apr 23 01:20:50.611303 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:20:50.611285 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd" Apr 23 01:21:00.645379 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:21:00.645342 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4" Apr 23 01:23:10.726970 ip-10-0-137-21 
kubenswrapper[2576]: I0423 01:23:10.726923 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7ddc769ff4-xnjvf"] Apr 23 01:23:10.727520 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:10.727277 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7ddc769ff4-xnjvf" podUID="34c57ec4-6b59-4ba3-be1c-52b32d6db63d" containerName="manager" containerID="cri-o://bc0ef47bcc0a10c80d562a2e205e627157c308d03a87d776a894bde3daa95668" gracePeriod=10 Apr 23 01:23:10.969922 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:10.969897 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7ddc769ff4-xnjvf" Apr 23 01:23:11.067087 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.066932 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfsrt\" (UniqueName: \"kubernetes.io/projected/34c57ec4-6b59-4ba3-be1c-52b32d6db63d-kube-api-access-dfsrt\") pod \"34c57ec4-6b59-4ba3-be1c-52b32d6db63d\" (UID: \"34c57ec4-6b59-4ba3-be1c-52b32d6db63d\") " Apr 23 01:23:11.069090 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.069068 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34c57ec4-6b59-4ba3-be1c-52b32d6db63d-kube-api-access-dfsrt" (OuterVolumeSpecName: "kube-api-access-dfsrt") pod "34c57ec4-6b59-4ba3-be1c-52b32d6db63d" (UID: "34c57ec4-6b59-4ba3-be1c-52b32d6db63d"). InnerVolumeSpecName "kube-api-access-dfsrt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:23:11.138532 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.138501 2576 generic.go:358] "Generic (PLEG): container finished" podID="34c57ec4-6b59-4ba3-be1c-52b32d6db63d" containerID="bc0ef47bcc0a10c80d562a2e205e627157c308d03a87d776a894bde3daa95668" exitCode=0 Apr 23 01:23:11.138532 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.138542 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7ddc769ff4-xnjvf" event={"ID":"34c57ec4-6b59-4ba3-be1c-52b32d6db63d","Type":"ContainerDied","Data":"bc0ef47bcc0a10c80d562a2e205e627157c308d03a87d776a894bde3daa95668"} Apr 23 01:23:11.138757 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.138563 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7ddc769ff4-xnjvf" event={"ID":"34c57ec4-6b59-4ba3-be1c-52b32d6db63d","Type":"ContainerDied","Data":"4595a3e1b2c7a8fd72a3729a6bcb4d6b3d799f9850366b251b89137d7315dfa6"} Apr 23 01:23:11.138757 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.138571 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7ddc769ff4-xnjvf" Apr 23 01:23:11.138757 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.138579 2576 scope.go:117] "RemoveContainer" containerID="bc0ef47bcc0a10c80d562a2e205e627157c308d03a87d776a894bde3daa95668" Apr 23 01:23:11.147438 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.147420 2576 scope.go:117] "RemoveContainer" containerID="bc0ef47bcc0a10c80d562a2e205e627157c308d03a87d776a894bde3daa95668" Apr 23 01:23:11.147690 ip-10-0-137-21 kubenswrapper[2576]: E0423 01:23:11.147673 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc0ef47bcc0a10c80d562a2e205e627157c308d03a87d776a894bde3daa95668\": container with ID starting with bc0ef47bcc0a10c80d562a2e205e627157c308d03a87d776a894bde3daa95668 not found: ID does not exist" containerID="bc0ef47bcc0a10c80d562a2e205e627157c308d03a87d776a894bde3daa95668" Apr 23 01:23:11.147735 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.147697 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc0ef47bcc0a10c80d562a2e205e627157c308d03a87d776a894bde3daa95668"} err="failed to get container status \"bc0ef47bcc0a10c80d562a2e205e627157c308d03a87d776a894bde3daa95668\": rpc error: code = NotFound desc = could not find container \"bc0ef47bcc0a10c80d562a2e205e627157c308d03a87d776a894bde3daa95668\": container with ID starting with bc0ef47bcc0a10c80d562a2e205e627157c308d03a87d776a894bde3daa95668 not found: ID does not exist" Apr 23 01:23:11.158215 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.158185 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7ddc769ff4-xnjvf"] Apr 23 01:23:11.161240 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.161211 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7ddc769ff4-xnjvf"] Apr 23 01:23:11.168593 ip-10-0-137-21 kubenswrapper[2576]: 
I0423 01:23:11.168570 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dfsrt\" (UniqueName: \"kubernetes.io/projected/34c57ec4-6b59-4ba3-be1c-52b32d6db63d-kube-api-access-dfsrt\") on node \"ip-10-0-137-21.ec2.internal\" DevicePath \"\"" Apr 23 01:23:11.385514 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.385435 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34c57ec4-6b59-4ba3-be1c-52b32d6db63d" path="/var/lib/kubelet/pods/34c57ec4-6b59-4ba3-be1c-52b32d6db63d/volumes" Apr 23 01:23:11.867041 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.867002 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7ddc769ff4-gh2mj"] Apr 23 01:23:11.867433 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.867386 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34c57ec4-6b59-4ba3-be1c-52b32d6db63d" containerName="manager" Apr 23 01:23:11.867433 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.867397 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c57ec4-6b59-4ba3-be1c-52b32d6db63d" containerName="manager" Apr 23 01:23:11.867508 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.867471 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="34c57ec4-6b59-4ba3-be1c-52b32d6db63d" containerName="manager" Apr 23 01:23:11.871821 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.871797 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7ddc769ff4-gh2mj" Apr 23 01:23:11.873995 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.873960 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-kcbm4\"" Apr 23 01:23:11.886186 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.886149 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7ddc769ff4-gh2mj"] Apr 23 01:23:11.974895 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:11.974837 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzh5j\" (UniqueName: \"kubernetes.io/projected/b7ec086f-0602-4318-8731-5b090e77f4cb-kube-api-access-qzh5j\") pod \"maas-controller-7ddc769ff4-gh2mj\" (UID: \"b7ec086f-0602-4318-8731-5b090e77f4cb\") " pod="opendatahub/maas-controller-7ddc769ff4-gh2mj" Apr 23 01:23:12.076311 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:12.076266 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzh5j\" (UniqueName: \"kubernetes.io/projected/b7ec086f-0602-4318-8731-5b090e77f4cb-kube-api-access-qzh5j\") pod \"maas-controller-7ddc769ff4-gh2mj\" (UID: \"b7ec086f-0602-4318-8731-5b090e77f4cb\") " pod="opendatahub/maas-controller-7ddc769ff4-gh2mj" Apr 23 01:23:12.084157 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:12.084125 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzh5j\" (UniqueName: \"kubernetes.io/projected/b7ec086f-0602-4318-8731-5b090e77f4cb-kube-api-access-qzh5j\") pod \"maas-controller-7ddc769ff4-gh2mj\" (UID: \"b7ec086f-0602-4318-8731-5b090e77f4cb\") " pod="opendatahub/maas-controller-7ddc769ff4-gh2mj" Apr 23 01:23:12.184497 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:12.184410 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7ddc769ff4-gh2mj" Apr 23 01:23:12.306263 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:12.306232 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7ddc769ff4-gh2mj"] Apr 23 01:23:12.309340 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:23:12.309309 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7ec086f_0602_4318_8731_5b090e77f4cb.slice/crio-d5e078846a3ba1b8ef91bd722f6bc13fab3a0f1028e1d9e2a1c57aea092dd5c9 WatchSource:0}: Error finding container d5e078846a3ba1b8ef91bd722f6bc13fab3a0f1028e1d9e2a1c57aea092dd5c9: Status 404 returned error can't find the container with id d5e078846a3ba1b8ef91bd722f6bc13fab3a0f1028e1d9e2a1c57aea092dd5c9 Apr 23 01:23:13.148614 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:13.148522 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7ddc769ff4-gh2mj" event={"ID":"b7ec086f-0602-4318-8731-5b090e77f4cb","Type":"ContainerStarted","Data":"a79263f887aa0180d764d22ea0313c3b246b5400d7185c3f552a2bcb879e2f0b"} Apr 23 01:23:13.148614 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:13.148562 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7ddc769ff4-gh2mj" event={"ID":"b7ec086f-0602-4318-8731-5b090e77f4cb","Type":"ContainerStarted","Data":"d5e078846a3ba1b8ef91bd722f6bc13fab3a0f1028e1d9e2a1c57aea092dd5c9"} Apr 23 01:23:13.149030 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:13.148653 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7ddc769ff4-gh2mj" Apr 23 01:23:13.163570 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:13.163521 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7ddc769ff4-gh2mj" podStartSLOduration=1.7289038479999999 podStartE2EDuration="2.163508325s" 
podCreationTimestamp="2026-04-23 01:23:11 +0000 UTC" firstStartedPulling="2026-04-23 01:23:12.310554417 +0000 UTC m=+791.485350027" lastFinishedPulling="2026-04-23 01:23:12.745158891 +0000 UTC m=+791.919954504" observedRunningTime="2026-04-23 01:23:13.161840053 +0000 UTC m=+792.336635707" watchObservedRunningTime="2026-04-23 01:23:13.163508325 +0000 UTC m=+792.338303957" Apr 23 01:23:24.157427 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:23:24.157392 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7ddc769ff4-gh2mj" Apr 23 01:25:01.328898 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:25:01.328818 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log" Apr 23 01:25:01.330627 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:25:01.330602 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log" Apr 23 01:25:01.371338 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:25:01.371304 2576 scope.go:117] "RemoveContainer" containerID="d156de4f657c77d84dd3a0f33e39c9e2811328305bc962cf7c39393d28f4ad2a" Apr 23 01:30:01.356070 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:30:01.356037 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log" Apr 23 01:30:01.360074 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:30:01.360053 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log" Apr 23 01:35:01.382815 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:35:01.382784 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log" Apr 23 01:35:01.387748 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:35:01.387724 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log" Apr 23 01:40:01.409811 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:40:01.409780 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log" Apr 23 01:40:01.414402 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:40:01.414382 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log" Apr 23 01:44:09.217128 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:09.217088 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-ffwjb_601a9f2f-90f1-4817-93d6-7215a72f670e/manager/0.log" Apr 23 01:44:09.328723 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:09.328687 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6d754fdd4d-jv4nj_f03e77fa-3299-4f51-9eac-ef078bb6754e/maas-api/0.log" Apr 23 01:44:09.436324 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:09.436284 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7ddc769ff4-gh2mj_b7ec086f-0602-4318-8731-5b090e77f4cb/manager/0.log" Apr 23 01:44:09.548603 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:09.548504 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-pmk8s_86503cb2-857e-4924-af95-6f47288ea75d/manager/1.log" Apr 23 01:44:09.901284 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:09.901186 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5fb5768b86-sk657_11222ec8-cf6a-464c-933d-8706b7de04c5/manager/0.log"
Apr 23 01:44:10.007391 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:10.007353 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-gh2kz_6881d7a2-e97b-4c63-af58-69de860a39b6/postgres/0.log"
Apr 23 01:44:12.003750 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:12.003668 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-skwpw_e6d04b86-cd57-409b-a905-d5596b672fde/manager/0.log"
Apr 23 01:44:12.450768 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:12.450683 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-fjhtp_2612f981-16aa-4f30-9b65-9ccad11b8949/discovery/0.log"
Apr 23 01:44:12.661240 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:12.661208 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-8596599875-qbwwc_e3a1be14-d458-4b2d-ab99-162e9f69c4ad/kube-auth-proxy/0.log"
Apr 23 01:44:13.207888 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:13.207849 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd_3d286971-ed11-4ddf-bad8-1b5db9d62370/storage-initializer/0.log"
Apr 23 01:44:13.214999 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:13.214955 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-f58pd_3d286971-ed11-4ddf-bad8-1b5db9d62370/main/0.log"
Apr 23 01:44:13.549235 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:13.549126 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4_622c698c-073b-423c-a74d-f6e6d2f6a03c/storage-initializer/0.log"
Apr 23 01:44:13.555591 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:13.555553 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccklpg4_622c698c-073b-423c-a74d-f6e6d2f6a03c/main/0.log"
Apr 23 01:44:13.777337 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:13.777305 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-47bcc_956ecec2-c7b5-4b58-9928-248140788534/storage-initializer/0.log"
Apr 23 01:44:13.783831 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:13.783805 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-47bcc_956ecec2-c7b5-4b58-9928-248140788534/main/0.log"
Apr 23 01:44:20.491806 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:20.491763 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7hf77_4230f793-aa40-4a76-8cb9-6d4d5a0ea43b/global-pull-secret-syncer/0.log"
Apr 23 01:44:20.623996 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:20.623948 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tsrhr_e8cf828b-7774-4149-88cf-e198ac5cf943/konnectivity-agent/0.log"
Apr 23 01:44:20.695716 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:20.695664 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-21.ec2.internal_e27bb1972aa991595ea629c18476d369/haproxy/0.log"
Apr 23 01:44:25.141863 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:25.141824 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-skwpw_e6d04b86-cd57-409b-a905-d5596b672fde/manager/0.log"
Apr 23 01:44:26.761463 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:26.761418 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-55jcs_03acfc21-c63d-4a37-b1b3-f487cc972611/kube-state-metrics/0.log"
Apr 23 01:44:26.784547 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:26.784479 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-55jcs_03acfc21-c63d-4a37-b1b3-f487cc972611/kube-rbac-proxy-main/0.log"
Apr 23 01:44:26.803632 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:26.803609 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-55jcs_03acfc21-c63d-4a37-b1b3-f487cc972611/kube-rbac-proxy-self/0.log"
Apr 23 01:44:26.829053 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:26.829013 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-99b8b9c45-v9cx5_f682a177-73b0-458a-b735-eb6e0efa3d72/metrics-server/0.log"
Apr 23 01:44:26.857161 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:26.857130 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-c72ld_e2094d4c-7dd1-451a-9a12-cc12a7b7d147/monitoring-plugin/0.log"
Apr 23 01:44:26.982970 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:26.982944 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lxvg2_fb4cc752-0a5d-474a-a7a1-1d98c0428cd9/node-exporter/0.log"
Apr 23 01:44:27.010260 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:27.010231 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lxvg2_fb4cc752-0a5d-474a-a7a1-1d98c0428cd9/kube-rbac-proxy/0.log"
Apr 23 01:44:27.032912 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:27.032889 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lxvg2_fb4cc752-0a5d-474a-a7a1-1d98c0428cd9/init-textfile/0.log"
Apr 23 01:44:27.135612 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:27.135512 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-xl42d_0a34f6c3-dd01-4b57-8b91-d896ec90dd46/kube-rbac-proxy-main/0.log"
Apr 23 01:44:27.155102 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:27.155073 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-xl42d_0a34f6c3-dd01-4b57-8b91-d896ec90dd46/kube-rbac-proxy-self/0.log"
Apr 23 01:44:27.175515 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:27.175479 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-xl42d_0a34f6c3-dd01-4b57-8b91-d896ec90dd46/openshift-state-metrics/0.log"
Apr 23 01:44:27.547584 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:27.547559 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57bf8c9cc7-h64hp_6110d3d3-583d-4603-9c3a-98306774315e/thanos-query/0.log"
Apr 23 01:44:27.566483 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:27.566459 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57bf8c9cc7-h64hp_6110d3d3-583d-4603-9c3a-98306774315e/kube-rbac-proxy-web/0.log"
Apr 23 01:44:27.585583 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:27.585558 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57bf8c9cc7-h64hp_6110d3d3-583d-4603-9c3a-98306774315e/kube-rbac-proxy/0.log"
Apr 23 01:44:27.604350 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:27.604322 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57bf8c9cc7-h64hp_6110d3d3-583d-4603-9c3a-98306774315e/prom-label-proxy/0.log"
Apr 23 01:44:27.622742 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:27.622721 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57bf8c9cc7-h64hp_6110d3d3-583d-4603-9c3a-98306774315e/kube-rbac-proxy-rules/0.log"
Apr 23 01:44:27.644463 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:27.644440 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57bf8c9cc7-h64hp_6110d3d3-583d-4603-9c3a-98306774315e/kube-rbac-proxy-metrics/0.log"
Apr 23 01:44:29.369786 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.369748 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"]
Apr 23 01:44:29.373293 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.373276 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.375628 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.375604 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6hxp5\"/\"kube-root-ca.crt\""
Apr 23 01:44:29.375628 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.375625 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6hxp5\"/\"openshift-service-ca.crt\""
Apr 23 01:44:29.375801 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.375642 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6hxp5\"/\"default-dockercfg-rb8bq\""
Apr 23 01:44:29.379785 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.379757 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"]
Apr 23 01:44:29.456797 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.456754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2bcc0e72-6dc2-43ec-a4e7-28eb354605a6-lib-modules\") pod \"perf-node-gather-daemonset-shhch\" (UID: \"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.457011 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.456841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bcc0e72-6dc2-43ec-a4e7-28eb354605a6-sys\") pod \"perf-node-gather-daemonset-shhch\" (UID: \"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.457011 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.456889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2bcc0e72-6dc2-43ec-a4e7-28eb354605a6-podres\") pod \"perf-node-gather-daemonset-shhch\" (UID: \"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.457011 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.456931 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwnp8\" (UniqueName: \"kubernetes.io/projected/2bcc0e72-6dc2-43ec-a4e7-28eb354605a6-kube-api-access-nwnp8\") pod \"perf-node-gather-daemonset-shhch\" (UID: \"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.457011 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.456997 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2bcc0e72-6dc2-43ec-a4e7-28eb354605a6-proc\") pod \"perf-node-gather-daemonset-shhch\" (UID: \"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.558387 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.558350 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bcc0e72-6dc2-43ec-a4e7-28eb354605a6-sys\") pod \"perf-node-gather-daemonset-shhch\" (UID: \"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.558587 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.558411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2bcc0e72-6dc2-43ec-a4e7-28eb354605a6-podres\") pod \"perf-node-gather-daemonset-shhch\" (UID: \"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.558587 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.558436 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwnp8\" (UniqueName: \"kubernetes.io/projected/2bcc0e72-6dc2-43ec-a4e7-28eb354605a6-kube-api-access-nwnp8\") pod \"perf-node-gather-daemonset-shhch\" (UID: \"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.558587 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.558461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2bcc0e72-6dc2-43ec-a4e7-28eb354605a6-proc\") pod \"perf-node-gather-daemonset-shhch\" (UID: \"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.558587 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.558492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2bcc0e72-6dc2-43ec-a4e7-28eb354605a6-lib-modules\") pod \"perf-node-gather-daemonset-shhch\" (UID: \"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.558587 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.558497 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bcc0e72-6dc2-43ec-a4e7-28eb354605a6-sys\") pod \"perf-node-gather-daemonset-shhch\" (UID: \"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.558778 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.558586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2bcc0e72-6dc2-43ec-a4e7-28eb354605a6-proc\") pod \"perf-node-gather-daemonset-shhch\" (UID: \"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.558778 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.558592 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2bcc0e72-6dc2-43ec-a4e7-28eb354605a6-podres\") pod \"perf-node-gather-daemonset-shhch\" (UID: \"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.558778 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.558623 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2bcc0e72-6dc2-43ec-a4e7-28eb354605a6-lib-modules\") pod \"perf-node-gather-daemonset-shhch\" (UID: \"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.567322 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.567285 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwnp8\" (UniqueName: \"kubernetes.io/projected/2bcc0e72-6dc2-43ec-a4e7-28eb354605a6-kube-api-access-nwnp8\") pod \"perf-node-gather-daemonset-shhch\" (UID: \"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.685679 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.685581 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:29.810920 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.810762 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"]
Apr 23 01:44:29.813622 ip-10-0-137-21 kubenswrapper[2576]: W0423 01:44:29.813593 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2bcc0e72_6dc2_43ec_a4e7_28eb354605a6.slice/crio-3bea2556c32e9c7fb2895bd48e7bf613f19881928df4469b64c8e75db087ff59 WatchSource:0}: Error finding container 3bea2556c32e9c7fb2895bd48e7bf613f19881928df4469b64c8e75db087ff59: Status 404 returned error can't find the container with id 3bea2556c32e9c7fb2895bd48e7bf613f19881928df4469b64c8e75db087ff59
Apr 23 01:44:29.815637 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.815614 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 01:44:29.843122 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:29.843091 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch" event={"ID":"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6","Type":"ContainerStarted","Data":"3bea2556c32e9c7fb2895bd48e7bf613f19881928df4469b64c8e75db087ff59"}
Apr 23 01:44:30.848287 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:30.848247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch" event={"ID":"2bcc0e72-6dc2-43ec-a4e7-28eb354605a6","Type":"ContainerStarted","Data":"9049947a1a88746dcdec8e31a61012a551a1d8975929e341096995503eec23ca"}
Apr 23 01:44:30.848687 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:30.848353 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:30.865584 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:30.865534 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch" podStartSLOduration=1.86552067 podStartE2EDuration="1.86552067s" podCreationTimestamp="2026-04-23 01:44:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:44:30.863751444 +0000 UTC m=+2070.038547076" watchObservedRunningTime="2026-04-23 01:44:30.86552067 +0000 UTC m=+2070.040316358"
Apr 23 01:44:31.464587 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:31.464557 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kdm6b_ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d/dns/0.log"
Apr 23 01:44:31.486667 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:31.486642 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kdm6b_ac6cb0d5-6b9b-4b9e-8aa0-141ff0a55a4d/kube-rbac-proxy/0.log"
Apr 23 01:44:31.574687 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:31.574655 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mggnc_13504342-083f-4f36-abd4-a1b6558edb3f/dns-node-resolver/0.log"
Apr 23 01:44:32.089072 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:32.089027 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7nslb_b0608d8b-418d-4240-b84e-dc09071f45b7/node-ca/0.log"
Apr 23 01:44:33.122699 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:33.122661 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-fjhtp_2612f981-16aa-4f30-9b65-9ccad11b8949/discovery/0.log"
Apr 23 01:44:33.172690 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:33.172660 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-8596599875-qbwwc_e3a1be14-d458-4b2d-ab99-162e9f69c4ad/kube-auth-proxy/0.log"
Apr 23 01:44:33.798861 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:33.798819 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kgfvc_16cbb3a1-0ddd-4793-8d37-07bfa2a0568d/serve-healthcheck-canary/0.log"
Apr 23 01:44:34.414916 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:34.414887 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kz8x2_e96552f1-4f6d-44b8-9597-3fe4faed4577/kube-rbac-proxy/0.log"
Apr 23 01:44:34.434238 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:34.434213 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kz8x2_e96552f1-4f6d-44b8-9597-3fe4faed4577/exporter/0.log"
Apr 23 01:44:34.454197 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:34.454168 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kz8x2_e96552f1-4f6d-44b8-9597-3fe4faed4577/extractor/0.log"
Apr 23 01:44:36.324656 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:36.324619 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-ffwjb_601a9f2f-90f1-4817-93d6-7215a72f670e/manager/0.log"
Apr 23 01:44:36.358880 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:36.358845 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6d754fdd4d-jv4nj_f03e77fa-3299-4f51-9eac-ef078bb6754e/maas-api/0.log"
Apr 23 01:44:36.411822 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:36.411789 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7ddc769ff4-gh2mj_b7ec086f-0602-4318-8731-5b090e77f4cb/manager/0.log"
Apr 23 01:44:36.430256 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:36.430222 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-pmk8s_86503cb2-857e-4924-af95-6f47288ea75d/manager/0.log"
Apr 23 01:44:36.442211 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:36.442184 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-pmk8s_86503cb2-857e-4924-af95-6f47288ea75d/manager/1.log"
Apr 23 01:44:36.541182 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:36.541152 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5fb5768b86-sk657_11222ec8-cf6a-464c-933d-8706b7de04c5/manager/0.log"
Apr 23 01:44:36.561349 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:36.561314 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-gh2kz_6881d7a2-e97b-4c63-af58-69de860a39b6/postgres/0.log"
Apr 23 01:44:36.862065 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:36.862039 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-shhch"
Apr 23 01:44:42.174359 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:42.174323 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-7hplg_99c1f9db-3073-4152-9221-8661b8a6f579/migrator/0.log"
Apr 23 01:44:42.193552 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:42.193521 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-7hplg_99c1f9db-3073-4152-9221-8661b8a6f579/graceful-termination/0.log"
Apr 23 01:44:43.824024 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:43.823965 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n9nhd_bf4044d9-01da-465d-a2bf-80556b56473d/kube-multus-additional-cni-plugins/0.log"
Apr 23 01:44:43.843648 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:43.843624 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n9nhd_bf4044d9-01da-465d-a2bf-80556b56473d/egress-router-binary-copy/0.log"
Apr 23 01:44:43.867321 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:43.867290 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n9nhd_bf4044d9-01da-465d-a2bf-80556b56473d/cni-plugins/0.log"
Apr 23 01:44:43.887167 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:43.887139 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n9nhd_bf4044d9-01da-465d-a2bf-80556b56473d/bond-cni-plugin/0.log"
Apr 23 01:44:43.907035 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:43.906974 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n9nhd_bf4044d9-01da-465d-a2bf-80556b56473d/routeoverride-cni/0.log"
Apr 23 01:44:43.926301 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:43.926274 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n9nhd_bf4044d9-01da-465d-a2bf-80556b56473d/whereabouts-cni-bincopy/0.log"
Apr 23 01:44:43.945150 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:43.945121 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n9nhd_bf4044d9-01da-465d-a2bf-80556b56473d/whereabouts-cni/0.log"
Apr 23 01:44:44.008333 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:44.008303 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f5qrx_3fe8791c-b433-4e16-983b-0550aa4d2b4d/kube-multus/0.log"
Apr 23 01:44:44.093190 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:44.093112 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-755gj_4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554/network-metrics-daemon/0.log"
Apr 23 01:44:44.110713 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:44.110687 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-755gj_4fa93c6a-0bcb-4e1d-abf9-ae8f8aa58554/kube-rbac-proxy/0.log"
Apr 23 01:44:45.278199 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:45.278171 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-controller/0.log"
Apr 23 01:44:45.296557 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:45.296535 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/0.log"
Apr 23 01:44:45.306272 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:45.306250 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovn-acl-logging/1.log"
Apr 23 01:44:45.326131 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:45.326103 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/kube-rbac-proxy-node/0.log"
Apr 23 01:44:45.345202 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:45.345174 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 01:44:45.362609 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:45.362579 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/northd/0.log"
Apr 23 01:44:45.384064 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:45.384040 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/nbdb/0.log"
Apr 23 01:44:45.403184 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:45.403150 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/sbdb/0.log"
Apr 23 01:44:45.499791 ip-10-0-137-21 kubenswrapper[2576]: I0423 01:44:45.499761 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rfgs_5c95cbe5-ed9d-499f-b53a-66de0e3475e6/ovnkube-controller/0.log"