Apr 16 18:17:46.099381 ip-10-0-128-74 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:17:46.557652 ip-10-0-128-74 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:46.557652 ip-10-0-128-74 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:17:46.557652 ip-10-0-128-74 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:46.557652 ip-10-0-128-74 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:17:46.557652 ip-10-0-128-74 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:46.559284 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.559194 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:17:46.563514 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563496 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:46.563514 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563513 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:46.563514 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563517 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563521 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563525 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563528 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563531 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563533 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563536 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563539 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563542 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563545 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563552 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563556 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563559 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563562 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563564 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563567 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563571 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563575 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563579 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:46.563610 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563582 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563585 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563588 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563592 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563597 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563601 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563604 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563606 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563609 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563612 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563615 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563618 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563620 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563623 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563626 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563629 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563632 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563634 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563637 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563640 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:46.564076 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563642 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563645 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563647 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563650 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563653 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563655 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563658 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563660 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563664 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563667 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563669 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563671 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563674 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563678 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563686 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563689 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563692 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563694 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563697 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563700 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:46.564580 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563702 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563705 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563708 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563710 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563713 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563715 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563718 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563720 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563723 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563726 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563728 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563731 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563734 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563736 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563739 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563742 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563744 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563747 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563749 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563752 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:46.565091 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563754 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563757 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563760 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563762 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.563765 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564206 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564213 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564216 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564219 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564221 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564224 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564227 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564230 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564232 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564235 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564237 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564258 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564262 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564265 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:46.565574 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564268 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564271 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564274 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564277 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564280 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564287 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564290 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564293 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564296 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564298 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564301 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564304 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564306 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564309 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564312 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564315 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564318 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564320 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564323 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564326 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:46.566069 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564328 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564332 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564336 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564339 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564342 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564345 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564347 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564350 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564352 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564355 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564357 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564360 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564363 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564365 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564369 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564371 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564374 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564376 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564382 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564384 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:46.566553 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564387 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564389 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564392 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564395 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564397 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564400 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564403 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564406 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564409 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564411 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564414 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564416 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564419 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564422 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564424 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564427 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564431 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
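[Note] The five deprecation notices at the top of this unit all point at the same remedy: set these options in the file passed via --config (the FLAG dump below shows --config="/etc/kubernetes/kubelet.conf"). A minimal KubeletConfiguration sketch, assuming the YAML form from the kubernetes.io page cited in the warnings; systemReserved and volumePluginDir values are the ones echoed later in this log, while the eviction threshold and the unix:// scheme on the CRI endpoint are illustrative assumptions, not values taken from this node:

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint; unix:// scheme assumed for the config-file form
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved=cpu=500m,ephemeral-storage=1Gi,memory=1Gi
systemReserved:
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
# --minimum-container-ttl-duration is deprecated in favor of eviction settings;
# this threshold is a placeholder, not a recommendation
evictionHard:
  memory.available: 100Mi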
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564434 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564437 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:46.567103 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564440 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564443 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564447 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564450 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564452 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564455 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564458 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564461 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564463 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564466 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564469 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564471 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.564475 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564543 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564554 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564573 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564580 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564587 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564591 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564598 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564604 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:17:46.567564 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564607 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564610 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564614 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564617 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564621 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564624 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564627 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564630 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564633 2570 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564636 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564639 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564646 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564649 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564652 2570 flags.go:64] FLAG: --config-dir=""
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564655 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564659 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564663 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564666 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564669 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564674 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564677 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564680 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564683 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564686 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564695 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:17:46.568097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564700 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564703 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564706 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564709 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564712 2570 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564716 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564724 2570 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564727 2570 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564730 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564733 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564736 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564740 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564743 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564746 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564749 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564752 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564755 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564760 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564763 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564766 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564769 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564772 2570 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564776 2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564779 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564782 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 18:17:46.568776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564785 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564790 2570 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564793 2570 flags.go:64] FLAG: --help="false"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564796 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-128-74.ec2.internal"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564799 2570 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564802 2570 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564805 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564809 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564813 2570 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564816 2570 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564819 2570 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564822 2570 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564826 2570 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564829 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564832 2570 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564835 2570 flags.go:64] FLAG: --kube-reserved=""
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564838 2570 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564841 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564844 2570 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564847 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564850 2570 flags.go:64] FLAG: --lock-file=""
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564852 2570 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564855 2570 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564859 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 18:17:46.569405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564864 2570 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564867 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564870 2570 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564873 2570 flags.go:64] FLAG: --logging-format="text"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564876 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564880 2570 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564882 2570 flags.go:64] FLAG: --manifest-url=""
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564885 2570 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564890 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564894 2570 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564898 2570 flags.go:64] FLAG: --max-pods="110"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564901 2570 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564904 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564907 2570 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564910 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564913 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564917 2570 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564920 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564928 2570 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564931 2570 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564936 2570 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564939 2570 flags.go:64] FLAG: --pod-cidr=""
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564942 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 18:17:46.569970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564948 2570 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564951 2570 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564954 2570 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564957 2570 flags.go:64] FLAG: --port="10250"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564961 2570 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564963 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01d5c1c5fe0c00a27"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564967 2570 flags.go:64] FLAG: --qos-reserved=""
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564970 2570 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564973 2570 flags.go:64] FLAG: --register-node="true"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564976 2570 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564979 2570 flags.go:64] FLAG: --register-with-taints=""
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564986 2570 flags.go:64] FLAG: --registry-burst="10"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564989 2570 flags.go:64] FLAG: --registry-qps="5"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564992 2570 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564995 2570 flags.go:64] FLAG: --reserved-memory=""
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.564998 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565001 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565004 2570 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565009 2570 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565012 2570 flags.go:64] FLAG: --runonce="false"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565015 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565018 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565021 2570 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565024 2570 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565027 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565030 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 18:17:46.570525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565038 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565041 2570 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565044 2570 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565047 2570 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565064 2570 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565069 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565074 2570 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565079 2570 flags.go:64] FLAG: --system-cgroups=""
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565082 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565087 2570 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565090 2570 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565094 2570 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565101 2570 flags.go:64] FLAG: --tls-min-version=""
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565104 2570 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565107 2570 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565109 2570 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565112 2570 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565116 2570 flags.go:64] FLAG: --v="2"
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565120 2570 flags.go:64] FLAG: --version="false"
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565128 2570 flags.go:64] FLAG: --vmodule=""
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565133 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565136 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565235 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565239 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:46.571179 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565244 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565249 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
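[Note] Because the kubelet runs with --v="2", flags.go:64 echoes every flag with its parsed value at startup, so the dump above is a record of what the systemd unit actually passed on the command line; settings that live only in /etc/kubernetes/kubelet.conf are merged after flag parsing and should not be read out of this dump. As a sketch, the config-file equivalents of a few of the echoed flags (same assumed KubeletConfiguration YAML form as above; values copied from the dump, chosen only to illustrate the flag-to-field mapping):

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
maxPods: 110                    # FLAG: --max-pods="110"
containerLogMaxFiles: 5         # FLAG: --container-log-max-files="5"
containerLogMaxSize: 10Mi       # FLAG: --container-log-max-size="10Mi"
topologyManagerScope: container # FLAG: --topology-manager-scope="container"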
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565253 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565256 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565259 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565262 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565265 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565268 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565271 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565274 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565276 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565279 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565283 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565286 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565289 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565292 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565294 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565297 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565299 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565302 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:46.571772 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565304 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565307 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565310 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565312 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565315 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565317 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565320 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565322 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565325 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565328 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565330 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565333 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565337 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565340 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565344 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565347 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565350 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565353 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565356 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:46.572615 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565359 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565361 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565364 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565367 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565370 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565373 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565376 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565379 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565381 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565384 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565386 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565389 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565392 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565394 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565396 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565399 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565401 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565404 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565406 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:46.573141 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565409 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565411 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565414 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565417 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565419 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565421 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565425 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565428 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565431 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565433 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565436 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565438 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565441 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565444 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565446 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565449 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565452 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565455 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565459 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:46.573619 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565462 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:46.574129 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565464 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:46.574129 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565467 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:46.574129 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565469 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:46.574129 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565472 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:46.574129 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565475 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:46.574129 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.565477 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:46.574129 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.565488 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:46.574129 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.574111 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:17:46.574129 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.574131 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574183 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574188 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:46.574344 ip-10-0-128-74
kubenswrapper[2570]: W0416 18:17:46.574191 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574195 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574198 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574201 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574204 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574207 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574210 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574213 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574216 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574218 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574221 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574224 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574227 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574229 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574232 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574235 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574238 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:17:46.574344 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574241 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574243 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574246 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574249 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574252 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574254 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 
16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574257 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574259 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574262 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574265 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574267 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574270 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574273 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574276 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574279 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574282 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574284 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574287 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574290 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:17:46.574847 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574293 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574295 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574298 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574300 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574303 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574306 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574308 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574311 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574314 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:17:46.575333 ip-10-0-128-74 
kubenswrapper[2570]: W0416 18:17:46.574317 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574319 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574322 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574325 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574327 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574330 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574332 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574335 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574337 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574340 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574342 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:17:46.575333 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574345 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574348 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574351 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574354 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574357 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574359 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574362 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574365 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574368 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574371 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574373 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574376 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:17:46.575808 
ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574379 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574382 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574384 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574387 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574390 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574392 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574395 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:17:46.575808 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574400 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:17:46.576322 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574404 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:17:46.576322 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574408 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:17:46.576322 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574411 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:17:46.576322 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574414 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:17:46.576322 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:46.574418 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:17:46.576322 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.574434 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:46.578637 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.574805 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:46.578637 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.575587 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:17:46.578637 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.577654 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:17:46.578851 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.578837 2570 server.go:1019] "Starting client certificate rotation"
Apr 16 18:17:46.578957 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.578939 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:17:46.578988 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.578981 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:17:46.605242 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.605219 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:17:46.607604 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.607582 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:17:46.624299 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.624276 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:17:46.630095 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.630078 2570 log.go:25] "Validated CRI v1 image API"
Apr 16 18:17:46.631380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.631354 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:17:46.641631 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.641610 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:17:46.643629 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.643597 2570 fs.go:135] Filesystem UUIDs: map[22e6f323-a93b-42a7-92fd-c1c743bbb3cd:/dev/nvme0n1p4 6122bdda-6783-4bf7-a9a1-3dd1c4db27cf:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 18:17:46.643697 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.643628 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:17:46.649728 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.649609 2570 manager.go:217] Machine: {Timestamp:2026-04-16 18:17:46.647600839 +0000 UTC m=+0.427065162 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3110177 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec276915ef6eb546fbb0bb0a147d6582 SystemUUID:ec276915-ef6e-b546-fbb0-bb0a147d6582 BootID:b076e24f-3139-4387-b597-d2e5aac8e95c Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:fa:f4:b6:cc:5d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:fa:f4:b6:cc:5d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c6:b1:95:4c:5c:9d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:17:46.649728 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.649722 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:17:46.649831 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.649811 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:17:46.650162 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.650141 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:17:46.650317 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.650164 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-74.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:17:46.650363 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.650326 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:17:46.650363 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.650335 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:17:46.650363 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.650348 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:17:46.650363 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.650361 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:17:46.652230 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.652218 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:17:46.652335 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.652325 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:17:46.654866 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.654855 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:17:46.654904 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.654869 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:17:46.654904 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.654881 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:17:46.654904 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.654890 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:17:46.654904 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.654900 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:17:46.656193 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.656180 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:17:46.656249 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.656199 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:17:46.659486 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.659462 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 18:17:46.661845 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.661831 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 18:17:46.663271 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.663257 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 18:17:46.663271 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.663273 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 18:17:46.663381 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.663279 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 18:17:46.663381 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.663284 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 18:17:46.663381 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.663290 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 18:17:46.663381 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.663297 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 18:17:46.663381 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.663304 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 18:17:46.663381 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.663309 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 18:17:46.663381 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.663316 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 18:17:46.663381 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.663322 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 18:17:46.663381 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.663333 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 18:17:46.663381 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.663342 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 18:17:46.663848 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.663832 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t4j86"
Apr 16 18:17:46.664184 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.664174 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 18:17:46.664225 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.664186 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 18:17:46.665074 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:46.665039 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 18:17:46.665136 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:46.665079 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-74.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 18:17:46.667556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.667541 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-74.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:17:46.667838 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.667826 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 18:17:46.667883 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.667863 2570 server.go:1295] "Started kubelet"
Apr 16 18:17:46.667959 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.667934 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 18:17:46.668007 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.667966 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 18:17:46.668040 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.668031 2570 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 18:17:46.668721 ip-10-0-128-74 systemd[1]: Started Kubernetes Kubelet.
Apr 16 18:17:46.670461 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.670449 2570 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 18:17:46.671966 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.671947 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 18:17:46.672454 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.672438 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t4j86"
Apr 16 18:17:46.675732 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:46.674710 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-74.ec2.internal.18a6e9294eb7cf5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-74.ec2.internal,UID:ip-10-0-128-74.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-74.ec2.internal,},FirstTimestamp:2026-04-16 18:17:46.667839325 +0000 UTC m=+0.447303649,LastTimestamp:2026-04-16 18:17:46.667839325 +0000 UTC m=+0.447303649,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-74.ec2.internal,}"
Apr 16 18:17:46.678104 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:46.678085 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 18:17:46.678367 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.678353 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 18:17:46.678866 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.678850 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 18:17:46.681348 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.680697 2570 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 18:17:46.681348 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.680850 2570 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 18:17:46.681348 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.681221 2570 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 18:17:46.681348 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.681230 2570 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 18:17:46.681348 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:46.681239 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:46.681632 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.681420 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 18:17:46.682293 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.681859 2570 factory.go:55] Registering systemd factory
Apr 16 18:17:46.682293 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.681884 2570 factory.go:223] Registration of the systemd container factory successfully
Apr 16 18:17:46.682293 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.682123 2570 factory.go:153] Registering CRI-O factory
Apr 16 18:17:46.682293 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.682139 2570 factory.go:223] Registration of the crio container factory successfully
Apr 16 18:17:46.682293 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.682200 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 18:17:46.682293 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.682224 2570 factory.go:103] Registering Raw factory
Apr 16 18:17:46.682293 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.682246 2570 manager.go:1196] Started watching for new ooms in manager
Apr 16 18:17:46.682707 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.682693 2570 manager.go:319] Starting recovery of all containers
Apr 16 18:17:46.683945 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.683924 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:46.688724 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:46.688561 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-74.ec2.internal\" not found" node="ip-10-0-128-74.ec2.internal"
Apr 16 18:17:46.693929 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.693911 2570 manager.go:324] Recovery completed
Apr 16 18:17:46.698345 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.698331 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:46.701220 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.701202 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:46.701312 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.701244 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:46.701312 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.701255 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:46.701787 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.701774 2570 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 18:17:46.701787 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.701786 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 18:17:46.701877 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.701802 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:17:46.703885 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.703872 2570 policy_none.go:49] "None policy: Start"
Apr 16 18:17:46.703961 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.703888 2570 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 18:17:46.703961 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.703898 2570 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 18:17:46.754417 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.742534 2570 manager.go:341] "Starting Device Plugin manager"
Apr 16 18:17:46.754417 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:46.742573 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 18:17:46.754417 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.742584 2570 server.go:85] "Starting device plugin registration server"
Apr 16 18:17:46.754417 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.742859 2570 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 18:17:46.754417 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.742869 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 18:17:46.754417 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.742967 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 18:17:46.754417 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.743080 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 18:17:46.754417 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.743116 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 18:17:46.754417 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:46.743715 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 18:17:46.754417 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:46.743764 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:46.832209 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.832120 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 18:17:46.833359 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.833343 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 18:17:46.833420 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.833369 2570 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 18:17:46.833420 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.833413 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 18:17:46.833499 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.833421 2570 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 18:17:46.833499 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:46.833461 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 18:17:46.837616 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.837595 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:46.843676 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.843663 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:46.844721 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.844708 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:46.844777 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.844736 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:46.844777 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.844748 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:46.844777 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.844769 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-74.ec2.internal"
Apr 16 18:17:46.853131 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.853117 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-74.ec2.internal"
Apr 16 18:17:46.853195 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:46.853141 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-74.ec2.internal\": node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:46.864817 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:46.864797 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:46.934084 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.934024 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-74.ec2.internal"]
Apr 16 18:17:46.934202 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.934135 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:46.935095 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.935079 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:46.935194 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.935111 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:46.935194 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.935124 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:46.936792 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.936777 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:46.936987 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.936971 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal"
Apr 16 18:17:46.937041 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.937001 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:46.937613 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.937594 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:46.937700 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.937621 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:46.937700 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.937632 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:46.937700 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.937597 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:46.937700 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.937699 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:46.937823 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.937710 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:46.939497 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.939481 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-74.ec2.internal"
Apr 16 18:17:46.939578 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.939519 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:46.940220 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.940206 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:46.940315 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.940236 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:46.940315 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.940249 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:46.958732 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:46.958709 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-74.ec2.internal\" not found" node="ip-10-0-128-74.ec2.internal"
Apr 16 18:17:46.964403 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:46.964381 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-74.ec2.internal\" not found" node="ip-10-0-128-74.ec2.internal"
Apr 16 18:17:46.965272 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:46.965247 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:46.983257 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.983231 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d8fd7e10da2386e75452a7c51276b1d0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal\" (UID: \"d8fd7e10da2386e75452a7c51276b1d0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal"
Apr 16 18:17:46.983369 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.983258 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d8fd7e10da2386e75452a7c51276b1d0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal\" (UID: \"d8fd7e10da2386e75452a7c51276b1d0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal"
Apr 16 18:17:46.983369 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:46.983281 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4d5b4fccf018b349362d7b27ad7bd6e5-config\") pod \"kube-apiserver-proxy-ip-10-0-128-74.ec2.internal\" (UID: \"4d5b4fccf018b349362d7b27ad7bd6e5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-74.ec2.internal"
Apr 16 18:17:47.065637 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:47.065601 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:47.083617 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.083557 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d8fd7e10da2386e75452a7c51276b1d0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal\" (UID: \"d8fd7e10da2386e75452a7c51276b1d0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal"
Apr 16 18:17:47.083617 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.083587 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d8fd7e10da2386e75452a7c51276b1d0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal\" (UID: \"d8fd7e10da2386e75452a7c51276b1d0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal"
Apr 16 18:17:47.083617 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.083606 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4d5b4fccf018b349362d7b27ad7bd6e5-config\") pod \"kube-apiserver-proxy-ip-10-0-128-74.ec2.internal\" (UID: \"4d5b4fccf018b349362d7b27ad7bd6e5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-74.ec2.internal"
Apr 16 18:17:47.083752 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.083680 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4d5b4fccf018b349362d7b27ad7bd6e5-config\") pod \"kube-apiserver-proxy-ip-10-0-128-74.ec2.internal\" (UID: \"4d5b4fccf018b349362d7b27ad7bd6e5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-74.ec2.internal"
Apr 16 18:17:47.083752 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.083678 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d8fd7e10da2386e75452a7c51276b1d0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal\" (UID: \"d8fd7e10da2386e75452a7c51276b1d0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal"
Apr 16 18:17:47.083752 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.083679 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d8fd7e10da2386e75452a7c51276b1d0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal\" (UID: \"d8fd7e10da2386e75452a7c51276b1d0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal"
Apr 16 18:17:47.165714 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:47.165668 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:47.263350 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.263306 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal"
Apr 16 18:17:47.266292 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:47.266274 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:47.267389 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.267372 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-74.ec2.internal"
Apr 16 18:17:47.367152 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:47.367046 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:47.467617 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:47.467582 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:47.568208 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:47.568166 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:47.578339 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.578311 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:17:47.578470 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.578453 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:17:47.578509 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.578479 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:17:47.668319 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:47.668287 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:47.674196 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.674163 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:12:46 +0000 UTC" deadline="2027-09-22 22:24:43.258782354 +0000 UTC"
Apr 16 18:17:47.674196 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.674192 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12580h6m55.584593059s"
Apr 16 18:17:47.679128 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.679111 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:17:47.692891 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.692870 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:17:47.744030 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.744002 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-kmchv"
Apr 16 18:17:47.750119 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.750098 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-kmchv"
Apr 16 18:17:47.754684 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:47.754658 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d5b4fccf018b349362d7b27ad7bd6e5.slice/crio-cefe25ae52cf4f729cd1d20aaa7cd0059e2ee7417c6ede9ab37d5dd959ee5778 WatchSource:0}: Error finding container cefe25ae52cf4f729cd1d20aaa7cd0059e2ee7417c6ede9ab37d5dd959ee5778: Status 404 returned error can't find the container with id cefe25ae52cf4f729cd1d20aaa7cd0059e2ee7417c6ede9ab37d5dd959ee5778
Apr 16 18:17:47.755047 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:47.755029 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8fd7e10da2386e75452a7c51276b1d0.slice/crio-8a2ab1bce79077de763a45bcb3ccc5d674a38cd367ebe7a991992af13d0f4c0d WatchSource:0}: Error finding container 8a2ab1bce79077de763a45bcb3ccc5d674a38cd367ebe7a991992af13d0f4c0d: Status 404 returned error can't find the container with id 8a2ab1bce79077de763a45bcb3ccc5d674a38cd367ebe7a991992af13d0f4c0d
Apr 16 18:17:47.759469 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.759456 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:17:47.769234 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:47.769217 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:47.836890 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.836835 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal" event={"ID":"d8fd7e10da2386e75452a7c51276b1d0","Type":"ContainerStarted","Data":"8a2ab1bce79077de763a45bcb3ccc5d674a38cd367ebe7a991992af13d0f4c0d"}
Apr 16 18:17:47.837689 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.837669 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-74.ec2.internal" event={"ID":"4d5b4fccf018b349362d7b27ad7bd6e5","Type":"ContainerStarted","Data":"cefe25ae52cf4f729cd1d20aaa7cd0059e2ee7417c6ede9ab37d5dd959ee5778"}
Apr 16 18:17:47.869832 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:47.869808 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:47.944266 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:47.944209 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:47.970411 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:47.970385 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:48.071000 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:48.070967 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:48.171894 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:48.171860 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-74.ec2.internal\" not found"
Apr 16 18:17:48.212350 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.212276 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:48.279719 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.279691 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal"
Apr 16 18:17:48.291266 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.291239 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:17:48.292302 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.292279 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-74.ec2.internal"
Apr 16 18:17:48.302009 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.301984 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:17:48.527428 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.527389 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:48.656047 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.656016 2570 apiserver.go:52] "Watching apiserver"
Apr 16 18:17:48.665341 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.665320 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:17:48.665724 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.665699 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6cmxs","openshift-multus/network-metrics-daemon-dvxrp","openshift-cluster-node-tuning-operator/tuned-jrf5b","openshift-dns/node-resolver-2h4fb","openshift-image-registry/node-ca-s2j9l","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal","openshift-multus/multus-additional-cni-plugins-qzpw4","openshift-network-diagnostics/network-check-target-m54zx","openshift-network-operator/iptables-alerter-vchm9","openshift-ovn-kubernetes/ovnkube-node-zps8z","kube-system/konnectivity-agent-rlk99","kube-system/kube-apiserver-proxy-ip-10-0-128-74.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl"]
Apr 16 18:17:48.667562 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.667546 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:48.668194 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.668167 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qzpw4"
Apr 16 18:17:48.669573 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.669553 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp"
Apr 16 18:17:48.669670 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:48.669645 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e"
Apr 16 18:17:48.670707 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.670680 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:17:48.670914 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.670897 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 18:17:48.671106 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.671086 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 18:17:48.671204 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.671111 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:17:48.671204 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.671128 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:17:48.671330 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.671280 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-cshbk\""
Apr 16 18:17:48.672134 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.672114 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.673383 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.673365 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2h4fb"
Apr 16 18:17:48.675742 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.674611 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:17:48.675742 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.674676 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:17:48.675742 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.674775 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-j2zj2\""
Apr 16 18:17:48.676156 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.676132 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qhm2v\""
Apr 16 18:17:48.676504 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.676391 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:17:48.676621 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.676604 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:17:48.677043 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.677018 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.678683 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.678661 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-s2j9l"
Apr 16 18:17:48.679205 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.679187 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx"
Apr 16 18:17:48.679318 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:48.679275 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1"
Apr 16 18:17:48.679399 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.679351 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-rf6nn\""
Apr 16 18:17:48.679627 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.679607 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:17:48.680900 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.680823 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vchm9"
Apr 16 18:17:48.680992 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.680903 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 18:17:48.681449 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.681432 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5n5rs\""
Apr 16 18:17:48.681559 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.681436 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 18:17:48.681777 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.681761 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 18:17:48.682521 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.682501 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.683184 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.683164 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 18:17:48.683295 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.683172 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:17:48.683381 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.683254 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 18:17:48.683459 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.683268 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-v79tc\""
Apr 16 18:17:48.683795 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.683772 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rlk99"
Apr 16 18:17:48.684796 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.684777 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:17:48.685149 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.685123 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl"
Apr 16 18:17:48.685264 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.685212 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:17:48.685322 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.685264 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:17:48.685322 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.685212 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:17:48.686167 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.686147 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:17:48.686403 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.686384 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 18:17:48.686496 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.686449 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:17:48.686613 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.686494 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gjs6b\""
Apr 16 18:17:48.686613 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.686453 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:17:48.686613 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.686588 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-ql4vk\""
Apr 16 18:17:48.687426 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.687399 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 18:17:48.687660 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.687642 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 18:17:48.687735 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.687688 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:17:48.687735 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.687705 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7w8xx\""
Apr 16 18:17:48.692924 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.692902 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-run-ovn\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.693026 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.692948 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4w2p\" (UniqueName: \"kubernetes.io/projected/a9622aca-ffc8-4b50-82e0-a1c82e6222df-kube-api-access-k4w2p\") pod \"node-ca-s2j9l\" (UID: \"a9622aca-ffc8-4b50-82e0-a1c82e6222df\") " pod="openshift-image-registry/node-ca-s2j9l"
Apr 16 18:17:48.693026 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.692977 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-var-lib-kubelet\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.693026 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693005 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-run\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.693202 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693031 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-tuned\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.693202 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693071 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4735317d-b557-4ca9-84cd-02f72096e33a-tmp-dir\") pod \"node-resolver-2h4fb\" (UID: \"4735317d-b557-4ca9-84cd-02f72096e33a\") " pod="openshift-dns/node-resolver-2h4fb"
Apr 16 18:17:48.693202 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693116 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-run-openvswitch\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.693202 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693145 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-env-overrides\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.693202 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693181 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-tmp\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.693533 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693208 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx8mr\" (UniqueName: \"kubernetes.io/projected/4735317d-b557-4ca9-84cd-02f72096e33a-kube-api-access-wx8mr\") pod \"node-resolver-2h4fb\" (UID: \"4735317d-b557-4ca9-84cd-02f72096e33a\") " pod="openshift-dns/node-resolver-2h4fb"
Apr 16 18:17:48.693533 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693236 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a9622aca-ffc8-4b50-82e0-a1c82e6222df-serviceca\") pod \"node-ca-s2j9l\" (UID: \"a9622aca-ffc8-4b50-82e0-a1c82e6222df\") " pod="openshift-image-registry/node-ca-s2j9l"
Apr 16 18:17:48.693533 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693262 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-var-lib-cni-bin\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.693533 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693302 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a3821f1e-3cf4-4526-9175-97c1251899f2-cni-binary-copy\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4"
Apr 16 18:17:48.693533 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693322 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-systemd\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.693533 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693342 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-system-cni-dir\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.693533 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693365 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-sysctl-d\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.693533 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693406 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-run-netns\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.693533 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693443 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-var-lib-openvswitch\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.693533 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693480 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-cni-bin\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.693533 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693504 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-ovn-node-metrics-cert\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.694002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693545 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-var-lib-cni-multus\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.694002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693575 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-multus-conf-dir\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.694002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693605 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsd49\" (UniqueName: \"kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49\") pod \"network-check-target-m54zx\" (UID: \"ce22102c-2dd2-4a4f-8317-5733e81186d1\") " pod="openshift-network-diagnostics/network-check-target-m54zx"
Apr 16 18:17:48.694002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693631 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a3821f1e-3cf4-4526-9175-97c1251899f2-cnibin\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4"
Apr 16 18:17:48.694002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693647 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a3821f1e-3cf4-4526-9175-97c1251899f2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4"
Apr 16 18:17:48.694002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693664 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a3821f1e-3cf4-4526-9175-97c1251899f2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4"
Apr 16 18:17:48.694002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693683 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-var-lib-kubelet\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.694002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693705 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-etc-openvswitch\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.694002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693728 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-log-socket\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.694002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693767 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-systemd-units\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.694002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693839 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-cnibin\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.694002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693870 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1b45981e-9576-4b1b-b941-35f68d109c84-cni-binary-copy\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.694002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693953 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a3821f1e-3cf4-4526-9175-97c1251899f2-os-release\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4"
Apr 16 18:17:48.694002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.693988 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bfk4\" (UniqueName: \"kubernetes.io/projected/a3821f1e-3cf4-4526-9175-97c1251899f2-kube-api-access-9bfk4\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694018 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-run-systemd\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694042 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v48lm\" (UniqueName: \"kubernetes.io/projected/1b45981e-9576-4b1b-b941-35f68d109c84-kube-api-access-v48lm\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694082 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-run-k8s-cni-cncf-io\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694106 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-run-netns\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694159 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-sysconfig\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694228 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694283 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-hostroot\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694319 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1b45981e-9576-4b1b-b941-35f68d109c84-multus-daemon-config\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694346 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77rpm\" (UniqueName: \"kubernetes.io/projected/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-kube-api-access-77rpm\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694377 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbprn\" (UniqueName: \"kubernetes.io/projected/1f6f728e-b45e-456c-8a7f-87bf91ba5c03-kube-api-access-zbprn\") pod \"iptables-alerter-vchm9\" (UID: \"1f6f728e-b45e-456c-8a7f-87bf91ba5c03\") " pod="openshift-network-operator/iptables-alerter-vchm9"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694438 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a3821f1e-3cf4-4526-9175-97c1251899f2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694471 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-modprobe-d\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694509 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-node-log\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694543 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3821f1e-3cf4-4526-9175-97c1251899f2-system-cni-dir\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694578 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-lib-modules\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.694686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694601 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-run-ovn-kubernetes\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694625 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-ovnkube-config\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694661 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1f6f728e-b45e-456c-8a7f-87bf91ba5c03-host-slash\") pod \"iptables-alerter-vchm9\" (UID: \"1f6f728e-b45e-456c-8a7f-87bf91ba5c03\") " pod="openshift-network-operator/iptables-alerter-vchm9"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694685 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs\") pod \"network-metrics-daemon-dvxrp\" (UID: \"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e\") " pod="openshift-multus/network-metrics-daemon-dvxrp"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694730 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-os-release\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694760 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-multus-socket-dir-parent\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694827 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-etc-kubernetes\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694871 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-host\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694898 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1f6f728e-b45e-456c-8a7f-87bf91ba5c03-iptables-alerter-script\") pod \"iptables-alerter-vchm9\" (UID: \"1f6f728e-b45e-456c-8a7f-87bf91ba5c03\") " pod="openshift-network-operator/iptables-alerter-vchm9"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694922 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-sysctl-conf\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694944 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-kubelet\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694967 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-cni-netd\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.694996 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-ovnkube-script-lib\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.695019 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-kubernetes\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.695044 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22n72\" (UniqueName: \"kubernetes.io/projected/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-kube-api-access-22n72\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.695100 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47qmp\" (UniqueName: \"kubernetes.io/projected/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-kube-api-access-47qmp\") pod \"network-metrics-daemon-dvxrp\" (UID: \"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e\") " pod="openshift-multus/network-metrics-daemon-dvxrp"
Apr 16 18:17:48.695380 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.695135 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9622aca-ffc8-4b50-82e0-a1c82e6222df-host\") pod \"node-ca-s2j9l\" (UID: \"a9622aca-ffc8-4b50-82e0-a1c82e6222df\") " pod="openshift-image-registry/node-ca-s2j9l"
Apr 16 18:17:48.695993 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.695161 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-multus-cni-dir\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.695993 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.695186 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-run-multus-certs\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.695993 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.695208 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-sys\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.695993 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.695234 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4735317d-b557-4ca9-84cd-02f72096e33a-hosts-file\") pod \"node-resolver-2h4fb\" (UID: \"4735317d-b557-4ca9-84cd-02f72096e33a\") " pod="openshift-dns/node-resolver-2h4fb"
Apr 16 18:17:48.695993 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.695256 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-slash\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.750790 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.750757 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:12:47 +0000 UTC" deadline="2028-01-15 12:02:17.761020314 +0000 UTC"
Apr 16 18:17:48.750790 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.750789 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15329h44m29.010235767s"
Apr 16 18:17:48.783163 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.783078 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 18:17:48.796172 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796142 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-run-k8s-cni-cncf-io\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.796345 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796179 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-run-netns\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.796345 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796205 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-sysconfig\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.796345 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796232 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName:
\"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.796345 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796248 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-run-k8s-cni-cncf-io\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.796345 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796278 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.796345 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796302 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-run-netns\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.796345 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796322 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-sysconfig\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.796635 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796330 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-hostroot\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.796635 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796359 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-hostroot\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.796635 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796386 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1b45981e-9576-4b1b-b941-35f68d109c84-multus-daemon-config\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.796635 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796416 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77rpm\" (UniqueName: \"kubernetes.io/projected/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-kube-api-access-77rpm\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.796635 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796442 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zbprn\" (UniqueName: \"kubernetes.io/projected/1f6f728e-b45e-456c-8a7f-87bf91ba5c03-kube-api-access-zbprn\") pod \"iptables-alerter-vchm9\" (UID: \"1f6f728e-b45e-456c-8a7f-87bf91ba5c03\") " pod="openshift-network-operator/iptables-alerter-vchm9" Apr 16 18:17:48.796635 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796470 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a3821f1e-3cf4-4526-9175-97c1251899f2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4" Apr 16 18:17:48.796635 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796488 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-modprobe-d\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.796635 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796512 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-node-log\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.796635 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796540 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0aa834c9-7b5e-44dc-a706-cf8d7ff11391-konnectivity-ca\") pod \"konnectivity-agent-rlk99\" (UID: \"0aa834c9-7b5e-44dc-a706-cf8d7ff11391\") " pod="kube-system/konnectivity-agent-rlk99" Apr 16 18:17:48.796635 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796562 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.796635 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796588 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3821f1e-3cf4-4526-9175-97c1251899f2-system-cni-dir\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4" Apr 16 18:17:48.796635 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796608 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-lib-modules\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.796635 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796628 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-run-ovn-kubernetes\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.796635 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796629 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-node-log\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796649 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-ovnkube-config\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796674 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-device-dir\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796679 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-modprobe-d\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796698 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1f6f728e-b45e-456c-8a7f-87bf91ba5c03-host-slash\") pod \"iptables-alerter-vchm9\" (UID: \"1f6f728e-b45e-456c-8a7f-87bf91ba5c03\") " pod="openshift-network-operator/iptables-alerter-vchm9" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796711 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-run-ovn-kubernetes\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796722 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs\") pod \"network-metrics-daemon-dvxrp\" (UID: \"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e\") " pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796752 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1f6f728e-b45e-456c-8a7f-87bf91ba5c03-host-slash\") pod \"iptables-alerter-vchm9\" (UID: \"1f6f728e-b45e-456c-8a7f-87bf91ba5c03\") " pod="openshift-network-operator/iptables-alerter-vchm9" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796759 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-lib-modules\") pod 
\"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796774 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-os-release\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796801 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3821f1e-3cf4-4526-9175-97c1251899f2-system-cni-dir\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:48.796823 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796808 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-multus-socket-dir-parent\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796863 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-multus-socket-dir-parent\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796868 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-os-release\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796869 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-etc-kubernetes\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796903 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-etc-kubernetes\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.797262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796910 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-host\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:48.796944 2570 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs podName:edeb92c2-9fa4-40ae-bb1a-a24372d25c5e nodeName:}" failed. No retries permitted until 2026-04-16 18:17:49.296922161 +0000 UTC m=+3.076386473 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs") pod "network-metrics-daemon-dvxrp" (UID: "edeb92c2-9fa4-40ae-bb1a-a24372d25c5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796959 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-host\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796964 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1f6f728e-b45e-456c-8a7f-87bf91ba5c03-iptables-alerter-script\") pod \"iptables-alerter-vchm9\" (UID: \"1f6f728e-b45e-456c-8a7f-87bf91ba5c03\") " pod="openshift-network-operator/iptables-alerter-vchm9" Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.796989 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-sysctl-conf\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797013 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-kubelet\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797035 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-cni-netd\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797094 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-cni-netd\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797098 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-kubelet\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797135 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-ovnkube-script-lib\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797165 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-kubernetes\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797170 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a3821f1e-3cf4-4526-9175-97c1251899f2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4" Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797185 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-sysctl-conf\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797193 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22n72\" (UniqueName: \"kubernetes.io/projected/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-kube-api-access-22n72\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797233 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-socket-dir\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797238 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-ovnkube-config\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.798019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797260 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-registration-dir\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797278 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-kubernetes\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " 
pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797288 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-sys-fs\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797318 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47qmp\" (UniqueName: \"kubernetes.io/projected/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-kube-api-access-47qmp\") pod \"network-metrics-daemon-dvxrp\" (UID: \"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e\") " pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797401 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9622aca-ffc8-4b50-82e0-a1c82e6222df-host\") pod \"node-ca-s2j9l\" (UID: \"a9622aca-ffc8-4b50-82e0-a1c82e6222df\") " pod="openshift-image-registry/node-ca-s2j9l" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797455 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-multus-cni-dir\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797490 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-run-multus-certs\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797514 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1f6f728e-b45e-456c-8a7f-87bf91ba5c03-iptables-alerter-script\") pod \"iptables-alerter-vchm9\" (UID: \"1f6f728e-b45e-456c-8a7f-87bf91ba5c03\") " pod="openshift-network-operator/iptables-alerter-vchm9" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797537 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-sys\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797553 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-multus-cni-dir\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797560 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-run-multus-certs\") pod \"multus-6cmxs\" (UID: 
\"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797574 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9622aca-ffc8-4b50-82e0-a1c82e6222df-host\") pod \"node-ca-s2j9l\" (UID: \"a9622aca-ffc8-4b50-82e0-a1c82e6222df\") " pod="openshift-image-registry/node-ca-s2j9l" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797609 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4735317d-b557-4ca9-84cd-02f72096e33a-hosts-file\") pod \"node-resolver-2h4fb\" (UID: \"4735317d-b557-4ca9-84cd-02f72096e33a\") " pod="openshift-dns/node-resolver-2h4fb" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797626 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-slash\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797634 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-ovnkube-script-lib\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797644 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-run-ovn\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797669 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-sys\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797671 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-slash\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.798747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797695 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4w2p\" (UniqueName: \"kubernetes.io/projected/a9622aca-ffc8-4b50-82e0-a1c82e6222df-kube-api-access-k4w2p\") pod \"node-ca-s2j9l\" (UID: \"a9622aca-ffc8-4b50-82e0-a1c82e6222df\") " pod="openshift-image-registry/node-ca-s2j9l" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797731 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-run-ovn\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797760 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-var-lib-kubelet\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797761 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4735317d-b557-4ca9-84cd-02f72096e33a-hosts-file\") pod \"node-resolver-2h4fb\" (UID: \"4735317d-b557-4ca9-84cd-02f72096e33a\") " pod="openshift-dns/node-resolver-2h4fb" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797788 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-run\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797810 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1b45981e-9576-4b1b-b941-35f68d109c84-multus-daemon-config\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797818 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-tuned\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797812 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-var-lib-kubelet\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797839 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-run\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797851 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4735317d-b557-4ca9-84cd-02f72096e33a-tmp-dir\") pod \"node-resolver-2h4fb\" (UID: \"4735317d-b557-4ca9-84cd-02f72096e33a\") " pod="openshift-dns/node-resolver-2h4fb" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797871 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-run-openvswitch\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.799556 
ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797903 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-run-openvswitch\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797928 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-env-overrides\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797959 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpg5p\" (UniqueName: \"kubernetes.io/projected/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-kube-api-access-qpg5p\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.797984 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-tmp\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798009 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wx8mr\" (UniqueName: \"kubernetes.io/projected/4735317d-b557-4ca9-84cd-02f72096e33a-kube-api-access-wx8mr\") pod \"node-resolver-2h4fb\" (UID: \"4735317d-b557-4ca9-84cd-02f72096e33a\") " pod="openshift-dns/node-resolver-2h4fb" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798068 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a9622aca-ffc8-4b50-82e0-a1c82e6222df-serviceca\") pod \"node-ca-s2j9l\" (UID: \"a9622aca-ffc8-4b50-82e0-a1c82e6222df\") " pod="openshift-image-registry/node-ca-s2j9l" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798095 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-var-lib-cni-bin\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.799556 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798124 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a3821f1e-3cf4-4526-9175-97c1251899f2-cni-binary-copy\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798134 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798149 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-systemd\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798185 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-var-lib-cni-bin\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798201 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-systemd\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798235 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-system-cni-dir\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798259 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-sysctl-d\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798291 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-run-netns\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798355 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-var-lib-openvswitch\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798401 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-sysctl-d\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798405 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-cni-bin\") 
pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798439 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-ovn-node-metrics-cert\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798461 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-cni-bin\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798471 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0aa834c9-7b5e-44dc-a706-cf8d7ff11391-agent-certs\") pod \"konnectivity-agent-rlk99\" (UID: \"0aa834c9-7b5e-44dc-a706-cf8d7ff11391\") " pod="kube-system/konnectivity-agent-rlk99" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798494 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-etc-selinux\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798518 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-var-lib-cni-multus\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798534 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-system-cni-dir\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798541 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-multus-conf-dir\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.800258 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798580 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsd49\" (UniqueName: \"kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49\") pod \"network-check-target-m54zx\" (UID: \"ce22102c-2dd2-4a4f-8317-5733e81186d1\") " pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798583 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-host-run-netns\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798608 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a3821f1e-3cf4-4526-9175-97c1251899f2-cnibin\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798628 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-var-lib-openvswitch\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798632 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a3821f1e-3cf4-4526-9175-97c1251899f2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798650 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a3821f1e-3cf4-4526-9175-97c1251899f2-cni-binary-copy\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798659 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a3821f1e-3cf4-4526-9175-97c1251899f2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798671 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-host-var-lib-cni-multus\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798682 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a3821f1e-3cf4-4526-9175-97c1251899f2-cnibin\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798632 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a9622aca-ffc8-4b50-82e0-a1c82e6222df-serviceca\") pod \"node-ca-s2j9l\" (UID: \"a9622aca-ffc8-4b50-82e0-a1c82e6222df\") " pod="openshift-image-registry/node-ca-s2j9l" Apr 16 18:17:48.801124 
ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.798930 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-env-overrides\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799010 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4735317d-b557-4ca9-84cd-02f72096e33a-tmp-dir\") pod \"node-resolver-2h4fb\" (UID: \"4735317d-b557-4ca9-84cd-02f72096e33a\") " pod="openshift-dns/node-resolver-2h4fb" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799084 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-multus-conf-dir\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799147 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-var-lib-kubelet\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799171 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-etc-openvswitch\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799192 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-log-socket\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799213 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-systemd-units\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.801124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799233 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-cnibin\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.801800 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799287 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1b45981e-9576-4b1b-b941-35f68d109c84-cni-binary-copy\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.801800 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799312 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a3821f1e-3cf4-4526-9175-97c1251899f2-os-release\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4" Apr 16 18:17:48.801800 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799315 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-systemd-units\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.801800 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799333 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a3821f1e-3cf4-4526-9175-97c1251899f2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4" Apr 16 18:17:48.801800 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799349 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a3821f1e-3cf4-4526-9175-97c1251899f2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4" Apr 16 18:17:48.801800 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799361 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-etc-openvswitch\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.801800 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799369 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bfk4\" (UniqueName: \"kubernetes.io/projected/a3821f1e-3cf4-4526-9175-97c1251899f2-kube-api-access-9bfk4\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4" Apr 16 18:17:48.801800 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799388 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a3821f1e-3cf4-4526-9175-97c1251899f2-os-release\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4" Apr 16 18:17:48.801800 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799396 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-run-systemd\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.801800 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799428 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-run-systemd\") pod \"ovnkube-node-zps8z\" (UID: 
\"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.801800 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799431 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-log-socket\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.801800 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799427 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v48lm\" (UniqueName: \"kubernetes.io/projected/1b45981e-9576-4b1b-b941-35f68d109c84-kube-api-access-v48lm\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.801800 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799501 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1b45981e-9576-4b1b-b941-35f68d109c84-cnibin\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.801800 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799563 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-var-lib-kubelet\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.801800 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.799840 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1b45981e-9576-4b1b-b941-35f68d109c84-cni-binary-copy\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs" Apr 16 18:17:48.802428 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.801812 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-etc-tuned\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.802428 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.801888 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-tmp\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" Apr 16 18:17:48.802522 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.802475 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-ovn-node-metrics-cert\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:17:48.808476 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:48.808451 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:48.808597 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:48.808481 2570 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:48.808597 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:48.808498 2570 projected.go:194] Error preparing data for projected volume kube-api-access-nsd49 for pod openshift-network-diagnostics/network-check-target-m54zx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:48.808597 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:48.808590 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49 podName:ce22102c-2dd2-4a4f-8317-5733e81186d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:49.3085702 +0000 UTC m=+3.088034512 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-nsd49" (UniqueName: "kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49") pod "network-check-target-m54zx" (UID: "ce22102c-2dd2-4a4f-8317-5733e81186d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:48.809352 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.809318 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77rpm\" (UniqueName: \"kubernetes.io/projected/8cc82835-e3e6-46d3-8f2f-ead7027b1b91-kube-api-access-77rpm\") pod \"ovnkube-node-zps8z\" (UID: \"8cc82835-e3e6-46d3-8f2f-ead7027b1b91\") " pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:48.809864 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.809799 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbprn\" (UniqueName: \"kubernetes.io/projected/1f6f728e-b45e-456c-8a7f-87bf91ba5c03-kube-api-access-zbprn\") pod \"iptables-alerter-vchm9\" (UID: \"1f6f728e-b45e-456c-8a7f-87bf91ba5c03\") " pod="openshift-network-operator/iptables-alerter-vchm9"
Apr 16 18:17:48.810224 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.810196 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bfk4\" (UniqueName: \"kubernetes.io/projected/a3821f1e-3cf4-4526-9175-97c1251899f2-kube-api-access-9bfk4\") pod \"multus-additional-cni-plugins-qzpw4\" (UID: \"a3821f1e-3cf4-4526-9175-97c1251899f2\") " pod="openshift-multus/multus-additional-cni-plugins-qzpw4"
Apr 16 18:17:48.810314 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.810297 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47qmp\" (UniqueName: \"kubernetes.io/projected/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-kube-api-access-47qmp\") pod \"network-metrics-daemon-dvxrp\" (UID: \"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e\") " pod="openshift-multus/network-metrics-daemon-dvxrp"
Apr 16 18:17:48.810867 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.810850 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx8mr\" (UniqueName: \"kubernetes.io/projected/4735317d-b557-4ca9-84cd-02f72096e33a-kube-api-access-wx8mr\") pod \"node-resolver-2h4fb\" (UID: \"4735317d-b557-4ca9-84cd-02f72096e33a\") " pod="openshift-dns/node-resolver-2h4fb"
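
The kube-api-access-nsd49 entries above show why that mount is parked rather than retried immediately: a kube-api-access-* volume is a projected volume assembled from the bound service account token plus the namespace's kube-root-ca.crt (and, on OpenShift, openshift-service-ca.crt) configmaps, and until those objects are registered with this freshly started kubelet the volume cannot be built; the neighboring kube-api-access mounts succeed because their namespaces' objects are already known. The nestedpendingoperations entry parks the retry with durationBeforeRetry 500ms, and the delay doubles on each failure, which is why the kube-api-access-nsd49 and metrics-certs operations reappear below at 1s, 2s, 4s, 8s and then 16s. A minimal sketch of that schedule in Go, assuming the usual exponential shape (the cap below is an assumption for illustration; this log only shows the progression up to 16s):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 500 * time.Millisecond           // first durationBeforeRetry seen above
        maxDelay := 2*time.Minute + 2*time.Second // assumed cap, not taken from this log
        for failures := 1; failures <= 7; failures++ {
            fmt.Printf("failure %d -> durationBeforeRetry %v\n", failures, delay)
            delay *= 2 // doubled after each failed MountVolume.SetUp
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }
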
Apr 16 18:17:48.811182 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.811160 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4w2p\" (UniqueName: \"kubernetes.io/projected/a9622aca-ffc8-4b50-82e0-a1c82e6222df-kube-api-access-k4w2p\") pod \"node-ca-s2j9l\" (UID: \"a9622aca-ffc8-4b50-82e0-a1c82e6222df\") " pod="openshift-image-registry/node-ca-s2j9l"
Apr 16 18:17:48.812511 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.812466 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v48lm\" (UniqueName: \"kubernetes.io/projected/1b45981e-9576-4b1b-b941-35f68d109c84-kube-api-access-v48lm\") pod \"multus-6cmxs\" (UID: \"1b45981e-9576-4b1b-b941-35f68d109c84\") " pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:48.812600 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.812580 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22n72\" (UniqueName: \"kubernetes.io/projected/fe3b26c2-ed43-4847-9ae9-44c0b6350d49-kube-api-access-22n72\") pod \"tuned-jrf5b\" (UID: \"fe3b26c2-ed43-4847-9ae9-44c0b6350d49\") " pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.900470 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.900435 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpg5p\" (UniqueName: \"kubernetes.io/projected/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-kube-api-access-qpg5p\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl"
Apr 16 18:17:48.900621 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.900489 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0aa834c9-7b5e-44dc-a706-cf8d7ff11391-agent-certs\") pod \"konnectivity-agent-rlk99\" (UID: \"0aa834c9-7b5e-44dc-a706-cf8d7ff11391\") " pod="kube-system/konnectivity-agent-rlk99"
Apr 16 18:17:48.900621 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.900515 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-etc-selinux\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl"
Apr 16 18:17:48.900738 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.900612 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0aa834c9-7b5e-44dc-a706-cf8d7ff11391-konnectivity-ca\") pod \"konnectivity-agent-rlk99\" (UID: \"0aa834c9-7b5e-44dc-a706-cf8d7ff11391\") " pod="kube-system/konnectivity-agent-rlk99"
Apr 16 18:17:48.900738 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.900667 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl"
Apr 16 18:17:48.900738 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.900672 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-etc-selinux\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") "
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.900738 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.900699 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-device-dir\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.900738 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.900734 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.900983 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.900745 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-socket-dir\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.900983 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.900771 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-registration-dir\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.900983 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.900793 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-sys-fs\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.900983 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.900796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-device-dir\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.900983 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.900875 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-registration-dir\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.900983 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.900881 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-socket-dir\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" Apr 16 18:17:48.900983 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.900934 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-sys-fs\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl"
Apr 16 18:17:48.901320 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.901183 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0aa834c9-7b5e-44dc-a706-cf8d7ff11391-konnectivity-ca\") pod \"konnectivity-agent-rlk99\" (UID: \"0aa834c9-7b5e-44dc-a706-cf8d7ff11391\") " pod="kube-system/konnectivity-agent-rlk99"
Apr 16 18:17:48.903140 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.903119 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0aa834c9-7b5e-44dc-a706-cf8d7ff11391-agent-certs\") pod \"konnectivity-agent-rlk99\" (UID: \"0aa834c9-7b5e-44dc-a706-cf8d7ff11391\") " pod="kube-system/konnectivity-agent-rlk99"
Apr 16 18:17:48.922745 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.922718 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpg5p\" (UniqueName: \"kubernetes.io/projected/bebba8c7-a52f-4acb-a6b1-afb778d88a5b-kube-api-access-qpg5p\") pod \"aws-ebs-csi-driver-node-jh8xl\" (UID: \"bebba8c7-a52f-4acb-a6b1-afb778d88a5b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl"
Apr 16 18:17:48.982651 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.982621 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qzpw4"
Apr 16 18:17:48.989776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.989735 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jrf5b"
Apr 16 18:17:48.998292 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:48.998270 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2h4fb"
Apr 16 18:17:49.005002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.004984 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6cmxs"
Apr 16 18:17:49.011518 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.011498 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-s2j9l"
Apr 16 18:17:49.018035 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.018015 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vchm9"
Apr 16 18:17:49.024688 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.024664 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:17:49.031271 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.031252 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rlk99"
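
This burst of util.go:30 messages is the expected first-sync path after a kubelet restart: the runtime reports no live sandbox for any of these DaemonSet pods, so each SyncPod run must create a pod sandbox before it can start containers (the PLEG ContainerStarted events for those sandbox IDs appear about a second later). A minimal sketch of the decision, with sandboxStatus and needsNewSandbox as illustrative stand-ins rather than the kubelet's actual types:

    package main

    import "fmt"

    // sandboxStatus is a hypothetical stand-in for the CRI pod sandbox status.
    type sandboxStatus struct {
        ready bool
    }

    // needsNewSandbox mirrors the spirit of the kubelet's check: with no
    // recorded sandbox, or a sandbox that is no longer ready, a new one
    // must be started before any containers can run.
    func needsNewSandbox(s *sandboxStatus) bool {
        return s == nil || !s.ready
    }

    func main() {
        fmt.Println(needsNewSandbox(nil))                         // true: "No sandbox for pod can be found"
        fmt.Println(needsNewSandbox(&sandboxStatus{ready: true})) // false: reuse the existing sandbox
    }
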
Apr 16 18:17:49.035864 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.035799 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl"
Apr 16 18:17:49.303701 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.303620 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs\") pod \"network-metrics-daemon-dvxrp\" (UID: \"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e\") " pod="openshift-multus/network-metrics-daemon-dvxrp"
Apr 16 18:17:49.303851 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:49.303707 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:49.303851 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:49.303780 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs podName:edeb92c2-9fa4-40ae-bb1a-a24372d25c5e nodeName:}" failed. No retries permitted until 2026-04-16 18:17:50.303761125 +0000 UTC m=+4.083225440 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs") pod "network-metrics-daemon-dvxrp" (UID: "edeb92c2-9fa4-40ae-bb1a-a24372d25c5e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:49.351900 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:49.351871 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aa834c9_7b5e_44dc_a706_cf8d7ff11391.slice/crio-1f163f1e6b25fd1fb148f3e1456c505106f598e5aa7d05aad2bc772915a15e4f WatchSource:0}: Error finding container 1f163f1e6b25fd1fb148f3e1456c505106f598e5aa7d05aad2bc772915a15e4f: Status 404 returned error can't find the container with id 1f163f1e6b25fd1fb148f3e1456c505106f598e5aa7d05aad2bc772915a15e4f
Apr 16 18:17:49.353175 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:49.353153 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe3b26c2_ed43_4847_9ae9_44c0b6350d49.slice/crio-64352d66817705c869376ac26c53cd86bdb7ccde8ca58c51bc44d3dbe58ac693 WatchSource:0}: Error finding container 64352d66817705c869376ac26c53cd86bdb7ccde8ca58c51bc44d3dbe58ac693: Status 404 returned error can't find the container with id 64352d66817705c869376ac26c53cd86bdb7ccde8ca58c51bc44d3dbe58ac693
Apr 16 18:17:49.353975 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:49.353947 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cc82835_e3e6_46d3_8f2f_ead7027b1b91.slice/crio-f3b3617186462935672cd61c3e9775c905ecf5a9167671bca2d649433874a0ed WatchSource:0}: Error finding container f3b3617186462935672cd61c3e9775c905ecf5a9167671bca2d649433874a0ed: Status 404 returned error can't find the container with id f3b3617186462935672cd61c3e9775c905ecf5a9167671bca2d649433874a0ed
Apr 16 18:17:49.357585 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:49.357556 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3821f1e_3cf4_4526_9175_97c1251899f2.slice/crio-c7ffd4230ff1a36d82af23ec764c70aeb3345dd328a91e9cc4ae0dfbd0b3e67d WatchSource:0}: Error finding container c7ffd4230ff1a36d82af23ec764c70aeb3345dd328a91e9cc4ae0dfbd0b3e67d: Status 404 returned error can't find
the container with id c7ffd4230ff1a36d82af23ec764c70aeb3345dd328a91e9cc4ae0dfbd0b3e67d Apr 16 18:17:49.359460 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:49.359170 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbebba8c7_a52f_4acb_a6b1_afb778d88a5b.slice/crio-e4d61306c0bd7cc34772637f8004451be3ead5e1698998dfadbc1638aa21a565 WatchSource:0}: Error finding container e4d61306c0bd7cc34772637f8004451be3ead5e1698998dfadbc1638aa21a565: Status 404 returned error can't find the container with id e4d61306c0bd7cc34772637f8004451be3ead5e1698998dfadbc1638aa21a565 Apr 16 18:17:49.360882 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:49.360863 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f6f728e_b45e_456c_8a7f_87bf91ba5c03.slice/crio-a16d4f040e39f852c05678109cae9cf26672c0c2216fa85fb348380843aadae6 WatchSource:0}: Error finding container a16d4f040e39f852c05678109cae9cf26672c0c2216fa85fb348380843aadae6: Status 404 returned error can't find the container with id a16d4f040e39f852c05678109cae9cf26672c0c2216fa85fb348380843aadae6 Apr 16 18:17:49.361768 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:49.361745 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b45981e_9576_4b1b_b941_35f68d109c84.slice/crio-b7dcee4a18b90b122994b48e3556ee31d7fc52a5966d49296328e47149bb65f3 WatchSource:0}: Error finding container b7dcee4a18b90b122994b48e3556ee31d7fc52a5966d49296328e47149bb65f3: Status 404 returned error can't find the container with id b7dcee4a18b90b122994b48e3556ee31d7fc52a5966d49296328e47149bb65f3 Apr 16 18:17:49.363166 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:49.363002 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4735317d_b557_4ca9_84cd_02f72096e33a.slice/crio-2daffb55686eca4f72673513aebadf047d55304ac9e9578c0fca154596ee33ff WatchSource:0}: Error finding container 2daffb55686eca4f72673513aebadf047d55304ac9e9578c0fca154596ee33ff: Status 404 returned error can't find the container with id 2daffb55686eca4f72673513aebadf047d55304ac9e9578c0fca154596ee33ff Apr 16 18:17:49.364086 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:17:49.363798 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9622aca_ffc8_4b50_82e0_a1c82e6222df.slice/crio-013d27f80514b397b152a5460e32a25fd41ae0483bf7ee8df5279f380a991c6f WatchSource:0}: Error finding container 013d27f80514b397b152a5460e32a25fd41ae0483bf7ee8df5279f380a991c6f: Status 404 returned error can't find the container with id 013d27f80514b397b152a5460e32a25fd41ae0483bf7ee8df5279f380a991c6f Apr 16 18:17:49.404182 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.403944 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsd49\" (UniqueName: \"kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49\") pod \"network-check-target-m54zx\" (UID: \"ce22102c-2dd2-4a4f-8317-5733e81186d1\") " pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:17:49.404297 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:49.404104 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Apr 16 18:17:49.404297 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:49.404251 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:49.404297 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:49.404267 2570 projected.go:194] Error preparing data for projected volume kube-api-access-nsd49 for pod openshift-network-diagnostics/network-check-target-m54zx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:49.404443 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:49.404340 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49 podName:ce22102c-2dd2-4a4f-8317-5733e81186d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:50.404320714 +0000 UTC m=+4.183785026 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-nsd49" (UniqueName: "kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49") pod "network-check-target-m54zx" (UID: "ce22102c-2dd2-4a4f-8317-5733e81186d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:49.752283 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.752079 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:12:47 +0000 UTC" deadline="2027-12-06 10:08:40.584170709 +0000 UTC" Apr 16 18:17:49.752283 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.752137 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14367h50m50.832038519s" Apr 16 18:17:49.834407 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.834373 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:17:49.834595 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:49.834514 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1"
Apr 16 18:17:49.843073 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.843002 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rlk99" event={"ID":"0aa834c9-7b5e-44dc-a706-cf8d7ff11391","Type":"ContainerStarted","Data":"1f163f1e6b25fd1fb148f3e1456c505106f598e5aa7d05aad2bc772915a15e4f"}
Apr 16 18:17:49.858887 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.858145 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-74.ec2.internal" event={"ID":"4d5b4fccf018b349362d7b27ad7bd6e5","Type":"ContainerStarted","Data":"481e323912f231833a13e49f6c38b616d34d5ad957bfc6b092a8e8c46c14cf9f"}
Apr 16 18:17:49.865216 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.865161 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6cmxs" event={"ID":"1b45981e-9576-4b1b-b941-35f68d109c84","Type":"ContainerStarted","Data":"b7dcee4a18b90b122994b48e3556ee31d7fc52a5966d49296328e47149bb65f3"}
Apr 16 18:17:49.870460 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.870423 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vchm9" event={"ID":"1f6f728e-b45e-456c-8a7f-87bf91ba5c03","Type":"ContainerStarted","Data":"a16d4f040e39f852c05678109cae9cf26672c0c2216fa85fb348380843aadae6"}
Apr 16 18:17:49.872900 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.872773 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-74.ec2.internal" podStartSLOduration=1.872757682 podStartE2EDuration="1.872757682s" podCreationTimestamp="2026-04-16 18:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:49.871663307 +0000 UTC m=+3.651127639" watchObservedRunningTime="2026-04-16 18:17:49.872757682 +0000 UTC m=+3.652222014"
Apr 16 18:17:49.876396 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.876365 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" event={"ID":"8cc82835-e3e6-46d3-8f2f-ead7027b1b91","Type":"ContainerStarted","Data":"f3b3617186462935672cd61c3e9775c905ecf5a9167671bca2d649433874a0ed"}
Apr 16 18:17:49.881774 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.881749 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-s2j9l" event={"ID":"a9622aca-ffc8-4b50-82e0-a1c82e6222df","Type":"ContainerStarted","Data":"013d27f80514b397b152a5460e32a25fd41ae0483bf7ee8df5279f380a991c6f"}
Apr 16 18:17:49.890902 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.890877 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2h4fb" event={"ID":"4735317d-b557-4ca9-84cd-02f72096e33a","Type":"ContainerStarted","Data":"2daffb55686eca4f72673513aebadf047d55304ac9e9578c0fca154596ee33ff"}
Apr 16 18:17:49.902088 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.901840 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" event={"ID":"bebba8c7-a52f-4acb-a6b1-afb778d88a5b","Type":"ContainerStarted","Data":"e4d61306c0bd7cc34772637f8004451be3ead5e1698998dfadbc1638aa21a565"}
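
The "Observed pod startup duration" entry above shows the tracker's arithmetic in the trivial case: kube-apiserver-proxy pulled no images (both pull timestamps are the 0001-01-01 zero value), so podStartSLOduration equals podStartE2EDuration. For pods that did pull, such as tuned-jrf5b reported at 18:18:07 below, the SLO figure is the end-to-end startup time minus the image-pull window. A small check of that relationship, using values copied from this log:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // From the tuned-jrf5b "Observed pod startup duration" entry at 18:18:07 below.
        e2e, _ := time.ParseDuration("21.009606669s") // podStartE2EDuration
        firstPull, _ := time.Parse(time.RFC3339Nano, "2026-04-16T18:17:49.355992023Z")
        lastPull, _ := time.Parse(time.RFC3339Nano, "2026-04-16T18:18:06.528760152Z")

        slo := e2e - lastPull.Sub(firstPull) // startup time excluding the image-pull window
        fmt.Println(slo)                     // ~3.83683854s, within a few ns of podStartSLOduration=3.836838536
    }
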
Apr 16 18:17:49.906119 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.906085 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzpw4" event={"ID":"a3821f1e-3cf4-4526-9175-97c1251899f2","Type":"ContainerStarted","Data":"c7ffd4230ff1a36d82af23ec764c70aeb3345dd328a91e9cc4ae0dfbd0b3e67d"}
Apr 16 18:17:49.909473 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:49.909449 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" event={"ID":"fe3b26c2-ed43-4847-9ae9-44c0b6350d49","Type":"ContainerStarted","Data":"64352d66817705c869376ac26c53cd86bdb7ccde8ca58c51bc44d3dbe58ac693"}
Apr 16 18:17:50.314805 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:50.312990 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs\") pod \"network-metrics-daemon-dvxrp\" (UID: \"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e\") " pod="openshift-multus/network-metrics-daemon-dvxrp"
Apr 16 18:17:50.314805 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:50.313187 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:50.314805 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:50.313250 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs podName:edeb92c2-9fa4-40ae-bb1a-a24372d25c5e nodeName:}" failed. No retries permitted until 2026-04-16 18:17:52.31323049 +0000 UTC m=+6.092694804 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs") pod "network-metrics-daemon-dvxrp" (UID: "edeb92c2-9fa4-40ae-bb1a-a24372d25c5e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:50.413984 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:50.413895 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsd49\" (UniqueName: \"kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49\") pod \"network-check-target-m54zx\" (UID: \"ce22102c-2dd2-4a4f-8317-5733e81186d1\") " pod="openshift-network-diagnostics/network-check-target-m54zx"
Apr 16 18:17:50.414165 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:50.414075 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:17:50.414165 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:50.414162 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:50.414289 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:50.414178 2570 projected.go:194] Error preparing data for projected volume kube-api-access-nsd49 for pod openshift-network-diagnostics/network-check-target-m54zx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:50.414289 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:50.414240 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49 podName:ce22102c-2dd2-4a4f-8317-5733e81186d1 nodeName:}" failed.
No retries permitted until 2026-04-16 18:17:52.414221603 +0000 UTC m=+6.193685934 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-nsd49" (UniqueName: "kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49") pod "network-check-target-m54zx" (UID: "ce22102c-2dd2-4a4f-8317-5733e81186d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:50.834488 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:50.834300 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp"
Apr 16 18:17:50.834488 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:50.834458 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e"
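
The "network is not ready" skips here and below are all downstream of one fact: ovnkube-node has not yet written a CNI config, so the container runtime keeps reporting NetworkReady=false and the kubelet declines to sync pods that need pod networking (host-network pods are unaffected, which is why the DaemonSets above start normally). The readiness condition amounts to the conf directory containing a usable network config; a rough sketch of that check, with the glob patterns as an assumption rather than the runtime's exact matching rules:

    package main

    import (
        "fmt"
        "path/filepath"
    )

    // hasCNIConfig approximates the runtime's readiness probe: does the CNI
    // conf directory contain any candidate network configuration file?
    func hasCNIConfig(confDir string) bool {
        for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
            if matches, err := filepath.Glob(filepath.Join(confDir, pattern)); err == nil && len(matches) > 0 {
                return true
            }
        }
        return false
    }

    func main() {
        confDir := "/etc/kubernetes/cni/net.d"
        if hasCNIConfig(confDir) {
            fmt.Println("NetworkReady=true")
        } else {
            fmt.Println("NetworkReady=false: no CNI configuration file in", confDir)
        }
    }
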
Apr 16 18:17:50.952612 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:50.952571 2570 generic.go:358] "Generic (PLEG): container finished" podID="d8fd7e10da2386e75452a7c51276b1d0" containerID="99ad2676537078c993e4e88a0126e0dfd74a50e68a27d555caa20526aa81a037" exitCode=0
Apr 16 18:17:50.952886 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:50.952823 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal" event={"ID":"d8fd7e10da2386e75452a7c51276b1d0","Type":"ContainerDied","Data":"99ad2676537078c993e4e88a0126e0dfd74a50e68a27d555caa20526aa81a037"}
Apr 16 18:17:51.833919 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:51.833884 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx"
Apr 16 18:17:51.834117 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:51.834015 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1"
Apr 16 18:17:51.961756 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:51.961724 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal" event={"ID":"d8fd7e10da2386e75452a7c51276b1d0","Type":"ContainerStarted","Data":"274e4f680b287c66837d69fafed6db4752d20f9145aaa72e923ba0b8454bbeae"}
Apr 16 18:17:51.977878 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:51.977808 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-74.ec2.internal" podStartSLOduration=3.977790925 podStartE2EDuration="3.977790925s" podCreationTimestamp="2026-04-16 18:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:51.977758011 +0000 UTC m=+5.757222342" watchObservedRunningTime="2026-04-16 18:17:51.977790925 +0000 UTC m=+5.757255257"
Apr 16 18:17:52.328340 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:52.328293 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs\") pod \"network-metrics-daemon-dvxrp\" (UID: \"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e\") " pod="openshift-multus/network-metrics-daemon-dvxrp"
Apr 16 18:17:52.328550 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:52.328450 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:52.328550 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:52.328515 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs podName:edeb92c2-9fa4-40ae-bb1a-a24372d25c5e nodeName:}" failed. No retries permitted until 2026-04-16 18:17:56.328494097 +0000 UTC m=+10.107958413 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs") pod "network-metrics-daemon-dvxrp" (UID: "edeb92c2-9fa4-40ae-bb1a-a24372d25c5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:52.428996 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:52.428959 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsd49\" (UniqueName: \"kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49\") pod \"network-check-target-m54zx\" (UID: \"ce22102c-2dd2-4a4f-8317-5733e81186d1\") " pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:17:52.429196 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:52.429145 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:52.429196 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:52.429168 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:52.429196 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:52.429181 2570 projected.go:194] Error preparing data for projected volume kube-api-access-nsd49 for pod openshift-network-diagnostics/network-check-target-m54zx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:52.429356 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:52.429246 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49 podName:ce22102c-2dd2-4a4f-8317-5733e81186d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:56.429227294 +0000 UTC m=+10.208691611 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-nsd49" (UniqueName: "kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49") pod "network-check-target-m54zx" (UID: "ce22102c-2dd2-4a4f-8317-5733e81186d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:52.834038 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:52.834007 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:17:52.834223 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:52.834165 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e" Apr 16 18:17:53.834646 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:53.834611 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:17:53.835088 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:53.834747 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1" Apr 16 18:17:54.834624 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:54.834138 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:17:54.834624 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:54.834277 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e" Apr 16 18:17:55.834851 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:55.833650 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:17:55.834851 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:55.833787 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1" Apr 16 18:17:56.360887 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:56.360258 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs\") pod \"network-metrics-daemon-dvxrp\" (UID: \"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e\") " pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:17:56.360887 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:56.360455 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:56.360887 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:56.360522 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs podName:edeb92c2-9fa4-40ae-bb1a-a24372d25c5e nodeName:}" failed. No retries permitted until 2026-04-16 18:18:04.360502176 +0000 UTC m=+18.139966510 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs") pod "network-metrics-daemon-dvxrp" (UID: "edeb92c2-9fa4-40ae-bb1a-a24372d25c5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:56.462032 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:56.461694 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsd49\" (UniqueName: \"kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49\") pod \"network-check-target-m54zx\" (UID: \"ce22102c-2dd2-4a4f-8317-5733e81186d1\") " pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:17:56.462032 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:56.461850 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:56.462032 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:56.461870 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:56.462032 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:56.461885 2570 projected.go:194] Error preparing data for projected volume kube-api-access-nsd49 for pod openshift-network-diagnostics/network-check-target-m54zx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:56.462032 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:56.461949 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49 podName:ce22102c-2dd2-4a4f-8317-5733e81186d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:04.461929767 +0000 UTC m=+18.241394081 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-nsd49" (UniqueName: "kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49") pod "network-check-target-m54zx" (UID: "ce22102c-2dd2-4a4f-8317-5733e81186d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:56.834836 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:56.834805 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:17:56.834991 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:56.834923 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e" Apr 16 18:17:57.834237 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:57.834202 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:17:57.834486 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:57.834327 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1" Apr 16 18:17:58.837309 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:58.837279 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:17:58.837689 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:58.837428 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e" Apr 16 18:17:59.834138 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:17:59.834104 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:17:59.834327 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:17:59.834213 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1" Apr 16 18:18:00.834436 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:00.834399 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:18:00.834881 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:00.834583 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e" Apr 16 18:18:01.834371 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:01.834330 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:18:01.834564 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:01.834462 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1" Apr 16 18:18:02.834612 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:02.834572 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:18:02.835048 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:02.834722 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e" Apr 16 18:18:03.833870 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:03.833840 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:18:03.834101 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:03.833946 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1" Apr 16 18:18:04.416699 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:04.416664 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs\") pod \"network-metrics-daemon-dvxrp\" (UID: \"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e\") " pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:18:04.417138 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:04.416814 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:18:04.417138 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:04.416877 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs podName:edeb92c2-9fa4-40ae-bb1a-a24372d25c5e nodeName:}" failed. No retries permitted until 2026-04-16 18:18:20.416859425 +0000 UTC m=+34.196323737 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs") pod "network-metrics-daemon-dvxrp" (UID: "edeb92c2-9fa4-40ae-bb1a-a24372d25c5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:18:04.517087 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:04.517035 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsd49\" (UniqueName: \"kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49\") pod \"network-check-target-m54zx\" (UID: \"ce22102c-2dd2-4a4f-8317-5733e81186d1\") " pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:18:04.517320 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:04.517300 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:18:04.517371 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:04.517329 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:18:04.517371 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:04.517345 2570 projected.go:194] Error preparing data for projected volume kube-api-access-nsd49 for pod openshift-network-diagnostics/network-check-target-m54zx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:18:04.517439 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:04.517430 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49 podName:ce22102c-2dd2-4a4f-8317-5733e81186d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:20.51740973 +0000 UTC m=+34.296874055 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-nsd49" (UniqueName: "kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49") pod "network-check-target-m54zx" (UID: "ce22102c-2dd2-4a4f-8317-5733e81186d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:18:04.834030 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:04.834000 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:18:04.834218 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:04.834129 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e" Apr 16 18:18:05.834109 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:05.834072 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:18:05.834472 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:05.834178 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1" Apr 16 18:18:06.835106 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:06.834745 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:18:06.835106 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:06.835017 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e" Apr 16 18:18:06.990140 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:06.990105 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" event={"ID":"fe3b26c2-ed43-4847-9ae9-44c0b6350d49","Type":"ContainerStarted","Data":"3a3a718b4311c000520ea4808037eb15ea559acd8d5ad85dc0a791fcb2d21a22"} Apr 16 18:18:06.991324 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:06.991299 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rlk99" event={"ID":"0aa834c9-7b5e-44dc-a706-cf8d7ff11391","Type":"ContainerStarted","Data":"198d0dfe91a8eadc3964bf9b702098d250f4bb586baca9d50205ed6219500b9d"} Apr 16 18:18:06.992333 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:06.992314 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6cmxs" event={"ID":"1b45981e-9576-4b1b-b941-35f68d109c84","Type":"ContainerStarted","Data":"268b02524c759f04af5e5a9baf85b39f9d1eaf8ae31f6aa51a9ba3fd1e4d550b"} Apr 16 18:18:06.993750 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:06.993730 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" event={"ID":"8cc82835-e3e6-46d3-8f2f-ead7027b1b91","Type":"ContainerStarted","Data":"ad7cd24191da01708b4e305592208bd02c645fcb65ebd6a6dff64c35ca6efd08"} Apr 16 18:18:06.993826 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:06.993758 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" event={"ID":"8cc82835-e3e6-46d3-8f2f-ead7027b1b91","Type":"ContainerStarted","Data":"f1e87f61b948179448d610008f24c8305735bd6f9aba11427f0acd3d16d1f984"} Apr 16 18:18:06.995171 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:06.995153 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-s2j9l" event={"ID":"a9622aca-ffc8-4b50-82e0-a1c82e6222df","Type":"ContainerStarted","Data":"1d160ee39c7abc510204120c08bb4b8cda93208932ed360fabc2bc3bed63c098"} Apr 16 18:18:07.001548 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:07.001515 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2h4fb" 
event={"ID":"4735317d-b557-4ca9-84cd-02f72096e33a","Type":"ContainerStarted","Data":"fb7548a431292ac8a7c24879231154ce8b289c008960c22b743580dfc26822d9"} Apr 16 18:18:07.002752 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:07.002729 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" event={"ID":"bebba8c7-a52f-4acb-a6b1-afb778d88a5b","Type":"ContainerStarted","Data":"39f52fc2091562dbadc1f983f47cf424e1a5a9dba82e4f01d2e28c8b0cab5411"} Apr 16 18:18:07.003828 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:07.003809 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzpw4" event={"ID":"a3821f1e-3cf4-4526-9175-97c1251899f2","Type":"ContainerStarted","Data":"f64d15da459d37b8c4098647605b090195fe5ac06411613c7f6ebfed471049c3"} Apr 16 18:18:07.009676 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:07.009633 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jrf5b" podStartSLOduration=3.836838536 podStartE2EDuration="21.009606669s" podCreationTimestamp="2026-04-16 18:17:46 +0000 UTC" firstStartedPulling="2026-04-16 18:17:49.355992023 +0000 UTC m=+3.135456340" lastFinishedPulling="2026-04-16 18:18:06.528760152 +0000 UTC m=+20.308224473" observedRunningTime="2026-04-16 18:18:07.009585828 +0000 UTC m=+20.789050171" watchObservedRunningTime="2026-04-16 18:18:07.009606669 +0000 UTC m=+20.789070982" Apr 16 18:18:07.026318 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:07.026277 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rlk99" podStartSLOduration=3.854946236 podStartE2EDuration="21.026265528s" podCreationTimestamp="2026-04-16 18:17:46 +0000 UTC" firstStartedPulling="2026-04-16 18:17:49.354479252 +0000 UTC m=+3.133943561" lastFinishedPulling="2026-04-16 18:18:06.525798537 +0000 UTC m=+20.305262853" observedRunningTime="2026-04-16 18:18:07.025842924 +0000 UTC m=+20.805307255" watchObservedRunningTime="2026-04-16 18:18:07.026265528 +0000 UTC m=+20.805729864" Apr 16 18:18:07.066136 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:07.066093 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2h4fb" podStartSLOduration=3.9049263180000002 podStartE2EDuration="21.066077977s" podCreationTimestamp="2026-04-16 18:17:46 +0000 UTC" firstStartedPulling="2026-04-16 18:17:49.364825097 +0000 UTC m=+3.144289405" lastFinishedPulling="2026-04-16 18:18:06.52597674 +0000 UTC m=+20.305441064" observedRunningTime="2026-04-16 18:18:07.065997377 +0000 UTC m=+20.845461721" watchObservedRunningTime="2026-04-16 18:18:07.066077977 +0000 UTC m=+20.845542305" Apr 16 18:18:07.091420 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:07.091373 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6cmxs" podStartSLOduration=3.916583454 podStartE2EDuration="21.091359245s" podCreationTimestamp="2026-04-16 18:17:46 +0000 UTC" firstStartedPulling="2026-04-16 18:17:49.36398336 +0000 UTC m=+3.143447690" lastFinishedPulling="2026-04-16 18:18:06.538759162 +0000 UTC m=+20.318223481" observedRunningTime="2026-04-16 18:18:07.09095063 +0000 UTC m=+20.870414962" watchObservedRunningTime="2026-04-16 18:18:07.091359245 +0000 UTC m=+20.870823575" Apr 16 18:18:07.112372 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:07.112326 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-s2j9l" podStartSLOduration=12.110834239 podStartE2EDuration="21.112311341s" podCreationTimestamp="2026-04-16 18:17:46 +0000 UTC" firstStartedPulling="2026-04-16 18:17:49.365405533 +0000 UTC m=+3.144869846" lastFinishedPulling="2026-04-16 18:17:58.366882633 +0000 UTC m=+12.146346948" observedRunningTime="2026-04-16 18:18:07.111831936 +0000 UTC m=+20.891296288" watchObservedRunningTime="2026-04-16 18:18:07.112311341 +0000 UTC m=+20.891775672" Apr 16 18:18:07.834003 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:07.833835 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:18:07.834142 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:07.834096 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1" Apr 16 18:18:08.008039 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.008002 2570 generic.go:358] "Generic (PLEG): container finished" podID="a3821f1e-3cf4-4526-9175-97c1251899f2" containerID="f64d15da459d37b8c4098647605b090195fe5ac06411613c7f6ebfed471049c3" exitCode=0 Apr 16 18:18:08.008810 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.008083 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzpw4" event={"ID":"a3821f1e-3cf4-4526-9175-97c1251899f2","Type":"ContainerDied","Data":"f64d15da459d37b8c4098647605b090195fe5ac06411613c7f6ebfed471049c3"} Apr 16 18:18:08.009375 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.009353 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vchm9" event={"ID":"1f6f728e-b45e-456c-8a7f-87bf91ba5c03","Type":"ContainerStarted","Data":"203a8581441896a7db3824174f8c28acd5d407fbcaa9e35012a538253a8c4807"} Apr 16 18:18:08.011453 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.011437 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:18:08.011721 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.011704 2570 generic.go:358] "Generic (PLEG): container finished" podID="8cc82835-e3e6-46d3-8f2f-ead7027b1b91" containerID="ad7cd24191da01708b4e305592208bd02c645fcb65ebd6a6dff64c35ca6efd08" exitCode=1 Apr 16 18:18:08.011806 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.011786 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" event={"ID":"8cc82835-e3e6-46d3-8f2f-ead7027b1b91","Type":"ContainerDied","Data":"ad7cd24191da01708b4e305592208bd02c645fcb65ebd6a6dff64c35ca6efd08"} Apr 16 18:18:08.011861 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.011819 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" event={"ID":"8cc82835-e3e6-46d3-8f2f-ead7027b1b91","Type":"ContainerStarted","Data":"1610a688309a1962cd375355f0a86eb248338b0e4aaba84436e030e3d6282ba1"} Apr 16 18:18:08.011861 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.011837 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" 
event={"ID":"8cc82835-e3e6-46d3-8f2f-ead7027b1b91","Type":"ContainerStarted","Data":"1f02c8cbe66d2bb070fad5eff48f74cf0ab28e00d7d919b90c050b0c08dc53a6"} Apr 16 18:18:08.011861 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.011850 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" event={"ID":"8cc82835-e3e6-46d3-8f2f-ead7027b1b91","Type":"ContainerStarted","Data":"11e9d202c76cb64205683a9093eb00abffd6a3c9d3e0c4d3a8f16757c71ff6d7"} Apr 16 18:18:08.011997 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.011865 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" event={"ID":"8cc82835-e3e6-46d3-8f2f-ead7027b1b91","Type":"ContainerStarted","Data":"15fbdce76ff2c504641de2e16233be1b889418aadbf1957fc6ab2bd1115751b5"} Apr 16 18:18:08.050017 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.049974 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vchm9" podStartSLOduration=4.912313473 podStartE2EDuration="22.049960757s" podCreationTimestamp="2026-04-16 18:17:46 +0000 UTC" firstStartedPulling="2026-04-16 18:17:49.364115633 +0000 UTC m=+3.143579945" lastFinishedPulling="2026-04-16 18:18:06.501762909 +0000 UTC m=+20.281227229" observedRunningTime="2026-04-16 18:18:08.049849866 +0000 UTC m=+21.829314196" watchObservedRunningTime="2026-04-16 18:18:08.049960757 +0000 UTC m=+21.829425109" Apr 16 18:18:08.229532 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.229507 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:18:08.754265 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.754139 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:18:08.229525333Z","UUID":"f1f6e92a-ed5b-4473-94b0-7893cab1cf35","Handler":null,"Name":"","Endpoint":""} Apr 16 18:18:08.756082 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.756044 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:18:08.756199 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.756092 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:18:08.834642 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:08.834611 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:18:08.834837 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:08.834753 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e" Apr 16 18:18:09.015765 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:09.015657 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" event={"ID":"bebba8c7-a52f-4acb-a6b1-afb778d88a5b","Type":"ContainerStarted","Data":"5047769cf242129039b8a3b7f012d78d28c34e4370d692077a633e4c00eabe2a"} Apr 16 18:18:09.834406 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:09.834231 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:18:09.834595 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:09.834510 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1" Apr 16 18:18:10.020391 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:10.020358 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:18:10.020840 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:10.020722 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" event={"ID":"8cc82835-e3e6-46d3-8f2f-ead7027b1b91","Type":"ContainerStarted","Data":"79f821107019a269b494084abce3ea7ce24cdf75e4e15ca47be1b82c1ea39594"} Apr 16 18:18:10.022760 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:10.022738 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" event={"ID":"bebba8c7-a52f-4acb-a6b1-afb778d88a5b","Type":"ContainerStarted","Data":"89ad0322559e2977a3dcdd68d891a66034153a9ddf8fb9b137bd85a5914a030d"} Apr 16 18:18:10.044225 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:10.044168 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jh8xl" podStartSLOduration=3.003249898 podStartE2EDuration="23.044150794s" podCreationTimestamp="2026-04-16 18:17:47 +0000 UTC" firstStartedPulling="2026-04-16 18:17:49.361898067 +0000 UTC m=+3.141362381" lastFinishedPulling="2026-04-16 18:18:09.402798965 +0000 UTC m=+23.182263277" observedRunningTime="2026-04-16 18:18:10.043790285 +0000 UTC m=+23.823254621" watchObservedRunningTime="2026-04-16 18:18:10.044150794 +0000 UTC m=+23.823615128" Apr 16 18:18:10.400303 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:10.400262 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rlk99" Apr 16 18:18:10.400965 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:10.400941 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rlk99" Apr 16 18:18:10.834040 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:10.834011 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:18:10.834267 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:10.834154 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e" Apr 16 18:18:11.024377 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:11.024343 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rlk99" Apr 16 18:18:11.024902 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:11.024882 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rlk99" Apr 16 18:18:11.834368 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:11.834335 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:18:11.834550 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:11.834462 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1" Apr 16 18:18:12.033362 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:12.033335 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:18:12.036030 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:12.033751 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" event={"ID":"8cc82835-e3e6-46d3-8f2f-ead7027b1b91","Type":"ContainerStarted","Data":"bbea74177b54a96d4a419f60c1557278f0374e18cc68c7122564e902c3c7b35a"} Apr 16 18:18:12.036030 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:12.034289 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:18:12.036030 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:12.034369 2570 scope.go:117] "RemoveContainer" containerID="ad7cd24191da01708b4e305592208bd02c645fcb65ebd6a6dff64c35ca6efd08" Apr 16 18:18:12.036030 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:12.034405 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:18:12.036030 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:12.034488 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:18:12.053950 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:12.053925 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:18:12.054938 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:12.054920 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" Apr 16 18:18:12.833982 ip-10-0-128-74 kubenswrapper[2570]: I0416 
18:18:12.833726 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:18:12.834168 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:12.834141 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e" Apr 16 18:18:13.037434 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:13.037396 2570 generic.go:358] "Generic (PLEG): container finished" podID="a3821f1e-3cf4-4526-9175-97c1251899f2" containerID="bbe11fa4a4ff9c136b4c82c606b3345b2638579f33582fefffc9884b24cbbd0e" exitCode=0 Apr 16 18:18:13.037882 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:13.037483 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzpw4" event={"ID":"a3821f1e-3cf4-4526-9175-97c1251899f2","Type":"ContainerDied","Data":"bbe11fa4a4ff9c136b4c82c606b3345b2638579f33582fefffc9884b24cbbd0e"} Apr 16 18:18:13.040708 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:13.040691 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:18:13.041015 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:13.040985 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" event={"ID":"8cc82835-e3e6-46d3-8f2f-ead7027b1b91","Type":"ContainerStarted","Data":"d1b4f49a7bf3f6e3c33af0760804c5a85640a27631e8bcecf5008bf1d007a858"} Apr 16 18:18:13.094321 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:13.094224 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z" podStartSLOduration=9.870656198 podStartE2EDuration="27.09421153s" podCreationTimestamp="2026-04-16 18:17:46 +0000 UTC" firstStartedPulling="2026-04-16 18:17:49.356005478 +0000 UTC m=+3.135469790" lastFinishedPulling="2026-04-16 18:18:06.579560807 +0000 UTC m=+20.359025122" observedRunningTime="2026-04-16 18:18:13.093705961 +0000 UTC m=+26.873170291" watchObservedRunningTime="2026-04-16 18:18:13.09421153 +0000 UTC m=+26.873675860" Apr 16 18:18:13.834345 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:13.834315 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:18:13.834493 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:13.834415 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
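The recurring NetworkPluginNotReady errors for network-metrics-daemon-dvxrp and network-check-target-m54zx share one root cause: the container runtime reports NetworkReady=false until a CNI configuration appears in /etc/kubernetes/cni/net.d/, which on this node happens as ovnkube-node finishes coming up (the errors stop shortly before the NodeReady event at 18:18:19 below). A crude local check for the same condition; the extension list matches what CNI config loaders commonly accept, and the authoritative readiness signal is the runtime's status, not a directory listing:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// The directory named in the NetworkPluginNotReady errors above.
	confDir := "/etc/kubernetes/cni/net.d"

	var confs []string
	for _, ext := range []string{"*.conf", "*.conflist", "*.json"} {
		m, _ := filepath.Glob(filepath.Join(confDir, ext))
		confs = append(confs, m...)
	}
	if len(confs) == 0 {
		fmt.Println("no CNI configuration file in", confDir,
			"- pods needing the pod network will stay pending")
		os.Exit(1)
	}
	fmt.Println("CNI configs:", confs)
}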
pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1" Apr 16 18:18:13.935127 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:13.935093 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m54zx"] Apr 16 18:18:13.937041 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:13.936998 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dvxrp"] Apr 16 18:18:13.937174 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:13.937154 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:18:13.937323 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:13.937286 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e" Apr 16 18:18:14.043344 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:14.043318 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:18:14.043691 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:14.043413 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1" Apr 16 18:18:15.046992 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:15.046958 2570 generic.go:358] "Generic (PLEG): container finished" podID="a3821f1e-3cf4-4526-9175-97c1251899f2" containerID="6e5a6c9eeb4b1342db23048eb333a9385951ac5272363fa60586eadae2b08d8a" exitCode=0 Apr 16 18:18:15.047386 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:15.047007 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzpw4" event={"ID":"a3821f1e-3cf4-4526-9175-97c1251899f2","Type":"ContainerDied","Data":"6e5a6c9eeb4b1342db23048eb333a9385951ac5272363fa60586eadae2b08d8a"} Apr 16 18:18:15.834485 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:15.834310 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:18:15.834626 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:15.834371 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:18:15.834626 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:15.834596 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1" Apr 16 18:18:15.834729 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:15.834705 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e" Apr 16 18:18:17.052660 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:17.052626 2570 generic.go:358] "Generic (PLEG): container finished" podID="a3821f1e-3cf4-4526-9175-97c1251899f2" containerID="0edcfe6d4b20a27fb3c8cbe185af93735e9417f6b348f26e90b65873baa9ca4e" exitCode=0 Apr 16 18:18:17.053024 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:17.052670 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzpw4" event={"ID":"a3821f1e-3cf4-4526-9175-97c1251899f2","Type":"ContainerDied","Data":"0edcfe6d4b20a27fb3c8cbe185af93735e9417f6b348f26e90b65873baa9ca4e"} Apr 16 18:18:17.834447 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:17.834415 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:18:17.834636 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:17.834415 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:18:17.834636 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:17.834548 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m54zx" podUID="ce22102c-2dd2-4a4f-8317-5733e81186d1" Apr 16 18:18:17.834636 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:17.834589 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e" Apr 16 18:18:19.556432 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.556401 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-74.ec2.internal" event="NodeReady" Apr 16 18:18:19.557094 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.556555 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:18:19.603698 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.603662 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-brklz"] Apr 16 18:18:19.618192 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.618154 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-v2f4d"] Apr 16 18:18:19.618379 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.618341 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-brklz" Apr 16 18:18:19.620997 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.620792 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8xm9h\"" Apr 16 18:18:19.620997 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.620808 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:18:19.620997 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.620884 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:18:19.636624 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.636556 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-brklz"] Apr 16 18:18:19.636624 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.636585 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v2f4d"] Apr 16 18:18:19.636773 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.636703 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v2f4d" Apr 16 18:18:19.639366 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.639340 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:18:19.639492 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.639387 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:18:19.639492 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.639394 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:18:19.639492 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.639398 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rxx2m\"" Apr 16 18:18:19.733916 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.733887 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tvbx\" (UniqueName: \"kubernetes.io/projected/67ddeef6-939c-4d8e-83ee-0673f748cf12-kube-api-access-2tvbx\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:18:19.734120 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.733928 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert\") pod \"ingress-canary-v2f4d\" (UID: \"935e77e2-8cb8-4a46-ac22-24ad0a5b649a\") " pod="openshift-ingress-canary/ingress-canary-v2f4d" Apr 16 18:18:19.734120 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.733960 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:18:19.734120 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.733983 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/67ddeef6-939c-4d8e-83ee-0673f748cf12-config-volume\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:18:19.734292 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.734193 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/67ddeef6-939c-4d8e-83ee-0673f748cf12-tmp-dir\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:18:19.734292 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.734232 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzkc9\" (UniqueName: \"kubernetes.io/projected/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-kube-api-access-lzkc9\") pod \"ingress-canary-v2f4d\" (UID: \"935e77e2-8cb8-4a46-ac22-24ad0a5b649a\") " pod="openshift-ingress-canary/ingress-canary-v2f4d" Apr 16 18:18:19.834503 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.834466 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:18:19.834706 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.834503 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzkc9\" (UniqueName: \"kubernetes.io/projected/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-kube-api-access-lzkc9\") pod \"ingress-canary-v2f4d\" (UID: \"935e77e2-8cb8-4a46-ac22-24ad0a5b649a\") " pod="openshift-ingress-canary/ingress-canary-v2f4d" Apr 16 18:18:19.834706 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.834553 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tvbx\" (UniqueName: \"kubernetes.io/projected/67ddeef6-939c-4d8e-83ee-0673f748cf12-kube-api-access-2tvbx\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:18:19.834706 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.834477 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:18:19.834706 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.834587 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert\") pod \"ingress-canary-v2f4d\" (UID: \"935e77e2-8cb8-4a46-ac22-24ad0a5b649a\") " pod="openshift-ingress-canary/ingress-canary-v2f4d" Apr 16 18:18:19.834706 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.834617 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:18:19.834706 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.834648 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67ddeef6-939c-4d8e-83ee-0673f748cf12-config-volume\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:18:19.834983 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.834734 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/67ddeef6-939c-4d8e-83ee-0673f748cf12-tmp-dir\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:18:19.834983 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:19.834861 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:19.834983 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:19.834934 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls podName:67ddeef6-939c-4d8e-83ee-0673f748cf12 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:20.334912746 +0000 UTC m=+34.114377069 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls") pod "dns-default-brklz" (UID: "67ddeef6-939c-4d8e-83ee-0673f748cf12") : secret "dns-default-metrics-tls" not found Apr 16 18:18:19.834983 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:19.834953 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:19.835202 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:19.835002 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert podName:935e77e2-8cb8-4a46-ac22-24ad0a5b649a nodeName:}" failed. No retries permitted until 2026-04-16 18:18:20.334984574 +0000 UTC m=+34.114448897 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert") pod "ingress-canary-v2f4d" (UID: "935e77e2-8cb8-4a46-ac22-24ad0a5b649a") : secret "canary-serving-cert" not found Apr 16 18:18:19.835202 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.835191 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/67ddeef6-939c-4d8e-83ee-0673f748cf12-tmp-dir\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:18:19.835386 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.835369 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67ddeef6-939c-4d8e-83ee-0673f748cf12-config-volume\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:18:19.837831 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.837652 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:18:19.837831 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.837652 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kzxcx\"" Apr 16 18:18:19.837831 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.837700 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:18:19.837831 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.837700 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:18:19.838145 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.837957 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-k7xtv\"" Apr 16 18:18:19.849102 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.849074 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tvbx\" (UniqueName: \"kubernetes.io/projected/67ddeef6-939c-4d8e-83ee-0673f748cf12-kube-api-access-2tvbx\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:18:19.849220 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:19.849115 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzkc9\" (UniqueName: \"kubernetes.io/projected/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-kube-api-access-lzkc9\") pod \"ingress-canary-v2f4d\" (UID: \"935e77e2-8cb8-4a46-ac22-24ad0a5b649a\") " pod="openshift-ingress-canary/ingress-canary-v2f4d" Apr 16 18:18:20.338749 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:20.338717 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert\") pod \"ingress-canary-v2f4d\" (UID: \"935e77e2-8cb8-4a46-ac22-24ad0a5b649a\") " pod="openshift-ingress-canary/ingress-canary-v2f4d" Apr 16 18:18:20.338980 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:20.338760 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls\") pod 
\"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:18:20.338980 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:20.338889 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:20.338980 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:20.338918 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:20.338980 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:20.338975 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert podName:935e77e2-8cb8-4a46-ac22-24ad0a5b649a nodeName:}" failed. No retries permitted until 2026-04-16 18:18:21.338951453 +0000 UTC m=+35.118415775 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert") pod "ingress-canary-v2f4d" (UID: "935e77e2-8cb8-4a46-ac22-24ad0a5b649a") : secret "canary-serving-cert" not found Apr 16 18:18:20.339198 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:20.338994 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls podName:67ddeef6-939c-4d8e-83ee-0673f748cf12 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:21.338987211 +0000 UTC m=+35.118451520 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls") pod "dns-default-brklz" (UID: "67ddeef6-939c-4d8e-83ee-0673f748cf12") : secret "dns-default-metrics-tls" not found Apr 16 18:18:20.440041 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:20.440003 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs\") pod \"network-metrics-daemon-dvxrp\" (UID: \"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e\") " pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:18:20.440209 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:20.440172 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:18:20.440259 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:20.440235 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs podName:edeb92c2-9fa4-40ae-bb1a-a24372d25c5e nodeName:}" failed. No retries permitted until 2026-04-16 18:18:52.440219151 +0000 UTC m=+66.219683460 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs") pod "network-metrics-daemon-dvxrp" (UID: "edeb92c2-9fa4-40ae-bb1a-a24372d25c5e") : secret "metrics-daemon-secret" not found Apr 16 18:18:20.541265 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:20.541226 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsd49\" (UniqueName: \"kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49\") pod \"network-check-target-m54zx\" (UID: \"ce22102c-2dd2-4a4f-8317-5733e81186d1\") " pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:18:20.544168 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:20.544144 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsd49\" (UniqueName: \"kubernetes.io/projected/ce22102c-2dd2-4a4f-8317-5733e81186d1-kube-api-access-nsd49\") pod \"network-check-target-m54zx\" (UID: \"ce22102c-2dd2-4a4f-8317-5733e81186d1\") " pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:18:20.746466 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:20.746382 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m54zx" Apr 16 18:18:20.919520 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:20.919332 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m54zx"] Apr 16 18:18:20.923801 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:18:20.923772 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce22102c_2dd2_4a4f_8317_5733e81186d1.slice/crio-e5178c8e865674ca4d18b940eda0a1858fe368441b8b3bc518e4df9aac24dda3 WatchSource:0}: Error finding container e5178c8e865674ca4d18b940eda0a1858fe368441b8b3bc518e4df9aac24dda3: Status 404 returned error can't find the container with id e5178c8e865674ca4d18b940eda0a1858fe368441b8b3bc518e4df9aac24dda3 Apr 16 18:18:21.061947 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:21.061897 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m54zx" event={"ID":"ce22102c-2dd2-4a4f-8317-5733e81186d1","Type":"ContainerStarted","Data":"e5178c8e865674ca4d18b940eda0a1858fe368441b8b3bc518e4df9aac24dda3"} Apr 16 18:18:21.346305 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:21.346228 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert\") pod \"ingress-canary-v2f4d\" (UID: \"935e77e2-8cb8-4a46-ac22-24ad0a5b649a\") " pod="openshift-ingress-canary/ingress-canary-v2f4d" Apr 16 18:18:21.346305 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:21.346271 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:18:21.346597 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:21.346410 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:21.346597 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:21.346475 2570 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert podName:935e77e2-8cb8-4a46-ac22-24ad0a5b649a nodeName:}" failed. No retries permitted until 2026-04-16 18:18:23.346458676 +0000 UTC m=+37.125922991 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert") pod "ingress-canary-v2f4d" (UID: "935e77e2-8cb8-4a46-ac22-24ad0a5b649a") : secret "canary-serving-cert" not found Apr 16 18:18:21.346597 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:21.346488 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:21.346597 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:21.346562 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls podName:67ddeef6-939c-4d8e-83ee-0673f748cf12 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:23.346543215 +0000 UTC m=+37.126007536 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls") pod "dns-default-brklz" (UID: "67ddeef6-939c-4d8e-83ee-0673f748cf12") : secret "dns-default-metrics-tls" not found Apr 16 18:18:23.363093 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:23.363035 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert\") pod \"ingress-canary-v2f4d\" (UID: \"935e77e2-8cb8-4a46-ac22-24ad0a5b649a\") " pod="openshift-ingress-canary/ingress-canary-v2f4d" Apr 16 18:18:23.363757 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:23.363118 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:18:23.363757 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:23.363171 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:23.363757 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:23.363254 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert podName:935e77e2-8cb8-4a46-ac22-24ad0a5b649a nodeName:}" failed. No retries permitted until 2026-04-16 18:18:27.363233399 +0000 UTC m=+41.142697709 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert") pod "ingress-canary-v2f4d" (UID: "935e77e2-8cb8-4a46-ac22-24ad0a5b649a") : secret "canary-serving-cert" not found Apr 16 18:18:23.363757 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:23.363279 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:23.363757 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:23.363334 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls podName:67ddeef6-939c-4d8e-83ee-0673f748cf12 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:27.363320664 +0000 UTC m=+41.142784973 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls") pod "dns-default-brklz" (UID: "67ddeef6-939c-4d8e-83ee-0673f748cf12") : secret "dns-default-metrics-tls" not found
Apr 16 18:18:25.072177 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:25.072145 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m54zx" event={"ID":"ce22102c-2dd2-4a4f-8317-5733e81186d1","Type":"ContainerStarted","Data":"5a9156cb19df4ccdbb6e2ce0a438c5e0e9d2bd5e1417a0f7df23b0f8d6801b97"}
Apr 16 18:18:25.072555 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:25.072240 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-m54zx"
Apr 16 18:18:25.074450 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:25.074428 2570 generic.go:358] "Generic (PLEG): container finished" podID="a3821f1e-3cf4-4526-9175-97c1251899f2" containerID="88e107f9a721b18931ab1a35df800f90120be6316d38c750232b9b30d638b85f" exitCode=0
Apr 16 18:18:25.074571 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:25.074461 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzpw4" event={"ID":"a3821f1e-3cf4-4526-9175-97c1251899f2","Type":"ContainerDied","Data":"88e107f9a721b18931ab1a35df800f90120be6316d38c750232b9b30d638b85f"}
Apr 16 18:18:25.091467 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:25.091419 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-m54zx" podStartSLOduration=35.212121025 podStartE2EDuration="39.091401096s" podCreationTimestamp="2026-04-16 18:17:46 +0000 UTC" firstStartedPulling="2026-04-16 18:18:20.926498883 +0000 UTC m=+34.705963192" lastFinishedPulling="2026-04-16 18:18:24.805778951 +0000 UTC m=+38.585243263" observedRunningTime="2026-04-16 18:18:25.089718988 +0000 UTC m=+38.869183332" watchObservedRunningTime="2026-04-16 18:18:25.091401096 +0000 UTC m=+38.870865429"
Apr 16 18:18:26.081384 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:26.081346 2570 generic.go:358] "Generic (PLEG): container finished" podID="a3821f1e-3cf4-4526-9175-97c1251899f2" containerID="adc3a677dd5263c3660c1bc00a6ab00c0013d0d5ba9c21b88b93243cd6a4616c" exitCode=0
Apr 16 18:18:26.081867 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:26.081431 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzpw4" event={"ID":"a3821f1e-3cf4-4526-9175-97c1251899f2","Type":"ContainerDied","Data":"adc3a677dd5263c3660c1bc00a6ab00c0013d0d5ba9c21b88b93243cd6a4616c"}
Apr 16 18:18:27.085499 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:27.085465 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzpw4" event={"ID":"a3821f1e-3cf4-4526-9175-97c1251899f2","Type":"ContainerStarted","Data":"19c1f861c19541298ff60c03731ad1e9a073eb50f2c3238f7210643b4cdc1422"}
Apr 16 18:18:27.111804 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:27.111754 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qzpw4" podStartSLOduration=5.654112341 podStartE2EDuration="41.111739707s" podCreationTimestamp="2026-04-16 18:17:46 +0000 UTC" firstStartedPulling="2026-04-16 18:17:49.360467941 +0000 UTC m=+3.139932259" lastFinishedPulling="2026-04-16 18:18:24.818095303 +0000 UTC m=+38.597559625" observedRunningTime="2026-04-16 18:18:27.111532075 +0000 UTC m=+40.890996428" watchObservedRunningTime="2026-04-16 18:18:27.111739707 +0000 UTC m=+40.891204254"
Apr 16 18:18:27.393096 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:27.392980 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert\") pod \"ingress-canary-v2f4d\" (UID: \"935e77e2-8cb8-4a46-ac22-24ad0a5b649a\") " pod="openshift-ingress-canary/ingress-canary-v2f4d"
Apr 16 18:18:27.393096 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:27.393032 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz"
Apr 16 18:18:27.393290 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:27.393146 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:18:27.393290 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:27.393147 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:18:27.393290 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:27.393199 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls podName:67ddeef6-939c-4d8e-83ee-0673f748cf12 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:35.39318501 +0000 UTC m=+49.172649318 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls") pod "dns-default-brklz" (UID: "67ddeef6-939c-4d8e-83ee-0673f748cf12") : secret "dns-default-metrics-tls" not found
Apr 16 18:18:27.393290 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:27.393212 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert podName:935e77e2-8cb8-4a46-ac22-24ad0a5b649a nodeName:}" failed. No retries permitted until 2026-04-16 18:18:35.39320676 +0000 UTC m=+49.172671069 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert") pod "ingress-canary-v2f4d" (UID: "935e77e2-8cb8-4a46-ac22-24ad0a5b649a") : secret "canary-serving-cert" not found
Apr 16 18:18:35.444741 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:35.444700 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert\") pod \"ingress-canary-v2f4d\" (UID: \"935e77e2-8cb8-4a46-ac22-24ad0a5b649a\") " pod="openshift-ingress-canary/ingress-canary-v2f4d"
Apr 16 18:18:35.444741 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:35.444744 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz"
Apr 16 18:18:35.445231 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:35.444857 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:18:35.445231 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:35.444860 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:18:35.445231 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:35.444908 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls podName:67ddeef6-939c-4d8e-83ee-0673f748cf12 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:51.444895641 +0000 UTC m=+65.224359950 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls") pod "dns-default-brklz" (UID: "67ddeef6-939c-4d8e-83ee-0673f748cf12") : secret "dns-default-metrics-tls" not found
Apr 16 18:18:35.445231 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:35.444921 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert podName:935e77e2-8cb8-4a46-ac22-24ad0a5b649a nodeName:}" failed. No retries permitted until 2026-04-16 18:18:51.444915564 +0000 UTC m=+65.224379874 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert") pod "ingress-canary-v2f4d" (UID: "935e77e2-8cb8-4a46-ac22-24ad0a5b649a") : secret "canary-serving-cert" not found
Apr 16 18:18:44.058004 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:44.057972 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zps8z"
Apr 16 18:18:51.452001 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:51.451963 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert\") pod \"ingress-canary-v2f4d\" (UID: \"935e77e2-8cb8-4a46-ac22-24ad0a5b649a\") " pod="openshift-ingress-canary/ingress-canary-v2f4d"
Apr 16 18:18:51.452001 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:51.452005 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz"
Apr 16 18:18:51.452504 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:51.452146 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:18:51.452504 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:51.452206 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls podName:67ddeef6-939c-4d8e-83ee-0673f748cf12 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:23.452191229 +0000 UTC m=+97.231655538 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls") pod "dns-default-brklz" (UID: "67ddeef6-939c-4d8e-83ee-0673f748cf12") : secret "dns-default-metrics-tls" not found
Apr 16 18:18:51.452504 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:51.452148 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:18:51.452504 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:51.452287 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert podName:935e77e2-8cb8-4a46-ac22-24ad0a5b649a nodeName:}" failed. No retries permitted until 2026-04-16 18:19:23.452274473 +0000 UTC m=+97.231738783 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert") pod "ingress-canary-v2f4d" (UID: "935e77e2-8cb8-4a46-ac22-24ad0a5b649a") : secret "canary-serving-cert" not found
Apr 16 18:18:52.458707 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:52.458664 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs\") pod \"network-metrics-daemon-dvxrp\" (UID: \"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e\") " pod="openshift-multus/network-metrics-daemon-dvxrp"
Apr 16 18:18:52.459092 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:52.458805 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:18:52.459092 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:18:52.458878 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs podName:edeb92c2-9fa4-40ae-bb1a-a24372d25c5e nodeName:}" failed. No retries permitted until 2026-04-16 18:19:56.45886137 +0000 UTC m=+130.238325679 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs") pod "network-metrics-daemon-dvxrp" (UID: "edeb92c2-9fa4-40ae-bb1a-a24372d25c5e") : secret "metrics-daemon-secret" not found
Apr 16 18:18:56.084149 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:18:56.084025 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-m54zx"
Apr 16 18:19:23.469003 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:23.468960 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert\") pod \"ingress-canary-v2f4d\" (UID: \"935e77e2-8cb8-4a46-ac22-24ad0a5b649a\") " pod="openshift-ingress-canary/ingress-canary-v2f4d"
Apr 16 18:19:23.469003 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:23.469009 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz"
Apr 16 18:19:23.469693 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:23.469164 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:19:23.469693 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:23.469251 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls podName:67ddeef6-939c-4d8e-83ee-0673f748cf12 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:27.469229377 +0000 UTC m=+161.248693686 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls") pod "dns-default-brklz" (UID: "67ddeef6-939c-4d8e-83ee-0673f748cf12") : secret "dns-default-metrics-tls" not found
Apr 16 18:19:23.469693 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:23.469164 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:19:23.469693 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:23.469334 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert podName:935e77e2-8cb8-4a46-ac22-24ad0a5b649a nodeName:}" failed. No retries permitted until 2026-04-16 18:20:27.469318354 +0000 UTC m=+161.248782663 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert") pod "ingress-canary-v2f4d" (UID: "935e77e2-8cb8-4a46-ac22-24ad0a5b649a") : secret "canary-serving-cert" not found
Apr 16 18:19:56.498527 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.498470 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs\") pod \"network-metrics-daemon-dvxrp\" (UID: \"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e\") " pod="openshift-multus/network-metrics-daemon-dvxrp"
Apr 16 18:19:56.499088 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:56.498632 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:19:56.499088 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:56.498701 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs podName:edeb92c2-9fa4-40ae-bb1a-a24372d25c5e nodeName:}" failed. No retries permitted until 2026-04-16 18:21:58.498685528 +0000 UTC m=+252.278149837 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs") pod "network-metrics-daemon-dvxrp" (UID: "edeb92c2-9fa4-40ae-bb1a-a24372d25c5e") : secret "metrics-daemon-secret" not found
Apr 16 18:19:56.844473 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.844442 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl"]
Apr 16 18:19:56.846983 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.846966 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl"
Apr 16 18:19:56.849410 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.849387 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bhh7j\""
Apr 16 18:19:56.849512 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.849425 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 18:19:56.850563 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.850534 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 18:19:56.851730 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.851713 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 16 18:19:56.851986 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.851943 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 18:19:56.858340 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.858320 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl"]
Apr 16 18:19:56.901481 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.901447 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ca4f330e-8728-4c07-ab6d-127e7f77538c-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-s6xhl\" (UID: \"ca4f330e-8728-4c07-ab6d-127e7f77538c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl"
Apr 16 18:19:56.901661 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.901499 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-s6xhl\" (UID: \"ca4f330e-8728-4c07-ab6d-127e7f77538c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl"
Apr 16 18:19:56.901661 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.901557 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtkj6\" (UniqueName: \"kubernetes.io/projected/ca4f330e-8728-4c07-ab6d-127e7f77538c-kube-api-access-gtkj6\") pod \"cluster-monitoring-operator-6667474d89-s6xhl\" (UID: \"ca4f330e-8728-4c07-ab6d-127e7f77538c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl"
Apr 16 18:19:56.944282 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.944248 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-54596cf866-vm76c"]
Apr 16 18:19:56.946916 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.946894 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:56.949750 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.949732 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 18:19:56.949841 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.949787 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 18:19:56.950161 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.950144 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 18:19:56.950231 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.950217 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-6n6rl\""
Apr 16 18:19:56.950634 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.950617 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 18:19:56.950922 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.950902 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 18:19:56.951111 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.951000 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 18:19:56.966438 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:56.966417 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-54596cf866-vm76c"]
Apr 16 18:19:57.002696 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.002668 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:57.002823 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.002702 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-s6xhl\" (UID: \"ca4f330e-8728-4c07-ab6d-127e7f77538c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl"
Apr 16 18:19:57.002823 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.002727 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-default-certificate\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:57.002823 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.002769 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:57.002823 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.002803 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtkj6\" (UniqueName: \"kubernetes.io/projected/ca4f330e-8728-4c07-ab6d-127e7f77538c-kube-api-access-gtkj6\") pod \"cluster-monitoring-operator-6667474d89-s6xhl\" (UID: \"ca4f330e-8728-4c07-ab6d-127e7f77538c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl"
Apr 16 18:19:57.002823 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:57.002806 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:19:57.002981 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.002867 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v24qh\" (UniqueName: \"kubernetes.io/projected/f9efc8b2-a298-4a79-a57a-811175327ee2-kube-api-access-v24qh\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:57.002981 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:57.002881 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls podName:ca4f330e-8728-4c07-ab6d-127e7f77538c nodeName:}" failed. No retries permitted until 2026-04-16 18:19:57.502866115 +0000 UTC m=+131.282330424 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-s6xhl" (UID: "ca4f330e-8728-4c07-ab6d-127e7f77538c") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:19:57.002981 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.002930 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-stats-auth\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:57.002981 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.002953 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ca4f330e-8728-4c07-ab6d-127e7f77538c-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-s6xhl\" (UID: \"ca4f330e-8728-4c07-ab6d-127e7f77538c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl"
Apr 16 18:19:57.003546 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.003528 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ca4f330e-8728-4c07-ab6d-127e7f77538c-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-s6xhl\" (UID: \"ca4f330e-8728-4c07-ab6d-127e7f77538c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl"
Apr 16 18:19:57.014983 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.014959 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtkj6\" (UniqueName: \"kubernetes.io/projected/ca4f330e-8728-4c07-ab6d-127e7f77538c-kube-api-access-gtkj6\") pod \"cluster-monitoring-operator-6667474d89-s6xhl\" (UID: \"ca4f330e-8728-4c07-ab6d-127e7f77538c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl"
Apr 16 18:19:57.048358 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.048334 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5"]
Apr 16 18:19:57.050959 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.050944 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td"]
Apr 16 18:19:57.051120 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.051103 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5"
Apr 16 18:19:57.053396 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.053379 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td"
Apr 16 18:19:57.053840 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.053822 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 18:19:57.054031 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.054015 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 18:19:57.054122 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.054070 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-ngf8r\""
Apr 16 18:19:57.054183 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.054142 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:19:57.054657 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.054639 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 18:19:57.055837 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.055820 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 16 18:19:57.055837 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.055827 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 18:19:57.056050 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.056033 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 18:19:57.056141 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.056068 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:19:57.056141 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.056069 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-ggh5g\""
Apr 16 18:19:57.062428 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.062410 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td"]
Apr 16 18:19:57.071815 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.071797 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5"]
Apr 16 18:19:57.104124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.104047 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39dcd2dc-e628-49b2-bd5e-aef8fe6aa083-serving-cert\") pod \"service-ca-operator-69965bb79d-nq9t5\" (UID: \"39dcd2dc-e628-49b2-bd5e-aef8fe6aa083\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5"
Apr 16 18:19:57.104124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.104087 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39dcd2dc-e628-49b2-bd5e-aef8fe6aa083-config\") pod \"service-ca-operator-69965bb79d-nq9t5\" (UID: \"39dcd2dc-e628-49b2-bd5e-aef8fe6aa083\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5"
Apr 16 18:19:57.104124 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.104108 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fa9d23f-acec-46e2-b6bc-3203fdd2764d-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wk6td\" (UID: \"5fa9d23f-acec-46e2-b6bc-3203fdd2764d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td"
Apr 16 18:19:57.104296 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.104178 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v24qh\" (UniqueName: \"kubernetes.io/projected/f9efc8b2-a298-4a79-a57a-811175327ee2-kube-api-access-v24qh\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:57.104296 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.104204 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-stats-auth\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:57.104296 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.104223 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fa9d23f-acec-46e2-b6bc-3203fdd2764d-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wk6td\" (UID: \"5fa9d23f-acec-46e2-b6bc-3203fdd2764d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td"
Apr 16 18:19:57.104296 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.104258 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:57.104417 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:57.104362 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle podName:f9efc8b2-a298-4a79-a57a-811175327ee2 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:57.604349221 +0000 UTC m=+131.383813530 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle") pod "router-default-54596cf866-vm76c" (UID: "f9efc8b2-a298-4a79-a57a-811175327ee2") : configmap references non-existent config key: service-ca.crt
Apr 16 18:19:57.104474 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.104429 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j42dt\" (UniqueName: \"kubernetes.io/projected/5fa9d23f-acec-46e2-b6bc-3203fdd2764d-kube-api-access-j42dt\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wk6td\" (UID: \"5fa9d23f-acec-46e2-b6bc-3203fdd2764d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td"
Apr 16 18:19:57.104474 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.104461 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5wvc\" (UniqueName: \"kubernetes.io/projected/39dcd2dc-e628-49b2-bd5e-aef8fe6aa083-kube-api-access-p5wvc\") pod \"service-ca-operator-69965bb79d-nq9t5\" (UID: \"39dcd2dc-e628-49b2-bd5e-aef8fe6aa083\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5"
Apr 16 18:19:57.104565 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.104497 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-default-certificate\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:57.104565 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.104524 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:57.104708 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:57.104688 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:19:57.104763 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:57.104753 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs podName:f9efc8b2-a298-4a79-a57a-811175327ee2 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:57.604740496 +0000 UTC m=+131.384204810 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs") pod "router-default-54596cf866-vm76c" (UID: "f9efc8b2-a298-4a79-a57a-811175327ee2") : secret "router-metrics-certs-default" not found
Apr 16 18:19:57.106678 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.106656 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-default-certificate\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:57.106759 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.106686 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-stats-auth\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:57.113194 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.113176 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v24qh\" (UniqueName: \"kubernetes.io/projected/f9efc8b2-a298-4a79-a57a-811175327ee2-kube-api-access-v24qh\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:57.205068 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.205026 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j42dt\" (UniqueName: \"kubernetes.io/projected/5fa9d23f-acec-46e2-b6bc-3203fdd2764d-kube-api-access-j42dt\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wk6td\" (UID: \"5fa9d23f-acec-46e2-b6bc-3203fdd2764d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td"
Apr 16 18:19:57.205232 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.205084 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5wvc\" (UniqueName: \"kubernetes.io/projected/39dcd2dc-e628-49b2-bd5e-aef8fe6aa083-kube-api-access-p5wvc\") pod \"service-ca-operator-69965bb79d-nq9t5\" (UID: \"39dcd2dc-e628-49b2-bd5e-aef8fe6aa083\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5"
Apr 16 18:19:57.205232 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.205121 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39dcd2dc-e628-49b2-bd5e-aef8fe6aa083-serving-cert\") pod \"service-ca-operator-69965bb79d-nq9t5\" (UID: \"39dcd2dc-e628-49b2-bd5e-aef8fe6aa083\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5"
Apr 16 18:19:57.205232 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.205137 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39dcd2dc-e628-49b2-bd5e-aef8fe6aa083-config\") pod \"service-ca-operator-69965bb79d-nq9t5\" (UID: \"39dcd2dc-e628-49b2-bd5e-aef8fe6aa083\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5"
Apr 16 18:19:57.205232 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.205162 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fa9d23f-acec-46e2-b6bc-3203fdd2764d-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wk6td\" (UID: \"5fa9d23f-acec-46e2-b6bc-3203fdd2764d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td"
Apr 16 18:19:57.205424 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.205351 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fa9d23f-acec-46e2-b6bc-3203fdd2764d-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wk6td\" (UID: \"5fa9d23f-acec-46e2-b6bc-3203fdd2764d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td"
Apr 16 18:19:57.205753 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.205731 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39dcd2dc-e628-49b2-bd5e-aef8fe6aa083-config\") pod \"service-ca-operator-69965bb79d-nq9t5\" (UID: \"39dcd2dc-e628-49b2-bd5e-aef8fe6aa083\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5"
Apr 16 18:19:57.205753 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.205745 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fa9d23f-acec-46e2-b6bc-3203fdd2764d-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wk6td\" (UID: \"5fa9d23f-acec-46e2-b6bc-3203fdd2764d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td"
Apr 16 18:19:57.207316 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.207299 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39dcd2dc-e628-49b2-bd5e-aef8fe6aa083-serving-cert\") pod \"service-ca-operator-69965bb79d-nq9t5\" (UID: \"39dcd2dc-e628-49b2-bd5e-aef8fe6aa083\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5"
Apr 16 18:19:57.207427 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.207410 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fa9d23f-acec-46e2-b6bc-3203fdd2764d-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wk6td\" (UID: \"5fa9d23f-acec-46e2-b6bc-3203fdd2764d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td"
Apr 16 18:19:57.214270 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.214248 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5wvc\" (UniqueName: \"kubernetes.io/projected/39dcd2dc-e628-49b2-bd5e-aef8fe6aa083-kube-api-access-p5wvc\") pod \"service-ca-operator-69965bb79d-nq9t5\" (UID: \"39dcd2dc-e628-49b2-bd5e-aef8fe6aa083\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5"
Apr 16 18:19:57.214733 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.214713 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j42dt\" (UniqueName: \"kubernetes.io/projected/5fa9d23f-acec-46e2-b6bc-3203fdd2764d-kube-api-access-j42dt\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wk6td\" (UID: \"5fa9d23f-acec-46e2-b6bc-3203fdd2764d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td"
Apr 16 18:19:57.361396 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.361315 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5"
Apr 16 18:19:57.366041 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.366014 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td"
Apr 16 18:19:57.484807 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.484778 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5"]
Apr 16 18:19:57.489766 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:19:57.489744 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39dcd2dc_e628_49b2_bd5e_aef8fe6aa083.slice/crio-670fe8b6f140b4a936ec572106d059bb14d1c1eb13029d82a7ae0ea604c07a42 WatchSource:0}: Error finding container 670fe8b6f140b4a936ec572106d059bb14d1c1eb13029d82a7ae0ea604c07a42: Status 404 returned error can't find the container with id 670fe8b6f140b4a936ec572106d059bb14d1c1eb13029d82a7ae0ea604c07a42
Apr 16 18:19:57.500868 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.500842 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td"]
Apr 16 18:19:57.503994 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:19:57.503971 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fa9d23f_acec_46e2_b6bc_3203fdd2764d.slice/crio-9b20dc22c8fb277f1b114f2902f9d5238c7ce106ae76ddca3b650e0fedfa7357 WatchSource:0}: Error finding container 9b20dc22c8fb277f1b114f2902f9d5238c7ce106ae76ddca3b650e0fedfa7357: Status 404 returned error can't find the container with id 9b20dc22c8fb277f1b114f2902f9d5238c7ce106ae76ddca3b650e0fedfa7357
Apr 16 18:19:57.507737 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.507716 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-s6xhl\" (UID: \"ca4f330e-8728-4c07-ab6d-127e7f77538c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl"
Apr 16 18:19:57.507885 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:57.507869 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:19:57.507952 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:57.507946 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls podName:ca4f330e-8728-4c07-ab6d-127e7f77538c nodeName:}" failed. No retries permitted until 2026-04-16 18:19:58.507927023 +0000 UTC m=+132.287391347 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-s6xhl" (UID: "ca4f330e-8728-4c07-ab6d-127e7f77538c") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:19:57.608312 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.608278 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:57.608312 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:57.608325 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:57.608522 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:57.608422 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:19:57.608522 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:57.608460 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle podName:f9efc8b2-a298-4a79-a57a-811175327ee2 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:58.608441982 +0000 UTC m=+132.387906291 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle") pod "router-default-54596cf866-vm76c" (UID: "f9efc8b2-a298-4a79-a57a-811175327ee2") : configmap references non-existent config key: service-ca.crt
Apr 16 18:19:57.608522 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:57.608496 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs podName:f9efc8b2-a298-4a79-a57a-811175327ee2 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:58.608479981 +0000 UTC m=+132.387944290 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs") pod "router-default-54596cf866-vm76c" (UID: "f9efc8b2-a298-4a79-a57a-811175327ee2") : secret "router-metrics-certs-default" not found
Apr 16 18:19:58.259894 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:58.259855 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td" event={"ID":"5fa9d23f-acec-46e2-b6bc-3203fdd2764d","Type":"ContainerStarted","Data":"9b20dc22c8fb277f1b114f2902f9d5238c7ce106ae76ddca3b650e0fedfa7357"}
Apr 16 18:19:58.260987 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:58.260956 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5" event={"ID":"39dcd2dc-e628-49b2-bd5e-aef8fe6aa083","Type":"ContainerStarted","Data":"670fe8b6f140b4a936ec572106d059bb14d1c1eb13029d82a7ae0ea604c07a42"}
Apr 16 18:19:58.515809 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:58.515728 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-s6xhl\" (UID: \"ca4f330e-8728-4c07-ab6d-127e7f77538c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl"
Apr 16 18:19:58.516283 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:58.515851 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:19:58.516283 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:58.515924 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls podName:ca4f330e-8728-4c07-ab6d-127e7f77538c nodeName:}" failed. No retries permitted until 2026-04-16 18:20:00.515906511 +0000 UTC m=+134.295370841 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-s6xhl" (UID: "ca4f330e-8728-4c07-ab6d-127e7f77538c") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:19:58.617068 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:58.617016 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:58.617255 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:19:58.617232 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:19:58.617320 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:58.617311 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle podName:f9efc8b2-a298-4a79-a57a-811175327ee2 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:00.617264662 +0000 UTC m=+134.396728984 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle") pod "router-default-54596cf866-vm76c" (UID: "f9efc8b2-a298-4a79-a57a-811175327ee2") : configmap references non-existent config key: service-ca.crt
Apr 16 18:19:58.617394 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:58.617351 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:19:58.617394 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:19:58.617393 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs podName:f9efc8b2-a298-4a79-a57a-811175327ee2 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:00.617379839 +0000 UTC m=+134.396844161 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs") pod "router-default-54596cf866-vm76c" (UID: "f9efc8b2-a298-4a79-a57a-811175327ee2") : secret "router-metrics-certs-default" not found
Apr 16 18:20:00.266336 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:00.266240 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td" event={"ID":"5fa9d23f-acec-46e2-b6bc-3203fdd2764d","Type":"ContainerStarted","Data":"f5c15869671c33eaa5ca68d0a51f672b5322b43713ca964db3a10d9b76004ed4"}
Apr 16 18:20:00.267613 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:00.267591 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5" event={"ID":"39dcd2dc-e628-49b2-bd5e-aef8fe6aa083","Type":"ContainerStarted","Data":"5aed8d8b9cbe72b34395a90a94041455cc450413cb644a7969308b0fd0116468"}
Apr 16 18:20:00.288725 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:00.288685 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td" podStartSLOduration=0.946490848 podStartE2EDuration="3.288673302s" podCreationTimestamp="2026-04-16 18:19:57 +0000 UTC" firstStartedPulling="2026-04-16 18:19:57.506159574 +0000 UTC m=+131.285623883" lastFinishedPulling="2026-04-16 18:19:59.848342021 +0000 UTC m=+133.627806337" observedRunningTime="2026-04-16 18:20:00.287998777 +0000 UTC m=+134.067463107" watchObservedRunningTime="2026-04-16 18:20:00.288673302 +0000 UTC m=+134.068137719"
Apr 16 18:20:00.305659 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:00.305617 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5" podStartSLOduration=0.949713353 podStartE2EDuration="3.305607321s" podCreationTimestamp="2026-04-16 18:19:57 +0000 UTC" firstStartedPulling="2026-04-16 18:19:57.491313599 +0000 UTC m=+131.270777908" lastFinishedPulling="2026-04-16 18:19:59.847207563 +0000 UTC m=+133.626671876" observedRunningTime="2026-04-16 18:20:00.305047513 +0000 UTC m=+134.084511844" watchObservedRunningTime="2026-04-16 18:20:00.305607321 +0000 UTC m=+134.085071652"
Apr 16 18:20:00.534478 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:00.534432 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-s6xhl\" (UID: \"ca4f330e-8728-4c07-ab6d-127e7f77538c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl"
Apr 16 18:20:00.534646 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:00.534587 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:20:00.534687 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:00.534655 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls podName:ca4f330e-8728-4c07-ab6d-127e7f77538c nodeName:}" failed. No retries permitted until 2026-04-16 18:20:04.534638477 +0000 UTC m=+138.314102786 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-s6xhl" (UID: "ca4f330e-8728-4c07-ab6d-127e7f77538c") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:20:00.635669 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:00.635621 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:20:00.635842 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:00.635690 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c"
Apr 16 18:20:00.635842 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:00.635781 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:20:00.635842 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:00.635817 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle podName:f9efc8b2-a298-4a79-a57a-811175327ee2 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:04.635794961 +0000 UTC m=+138.415259272 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle") pod "router-default-54596cf866-vm76c" (UID: "f9efc8b2-a298-4a79-a57a-811175327ee2") : configmap references non-existent config key: service-ca.crt
Apr 16 18:20:00.635967 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:00.635850 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs podName:f9efc8b2-a298-4a79-a57a-811175327ee2 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:04.635834001 +0000 UTC m=+138.415298312 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs") pod "router-default-54596cf866-vm76c" (UID: "f9efc8b2-a298-4a79-a57a-811175327ee2") : secret "router-metrics-certs-default" not found
Apr 16 18:20:01.706701 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:01.706667 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-5nbn9"]
Apr 16 18:20:01.709883 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:01.709864 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5nbn9"
Apr 16 18:20:01.712411 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:01.712384 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 18:20:01.712411 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:01.712384 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 18:20:01.713378 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:01.713355 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-l8w6c\""
Apr 16 18:20:01.721339 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:01.721317 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-5nbn9"]
Apr 16 18:20:01.844274 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:01.844238 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdqhq\" (UniqueName: \"kubernetes.io/projected/b117bdfd-44c4-486e-ae7f-b512781456f8-kube-api-access-wdqhq\") pod \"migrator-64d4d94569-5nbn9\" (UID: \"b117bdfd-44c4-486e-ae7f-b512781456f8\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5nbn9"
Apr 16 18:20:01.944860 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:01.944822 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdqhq\" (UniqueName: \"kubernetes.io/projected/b117bdfd-44c4-486e-ae7f-b512781456f8-kube-api-access-wdqhq\") pod \"migrator-64d4d94569-5nbn9\" (UID: \"b117bdfd-44c4-486e-ae7f-b512781456f8\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5nbn9"
Apr 16 18:20:01.956585 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:01.956557 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdqhq\" (UniqueName: \"kubernetes.io/projected/b117bdfd-44c4-486e-ae7f-b512781456f8-kube-api-access-wdqhq\") pod \"migrator-64d4d94569-5nbn9\" (UID: \"b117bdfd-44c4-486e-ae7f-b512781456f8\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5nbn9"
Apr 16 18:20:02.019588 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:02.019560 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5nbn9"
Apr 16 18:20:02.136003 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:02.135973 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-5nbn9"]
Apr 16 18:20:02.138972 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:20:02.138931 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb117bdfd_44c4_486e_ae7f_b512781456f8.slice/crio-8fd6cc45e76c7cdc37a03196e16c709cd31dc5cbaf0df0bcb0c4d1b264c33075 WatchSource:0}: Error finding container 8fd6cc45e76c7cdc37a03196e16c709cd31dc5cbaf0df0bcb0c4d1b264c33075: Status 404 returned error can't find the container with id 8fd6cc45e76c7cdc37a03196e16c709cd31dc5cbaf0df0bcb0c4d1b264c33075
Apr 16 18:20:02.272128 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:02.272025 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5nbn9" event={"ID":"b117bdfd-44c4-486e-ae7f-b512781456f8","Type":"ContainerStarted","Data":"8fd6cc45e76c7cdc37a03196e16c709cd31dc5cbaf0df0bcb0c4d1b264c33075"}
Apr 16 18:20:03.195723 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.195697 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2h4fb_4735317d-b557-4ca9-84cd-02f72096e33a/dns-node-resolver/0.log"
Apr 16 18:20:03.369406 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.369371 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-qvlv9"]
Apr 16 18:20:03.372704 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.372681 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-qvlv9"
Apr 16 18:20:03.375458 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.375430 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 18:20:03.375577 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.375432 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-6hh2z\""
Apr 16 18:20:03.375577 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.375473 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 18:20:03.376500 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.376483 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 18:20:03.376603 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.376580 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 18:20:03.381654 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.381629 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-qvlv9"]
Apr 16 18:20:03.455931 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.455906 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1ffcd2a8-9465-438b-8f80-f8e575e15bfc-signing-cabundle\") pod \"service-ca-bfc587fb7-qvlv9\" (UID: \"1ffcd2a8-9465-438b-8f80-f8e575e15bfc\") " pod="openshift-service-ca/service-ca-bfc587fb7-qvlv9"
Apr 16 18:20:03.456045 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.455948 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1ffcd2a8-9465-438b-8f80-f8e575e15bfc-signing-key\") pod \"service-ca-bfc587fb7-qvlv9\" (UID: \"1ffcd2a8-9465-438b-8f80-f8e575e15bfc\") " pod="openshift-service-ca/service-ca-bfc587fb7-qvlv9"
Apr 16 18:20:03.456045 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.455969 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dbff\" (UniqueName: \"kubernetes.io/projected/1ffcd2a8-9465-438b-8f80-f8e575e15bfc-kube-api-access-8dbff\") pod \"service-ca-bfc587fb7-qvlv9\" (UID: \"1ffcd2a8-9465-438b-8f80-f8e575e15bfc\") " pod="openshift-service-ca/service-ca-bfc587fb7-qvlv9"
Apr 16 18:20:03.556825 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.556792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1ffcd2a8-9465-438b-8f80-f8e575e15bfc-signing-cabundle\") pod \"service-ca-bfc587fb7-qvlv9\" (UID: \"1ffcd2a8-9465-438b-8f80-f8e575e15bfc\") " pod="openshift-service-ca/service-ca-bfc587fb7-qvlv9"
Apr 16 18:20:03.556948 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.556848 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1ffcd2a8-9465-438b-8f80-f8e575e15bfc-signing-key\") pod \"service-ca-bfc587fb7-qvlv9\" (UID: \"1ffcd2a8-9465-438b-8f80-f8e575e15bfc\") " pod="openshift-service-ca/service-ca-bfc587fb7-qvlv9"
Apr 16 18:20:03.556948 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.556882 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dbff\" (UniqueName: \"kubernetes.io/projected/1ffcd2a8-9465-438b-8f80-f8e575e15bfc-kube-api-access-8dbff\") pod \"service-ca-bfc587fb7-qvlv9\" (UID: \"1ffcd2a8-9465-438b-8f80-f8e575e15bfc\") " pod="openshift-service-ca/service-ca-bfc587fb7-qvlv9" Apr 16 18:20:03.557554 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.557532 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1ffcd2a8-9465-438b-8f80-f8e575e15bfc-signing-cabundle\") pod \"service-ca-bfc587fb7-qvlv9\" (UID: \"1ffcd2a8-9465-438b-8f80-f8e575e15bfc\") " pod="openshift-service-ca/service-ca-bfc587fb7-qvlv9" Apr 16 18:20:03.559321 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.559299 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1ffcd2a8-9465-438b-8f80-f8e575e15bfc-signing-key\") pod \"service-ca-bfc587fb7-qvlv9\" (UID: \"1ffcd2a8-9465-438b-8f80-f8e575e15bfc\") " pod="openshift-service-ca/service-ca-bfc587fb7-qvlv9" Apr 16 18:20:03.566755 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.566721 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dbff\" (UniqueName: \"kubernetes.io/projected/1ffcd2a8-9465-438b-8f80-f8e575e15bfc-kube-api-access-8dbff\") pod \"service-ca-bfc587fb7-qvlv9\" (UID: \"1ffcd2a8-9465-438b-8f80-f8e575e15bfc\") " pod="openshift-service-ca/service-ca-bfc587fb7-qvlv9" Apr 16 18:20:03.687088 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.687036 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-qvlv9" Apr 16 18:20:03.801261 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:03.801231 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-qvlv9"] Apr 16 18:20:03.803855 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:20:03.803828 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ffcd2a8_9465_438b_8f80_f8e575e15bfc.slice/crio-24499bfe675cdddaeb32fead494a4c6db17301873dbbd59024b55b43eb90c763 WatchSource:0}: Error finding container 24499bfe675cdddaeb32fead494a4c6db17301873dbbd59024b55b43eb90c763: Status 404 returned error can't find the container with id 24499bfe675cdddaeb32fead494a4c6db17301873dbbd59024b55b43eb90c763 Apr 16 18:20:04.278924 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:04.278886 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5nbn9" event={"ID":"b117bdfd-44c4-486e-ae7f-b512781456f8","Type":"ContainerStarted","Data":"0151d92f39bca9a7f9f49b57ef82e84e1e7216e6115065cb30c578831389b422"} Apr 16 18:20:04.279394 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:04.278932 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5nbn9" event={"ID":"b117bdfd-44c4-486e-ae7f-b512781456f8","Type":"ContainerStarted","Data":"98a9851f586da3b4327712b0224d7c928c8ad9edabab185269df2a1cf7a6e5c9"} Apr 16 18:20:04.280217 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:04.280196 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-qvlv9" 
event={"ID":"1ffcd2a8-9465-438b-8f80-f8e575e15bfc","Type":"ContainerStarted","Data":"007917fd686935d2eef8f3a1af66de6c5eab59be39f6da141a2bf6cb7218c39c"} Apr 16 18:20:04.280322 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:04.280222 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-qvlv9" event={"ID":"1ffcd2a8-9465-438b-8f80-f8e575e15bfc","Type":"ContainerStarted","Data":"24499bfe675cdddaeb32fead494a4c6db17301873dbbd59024b55b43eb90c763"} Apr 16 18:20:04.302191 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:04.302120 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-5nbn9" podStartSLOduration=2.007098473 podStartE2EDuration="3.302105477s" podCreationTimestamp="2026-04-16 18:20:01 +0000 UTC" firstStartedPulling="2026-04-16 18:20:02.14075673 +0000 UTC m=+135.920221048" lastFinishedPulling="2026-04-16 18:20:03.43576374 +0000 UTC m=+137.215228052" observedRunningTime="2026-04-16 18:20:04.301567378 +0000 UTC m=+138.081031721" watchObservedRunningTime="2026-04-16 18:20:04.302105477 +0000 UTC m=+138.081569809" Apr 16 18:20:04.320569 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:04.320522 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-qvlv9" podStartSLOduration=1.320505806 podStartE2EDuration="1.320505806s" podCreationTimestamp="2026-04-16 18:20:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:04.319824337 +0000 UTC m=+138.099288681" watchObservedRunningTime="2026-04-16 18:20:04.320505806 +0000 UTC m=+138.099970521" Apr 16 18:20:04.564876 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:04.564776 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-s6xhl\" (UID: \"ca4f330e-8728-4c07-ab6d-127e7f77538c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl" Apr 16 18:20:04.565076 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:04.564931 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:20:04.565076 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:04.564996 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls podName:ca4f330e-8728-4c07-ab6d-127e7f77538c nodeName:}" failed. No retries permitted until 2026-04-16 18:20:12.564979381 +0000 UTC m=+146.344443703 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-s6xhl" (UID: "ca4f330e-8728-4c07-ab6d-127e7f77538c") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:20:04.597376 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:04.597353 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-s2j9l_a9622aca-ffc8-4b50-82e0-a1c82e6222df/node-ca/0.log" Apr 16 18:20:04.665233 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:04.665201 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c" Apr 16 18:20:04.665391 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:04.665244 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c" Apr 16 18:20:04.665391 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:04.665335 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:20:04.665391 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:04.665366 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle podName:f9efc8b2-a298-4a79-a57a-811175327ee2 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:12.665345808 +0000 UTC m=+146.444810142 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle") pod "router-default-54596cf866-vm76c" (UID: "f9efc8b2-a298-4a79-a57a-811175327ee2") : configmap references non-existent config key: service-ca.crt Apr 16 18:20:04.665544 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:04.665422 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs podName:f9efc8b2-a298-4a79-a57a-811175327ee2 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:12.665413071 +0000 UTC m=+146.444877382 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs") pod "router-default-54596cf866-vm76c" (UID: "f9efc8b2-a298-4a79-a57a-811175327ee2") : secret "router-metrics-certs-default" not found Apr 16 18:20:12.623630 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:12.623589 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-s6xhl\" (UID: \"ca4f330e-8728-4c07-ab6d-127e7f77538c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl" Apr 16 18:20:12.624096 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:12.623700 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:20:12.624096 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:12.623762 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls podName:ca4f330e-8728-4c07-ab6d-127e7f77538c nodeName:}" failed. No retries permitted until 2026-04-16 18:20:28.623747248 +0000 UTC m=+162.403211557 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-s6xhl" (UID: "ca4f330e-8728-4c07-ab6d-127e7f77538c") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:20:12.724335 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:12.724305 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c" Apr 16 18:20:12.724471 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:12.724354 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c" Apr 16 18:20:12.725032 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:12.725008 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9efc8b2-a298-4a79-a57a-811175327ee2-service-ca-bundle\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c" Apr 16 18:20:12.726802 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:12.726780 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9efc8b2-a298-4a79-a57a-811175327ee2-metrics-certs\") pod \"router-default-54596cf866-vm76c\" (UID: \"f9efc8b2-a298-4a79-a57a-811175327ee2\") " pod="openshift-ingress/router-default-54596cf866-vm76c" Apr 16 18:20:12.858980 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:12.858944 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-54596cf866-vm76c" Apr 16 18:20:12.977877 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:12.977846 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-54596cf866-vm76c"] Apr 16 18:20:12.981075 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:20:12.981023 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9efc8b2_a298_4a79_a57a_811175327ee2.slice/crio-9f088491c75723828df3f05243a2c0c47244b42b2bc96f72e8aa3497ce0b27a6 WatchSource:0}: Error finding container 9f088491c75723828df3f05243a2c0c47244b42b2bc96f72e8aa3497ce0b27a6: Status 404 returned error can't find the container with id 9f088491c75723828df3f05243a2c0c47244b42b2bc96f72e8aa3497ce0b27a6 Apr 16 18:20:13.306241 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:13.306205 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54596cf866-vm76c" event={"ID":"f9efc8b2-a298-4a79-a57a-811175327ee2","Type":"ContainerStarted","Data":"c9848b1621dff22ec177ecbe7f1c4082edd974f8b23fae886bb2c0a2d1a88cdf"} Apr 16 18:20:13.306405 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:13.306249 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54596cf866-vm76c" event={"ID":"f9efc8b2-a298-4a79-a57a-811175327ee2","Type":"ContainerStarted","Data":"9f088491c75723828df3f05243a2c0c47244b42b2bc96f72e8aa3497ce0b27a6"} Apr 16 18:20:13.328019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:13.327964 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-54596cf866-vm76c" podStartSLOduration=17.32794963 podStartE2EDuration="17.32794963s" podCreationTimestamp="2026-04-16 18:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:13.326830924 +0000 UTC m=+147.106295255" watchObservedRunningTime="2026-04-16 18:20:13.32794963 +0000 UTC m=+147.107414005" Apr 16 18:20:13.859309 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:13.859274 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54596cf866-vm76c" Apr 16 18:20:13.861663 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:13.861643 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-54596cf866-vm76c" Apr 16 18:20:14.308363 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:14.308331 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-54596cf866-vm76c" Apr 16 18:20:14.309614 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:14.309595 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-54596cf866-vm76c" Apr 16 18:20:22.630575 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:22.630505 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-brklz" podUID="67ddeef6-939c-4d8e-83ee-0673f748cf12" Apr 16 18:20:22.646697 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:22.646663 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-ingress-canary/ingress-canary-v2f4d" podUID="935e77e2-8cb8-4a46-ac22-24ad0a5b649a" Apr 16 18:20:22.859882 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:22.859839 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-dvxrp" podUID="edeb92c2-9fa4-40ae-bb1a-a24372d25c5e" Apr 16 18:20:23.331488 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.331459 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v2f4d" Apr 16 18:20:23.331659 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.331460 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-brklz" Apr 16 18:20:23.500351 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.500319 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-bzcpl"] Apr 16 18:20:23.504462 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.504437 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.514673 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.514651 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:20:23.514673 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.514667 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:20:23.514835 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.514676 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:20:23.519666 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.519649 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:20:23.524522 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.524503 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4fhn9\"" Apr 16 18:20:23.549414 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.549388 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bzcpl"] Apr 16 18:20:23.604226 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.604154 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0737d249-a705-41ad-b1ee-04446e7bdfce-data-volume\") pod \"insights-runtime-extractor-bzcpl\" (UID: \"0737d249-a705-41ad-b1ee-04446e7bdfce\") " pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.604226 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.604205 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0737d249-a705-41ad-b1ee-04446e7bdfce-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bzcpl\" (UID: \"0737d249-a705-41ad-b1ee-04446e7bdfce\") " pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.604401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.604298 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0737d249-a705-41ad-b1ee-04446e7bdfce-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bzcpl\" (UID: \"0737d249-a705-41ad-b1ee-04446e7bdfce\") " pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.604401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.604345 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0737d249-a705-41ad-b1ee-04446e7bdfce-crio-socket\") pod \"insights-runtime-extractor-bzcpl\" (UID: \"0737d249-a705-41ad-b1ee-04446e7bdfce\") " pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.604401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.604377 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8897\" (UniqueName: \"kubernetes.io/projected/0737d249-a705-41ad-b1ee-04446e7bdfce-kube-api-access-w8897\") pod \"insights-runtime-extractor-bzcpl\" (UID: \"0737d249-a705-41ad-b1ee-04446e7bdfce\") " pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.704990 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.704956 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0737d249-a705-41ad-b1ee-04446e7bdfce-data-volume\") pod \"insights-runtime-extractor-bzcpl\" (UID: \"0737d249-a705-41ad-b1ee-04446e7bdfce\") " pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.705401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.705001 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0737d249-a705-41ad-b1ee-04446e7bdfce-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bzcpl\" (UID: \"0737d249-a705-41ad-b1ee-04446e7bdfce\") " pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.705401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.705124 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0737d249-a705-41ad-b1ee-04446e7bdfce-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bzcpl\" (UID: \"0737d249-a705-41ad-b1ee-04446e7bdfce\") " pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.705401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.705178 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0737d249-a705-41ad-b1ee-04446e7bdfce-crio-socket\") pod \"insights-runtime-extractor-bzcpl\" (UID: \"0737d249-a705-41ad-b1ee-04446e7bdfce\") " pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.705401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.705227 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8897\" (UniqueName: \"kubernetes.io/projected/0737d249-a705-41ad-b1ee-04446e7bdfce-kube-api-access-w8897\") pod \"insights-runtime-extractor-bzcpl\" (UID: \"0737d249-a705-41ad-b1ee-04446e7bdfce\") " pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.705401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.705328 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0737d249-a705-41ad-b1ee-04446e7bdfce-crio-socket\") pod \"insights-runtime-extractor-bzcpl\" (UID: \"0737d249-a705-41ad-b1ee-04446e7bdfce\") " pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.705401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.705337 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0737d249-a705-41ad-b1ee-04446e7bdfce-data-volume\") pod \"insights-runtime-extractor-bzcpl\" (UID: \"0737d249-a705-41ad-b1ee-04446e7bdfce\") " pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.705606 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.705522 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0737d249-a705-41ad-b1ee-04446e7bdfce-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bzcpl\" (UID: \"0737d249-a705-41ad-b1ee-04446e7bdfce\") " pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.707400 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.707384 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0737d249-a705-41ad-b1ee-04446e7bdfce-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bzcpl\" (UID: \"0737d249-a705-41ad-b1ee-04446e7bdfce\") " pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.718157 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.718135 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8897\" (UniqueName: \"kubernetes.io/projected/0737d249-a705-41ad-b1ee-04446e7bdfce-kube-api-access-w8897\") pod \"insights-runtime-extractor-bzcpl\" (UID: \"0737d249-a705-41ad-b1ee-04446e7bdfce\") " pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.813667 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.813616 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bzcpl" Apr 16 18:20:23.936832 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:23.936802 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bzcpl"] Apr 16 18:20:23.939707 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:20:23.939679 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0737d249_a705_41ad_b1ee_04446e7bdfce.slice/crio-8323eb5947e0f3101c21f2b68b0314d8661b182e642bd279e9b2cc0a76ff7d76 WatchSource:0}: Error finding container 8323eb5947e0f3101c21f2b68b0314d8661b182e642bd279e9b2cc0a76ff7d76: Status 404 returned error can't find the container with id 8323eb5947e0f3101c21f2b68b0314d8661b182e642bd279e9b2cc0a76ff7d76 Apr 16 18:20:24.335718 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:24.335685 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bzcpl" event={"ID":"0737d249-a705-41ad-b1ee-04446e7bdfce","Type":"ContainerStarted","Data":"c776802bfa7384f967c7a4da711ed762608e30c004829c282288532abdde93be"} Apr 16 18:20:24.335718 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:24.335721 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bzcpl" event={"ID":"0737d249-a705-41ad-b1ee-04446e7bdfce","Type":"ContainerStarted","Data":"8323eb5947e0f3101c21f2b68b0314d8661b182e642bd279e9b2cc0a76ff7d76"} Apr 16 18:20:25.339616 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:25.339583 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bzcpl" event={"ID":"0737d249-a705-41ad-b1ee-04446e7bdfce","Type":"ContainerStarted","Data":"e12ebd3e40418a00c583271ff4f3506fb69a33ae1874da38afd80142de274995"} Apr 16 18:20:27.345684 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:27.345645 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bzcpl" event={"ID":"0737d249-a705-41ad-b1ee-04446e7bdfce","Type":"ContainerStarted","Data":"44051dc4382b70a0b7c02fa1c9d48b0e51e85281d4e704dc62c178666b1be765"} Apr 16 18:20:27.368090 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:27.368023 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-bzcpl" podStartSLOduration=1.841658982 podStartE2EDuration="4.368009921s" podCreationTimestamp="2026-04-16 18:20:23 +0000 UTC" firstStartedPulling="2026-04-16 18:20:23.995816065 +0000 UTC m=+157.775280382" lastFinishedPulling="2026-04-16 18:20:26.522166992 +0000 UTC m=+160.301631321" observedRunningTime="2026-04-16 18:20:27.36642646 +0000 UTC m=+161.145890791" watchObservedRunningTime="2026-04-16 18:20:27.368009921 +0000 UTC m=+161.147474277" Apr 16 18:20:27.535779 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:27.535749 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert\") pod \"ingress-canary-v2f4d\" (UID: \"935e77e2-8cb8-4a46-ac22-24ad0a5b649a\") " pod="openshift-ingress-canary/ingress-canary-v2f4d" Apr 16 18:20:27.535779 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:27.535785 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls\") pod \"dns-default-brklz\" (UID: 
\"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:20:27.537995 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:27.537971 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67ddeef6-939c-4d8e-83ee-0673f748cf12-metrics-tls\") pod \"dns-default-brklz\" (UID: \"67ddeef6-939c-4d8e-83ee-0673f748cf12\") " pod="openshift-dns/dns-default-brklz" Apr 16 18:20:27.538235 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:27.538215 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/935e77e2-8cb8-4a46-ac22-24ad0a5b649a-cert\") pod \"ingress-canary-v2f4d\" (UID: \"935e77e2-8cb8-4a46-ac22-24ad0a5b649a\") " pod="openshift-ingress-canary/ingress-canary-v2f4d" Apr 16 18:20:27.834515 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:27.834478 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8xm9h\"" Apr 16 18:20:27.835478 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:27.835460 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rxx2m\"" Apr 16 18:20:27.842656 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:27.842638 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v2f4d" Apr 16 18:20:27.842750 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:27.842666 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-brklz" Apr 16 18:20:27.968995 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:27.968963 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v2f4d"] Apr 16 18:20:27.972357 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:20:27.972330 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod935e77e2_8cb8_4a46_ac22_24ad0a5b649a.slice/crio-8fc8bfad2cae9a65434748bf9ec1e0fa429dda451a5c52c04bf7061d44892c53 WatchSource:0}: Error finding container 8fc8bfad2cae9a65434748bf9ec1e0fa429dda451a5c52c04bf7061d44892c53: Status 404 returned error can't find the container with id 8fc8bfad2cae9a65434748bf9ec1e0fa429dda451a5c52c04bf7061d44892c53 Apr 16 18:20:27.983950 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:27.983926 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-brklz"] Apr 16 18:20:27.986410 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:20:27.986384 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67ddeef6_939c_4d8e_83ee_0673f748cf12.slice/crio-60e5de39c19af02052eb1f2c308c3f8247ab651e2582c05ea843f29f0a873e55 WatchSource:0}: Error finding container 60e5de39c19af02052eb1f2c308c3f8247ab651e2582c05ea843f29f0a873e55: Status 404 returned error can't find the container with id 60e5de39c19af02052eb1f2c308c3f8247ab651e2582c05ea843f29f0a873e55 Apr 16 18:20:28.349542 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:28.349497 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-brklz" event={"ID":"67ddeef6-939c-4d8e-83ee-0673f748cf12","Type":"ContainerStarted","Data":"60e5de39c19af02052eb1f2c308c3f8247ab651e2582c05ea843f29f0a873e55"} Apr 16 18:20:28.350612 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:28.350580 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v2f4d" event={"ID":"935e77e2-8cb8-4a46-ac22-24ad0a5b649a","Type":"ContainerStarted","Data":"8fc8bfad2cae9a65434748bf9ec1e0fa429dda451a5c52c04bf7061d44892c53"} Apr 16 18:20:28.646486 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:28.646397 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-s6xhl\" (UID: \"ca4f330e-8728-4c07-ab6d-127e7f77538c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl" Apr 16 18:20:28.649172 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:28.649144 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca4f330e-8728-4c07-ab6d-127e7f77538c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-s6xhl\" (UID: \"ca4f330e-8728-4c07-ab6d-127e7f77538c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl" Apr 16 18:20:28.656036 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:28.656007 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl" Apr 16 18:20:28.808450 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:28.808415 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl"] Apr 16 18:20:29.354797 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:29.354760 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl" event={"ID":"ca4f330e-8728-4c07-ab6d-127e7f77538c","Type":"ContainerStarted","Data":"bf6187f89e83b3b75c9c54db648c5c9aeefed46b2591c9c8a49123f12a591c79"} Apr 16 18:20:30.360135 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:30.360090 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v2f4d" event={"ID":"935e77e2-8cb8-4a46-ac22-24ad0a5b649a","Type":"ContainerStarted","Data":"1eed3f8063aa710992af8e664424be7fbec0023c78b38553c8495f4c94f9112a"} Apr 16 18:20:30.361857 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:30.361825 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-brklz" event={"ID":"67ddeef6-939c-4d8e-83ee-0673f748cf12","Type":"ContainerStarted","Data":"671a3738c9f4a5b5aea757cfc3de9c559b7458eaee52e021fd2afbe468f1f14f"} Apr 16 18:20:30.361857 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:30.361859 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-brklz" event={"ID":"67ddeef6-939c-4d8e-83ee-0673f748cf12","Type":"ContainerStarted","Data":"21555b274ff59e026f96f6c40cc0aeecb3a98c42c873f5e4c0d64c5b22629485"} Apr 16 18:20:30.362090 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:30.361968 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-brklz" Apr 16 18:20:30.383618 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:30.383563 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-v2f4d" podStartSLOduration=129.454527502 podStartE2EDuration="2m11.383548309s" podCreationTimestamp="2026-04-16 18:18:19 +0000 UTC" 
firstStartedPulling="2026-04-16 18:20:27.974198365 +0000 UTC m=+161.753662678" lastFinishedPulling="2026-04-16 18:20:29.903219176 +0000 UTC m=+163.682683485" observedRunningTime="2026-04-16 18:20:30.38211399 +0000 UTC m=+164.161578320" watchObservedRunningTime="2026-04-16 18:20:30.383548309 +0000 UTC m=+164.163012634" Apr 16 18:20:30.405041 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:30.404976 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-brklz" podStartSLOduration=129.488322907 podStartE2EDuration="2m11.404955606s" podCreationTimestamp="2026-04-16 18:18:19 +0000 UTC" firstStartedPulling="2026-04-16 18:20:27.98819795 +0000 UTC m=+161.767662259" lastFinishedPulling="2026-04-16 18:20:29.904830645 +0000 UTC m=+163.684294958" observedRunningTime="2026-04-16 18:20:30.403757627 +0000 UTC m=+164.183221960" watchObservedRunningTime="2026-04-16 18:20:30.404955606 +0000 UTC m=+164.184419938" Apr 16 18:20:31.366377 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:31.366275 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl" event={"ID":"ca4f330e-8728-4c07-ab6d-127e7f77538c","Type":"ContainerStarted","Data":"20faa2d18c8bb219d0738864731ba81cd592772a5064fb298caf2f50c38d5332"} Apr 16 18:20:31.388395 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:31.388338 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-s6xhl" podStartSLOduration=33.094634398 podStartE2EDuration="35.388323103s" podCreationTimestamp="2026-04-16 18:19:56 +0000 UTC" firstStartedPulling="2026-04-16 18:20:28.815463341 +0000 UTC m=+162.594927657" lastFinishedPulling="2026-04-16 18:20:31.10915205 +0000 UTC m=+164.888616362" observedRunningTime="2026-04-16 18:20:31.386517371 +0000 UTC m=+165.165981703" watchObservedRunningTime="2026-04-16 18:20:31.388323103 +0000 UTC m=+165.167787471" Apr 16 18:20:31.630811 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:31.630725 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mh224"] Apr 16 18:20:31.633647 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:31.633627 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mh224" Apr 16 18:20:31.636674 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:31.636648 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-qvl25\"" Apr 16 18:20:31.636807 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:31.636694 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 18:20:31.643654 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:31.643630 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mh224"] Apr 16 18:20:31.773201 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:31.773171 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a8800c73-8457-4de9-8473-aa5b62d40811-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-mh224\" (UID: \"a8800c73-8457-4de9-8473-aa5b62d40811\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mh224" Apr 16 18:20:31.874422 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:31.874394 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a8800c73-8457-4de9-8473-aa5b62d40811-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-mh224\" (UID: \"a8800c73-8457-4de9-8473-aa5b62d40811\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mh224" Apr 16 18:20:31.876929 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:31.876900 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a8800c73-8457-4de9-8473-aa5b62d40811-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-mh224\" (UID: \"a8800c73-8457-4de9-8473-aa5b62d40811\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mh224" Apr 16 18:20:31.944916 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:31.944811 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mh224" Apr 16 18:20:32.065875 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:32.065847 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mh224"] Apr 16 18:20:32.068921 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:20:32.068889 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8800c73_8457_4de9_8473_aa5b62d40811.slice/crio-dc8633f0ab10bd1df79cbaf75b0c1832ce109bbe3474e87d73a66f29b21947e7 WatchSource:0}: Error finding container dc8633f0ab10bd1df79cbaf75b0c1832ce109bbe3474e87d73a66f29b21947e7: Status 404 returned error can't find the container with id dc8633f0ab10bd1df79cbaf75b0c1832ce109bbe3474e87d73a66f29b21947e7 Apr 16 18:20:32.369888 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:32.369853 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mh224" event={"ID":"a8800c73-8457-4de9-8473-aa5b62d40811","Type":"ContainerStarted","Data":"dc8633f0ab10bd1df79cbaf75b0c1832ce109bbe3474e87d73a66f29b21947e7"} Apr 16 18:20:33.375548 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:33.375468 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mh224" event={"ID":"a8800c73-8457-4de9-8473-aa5b62d40811","Type":"ContainerStarted","Data":"001c3bc2936533331fc14f522cfac05931468e1db77d6a30d395933b8f4e92a5"} Apr 16 18:20:33.375902 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:33.375692 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mh224" Apr 16 18:20:33.384385 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:33.381916 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mh224" Apr 16 18:20:33.394193 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:33.394152 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-mh224" podStartSLOduration=1.397910841 podStartE2EDuration="2.394140288s" podCreationTimestamp="2026-04-16 18:20:31 +0000 UTC" firstStartedPulling="2026-04-16 18:20:32.070757717 +0000 UTC m=+165.850222029" lastFinishedPulling="2026-04-16 18:20:33.066987154 +0000 UTC m=+166.846451476" observedRunningTime="2026-04-16 18:20:33.39381724 +0000 UTC m=+167.173281572" watchObservedRunningTime="2026-04-16 18:20:33.394140288 +0000 UTC m=+167.173604618" Apr 16 18:20:35.834349 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:35.834307 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:20:38.105984 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.105952 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-9f44m"] Apr 16 18:20:38.109387 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.109361 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" Apr 16 18:20:38.112159 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.112138 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 18:20:38.113299 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.113279 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:20:38.113407 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.113300 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8lr4j\"" Apr 16 18:20:38.113407 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.113279 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:20:38.116604 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.116576 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7cfsz"] Apr 16 18:20:38.118847 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.118821 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.120783 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.120094 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-9f44m"] Apr 16 18:20:38.120783 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.120337 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv5fn\" (UniqueName: \"kubernetes.io/projected/0432dce1-45c7-4680-9444-34c004ae03cb-kube-api-access-zv5fn\") pod \"openshift-state-metrics-5669946b84-9f44m\" (UID: \"0432dce1-45c7-4680-9444-34c004ae03cb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" Apr 16 18:20:38.120783 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.120396 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0432dce1-45c7-4680-9444-34c004ae03cb-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-9f44m\" (UID: \"0432dce1-45c7-4680-9444-34c004ae03cb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" Apr 16 18:20:38.120783 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.120426 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0432dce1-45c7-4680-9444-34c004ae03cb-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-9f44m\" (UID: \"0432dce1-45c7-4680-9444-34c004ae03cb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" Apr 16 18:20:38.120783 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.120484 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0432dce1-45c7-4680-9444-34c004ae03cb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-9f44m\" (UID: \"0432dce1-45c7-4680-9444-34c004ae03cb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" Apr 16 18:20:38.121775 
ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.121752 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-w7vqp\"" Apr 16 18:20:38.121975 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.121956 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:20:38.122269 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.122250 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:20:38.122527 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.122508 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:20:38.221356 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.221319 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0432dce1-45c7-4680-9444-34c004ae03cb-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-9f44m\" (UID: \"0432dce1-45c7-4680-9444-34c004ae03cb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" Apr 16 18:20:38.221544 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.221371 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/027aad74-c11c-4a49-8925-52c728463d0f-node-exporter-tls\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.221544 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.221410 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/027aad74-c11c-4a49-8925-52c728463d0f-node-exporter-textfile\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.221544 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.221437 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/027aad74-c11c-4a49-8925-52c728463d0f-metrics-client-ca\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.221544 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.221480 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/027aad74-c11c-4a49-8925-52c728463d0f-node-exporter-wtmp\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.221544 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.221505 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/027aad74-c11c-4a49-8925-52c728463d0f-node-exporter-accelerators-collector-config\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.221544 ip-10-0-128-74 
kubenswrapper[2570]: I0416 18:20:38.221530 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln65s\" (UniqueName: \"kubernetes.io/projected/027aad74-c11c-4a49-8925-52c728463d0f-kube-api-access-ln65s\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.221847 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.221595 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/027aad74-c11c-4a49-8925-52c728463d0f-root\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.221847 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.221639 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/027aad74-c11c-4a49-8925-52c728463d0f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.221847 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.221679 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0432dce1-45c7-4680-9444-34c004ae03cb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-9f44m\" (UID: \"0432dce1-45c7-4680-9444-34c004ae03cb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" Apr 16 18:20:38.221847 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.221710 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zv5fn\" (UniqueName: \"kubernetes.io/projected/0432dce1-45c7-4680-9444-34c004ae03cb-kube-api-access-zv5fn\") pod \"openshift-state-metrics-5669946b84-9f44m\" (UID: \"0432dce1-45c7-4680-9444-34c004ae03cb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" Apr 16 18:20:38.221847 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.221744 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/027aad74-c11c-4a49-8925-52c728463d0f-sys\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.222108 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.221866 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0432dce1-45c7-4680-9444-34c004ae03cb-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-9f44m\" (UID: \"0432dce1-45c7-4680-9444-34c004ae03cb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" Apr 16 18:20:38.222206 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.222186 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0432dce1-45c7-4680-9444-34c004ae03cb-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-9f44m\" (UID: \"0432dce1-45c7-4680-9444-34c004ae03cb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" Apr 16 18:20:38.224169 ip-10-0-128-74 
kubenswrapper[2570]: I0416 18:20:38.224137 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0432dce1-45c7-4680-9444-34c004ae03cb-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-9f44m\" (UID: \"0432dce1-45c7-4680-9444-34c004ae03cb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" Apr 16 18:20:38.224413 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.224395 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0432dce1-45c7-4680-9444-34c004ae03cb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-9f44m\" (UID: \"0432dce1-45c7-4680-9444-34c004ae03cb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" Apr 16 18:20:38.232145 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.232112 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv5fn\" (UniqueName: \"kubernetes.io/projected/0432dce1-45c7-4680-9444-34c004ae03cb-kube-api-access-zv5fn\") pod \"openshift-state-metrics-5669946b84-9f44m\" (UID: \"0432dce1-45c7-4680-9444-34c004ae03cb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" Apr 16 18:20:38.322987 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.322953 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/027aad74-c11c-4a49-8925-52c728463d0f-node-exporter-tls\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.323146 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.323009 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/027aad74-c11c-4a49-8925-52c728463d0f-node-exporter-textfile\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.323146 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.323074 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/027aad74-c11c-4a49-8925-52c728463d0f-metrics-client-ca\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.323146 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.323124 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/027aad74-c11c-4a49-8925-52c728463d0f-node-exporter-wtmp\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.323307 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.323149 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/027aad74-c11c-4a49-8925-52c728463d0f-node-exporter-accelerators-collector-config\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.323307 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.323175 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln65s\" (UniqueName: \"kubernetes.io/projected/027aad74-c11c-4a49-8925-52c728463d0f-kube-api-access-ln65s\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.323307 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.323206 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/027aad74-c11c-4a49-8925-52c728463d0f-root\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.323307 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.323239 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/027aad74-c11c-4a49-8925-52c728463d0f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.323307 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.323290 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/027aad74-c11c-4a49-8925-52c728463d0f-sys\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.323538 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.323311 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/027aad74-c11c-4a49-8925-52c728463d0f-node-exporter-wtmp\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.323538 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.323360 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/027aad74-c11c-4a49-8925-52c728463d0f-sys\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.323538 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.323373 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/027aad74-c11c-4a49-8925-52c728463d0f-node-exporter-textfile\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.323538 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.323410 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/027aad74-c11c-4a49-8925-52c728463d0f-root\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.323777 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.323757 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/027aad74-c11c-4a49-8925-52c728463d0f-metrics-client-ca\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.324460 ip-10-0-128-74 kubenswrapper[2570]: I0416 
18:20:38.324437 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/027aad74-c11c-4a49-8925-52c728463d0f-node-exporter-accelerators-collector-config\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.325501 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.325474 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/027aad74-c11c-4a49-8925-52c728463d0f-node-exporter-tls\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.325817 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.325797 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/027aad74-c11c-4a49-8925-52c728463d0f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.335373 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.335353 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln65s\" (UniqueName: \"kubernetes.io/projected/027aad74-c11c-4a49-8925-52c728463d0f-kube-api-access-ln65s\") pod \"node-exporter-7cfsz\" (UID: \"027aad74-c11c-4a49-8925-52c728463d0f\") " pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.422105 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.422010 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" Apr 16 18:20:38.431879 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.431857 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7cfsz" Apr 16 18:20:38.440166 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:20:38.440137 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod027aad74_c11c_4a49_8925_52c728463d0f.slice/crio-7fcbd563907895d9f544002f21d2dc6b45e951bcf02d7099039ebf59fb32eeac WatchSource:0}: Error finding container 7fcbd563907895d9f544002f21d2dc6b45e951bcf02d7099039ebf59fb32eeac: Status 404 returned error can't find the container with id 7fcbd563907895d9f544002f21d2dc6b45e951bcf02d7099039ebf59fb32eeac Apr 16 18:20:38.551828 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:38.551738 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-9f44m"] Apr 16 18:20:38.554627 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:20:38.554600 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0432dce1_45c7_4680_9444_34c004ae03cb.slice/crio-97891187ba7622c2fafb7b3b0e8b9d612292e5c9a90a98362d9c2743cbcc2c3e WatchSource:0}: Error finding container 97891187ba7622c2fafb7b3b0e8b9d612292e5c9a90a98362d9c2743cbcc2c3e: Status 404 returned error can't find the container with id 97891187ba7622c2fafb7b3b0e8b9d612292e5c9a90a98362d9c2743cbcc2c3e Apr 16 18:20:39.171431 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.171334 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:20:39.175509 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.175265 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.178125 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.178099 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 18:20:39.178242 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.178130 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-6tmsv\"" Apr 16 18:20:39.178384 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.178367 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 18:20:39.178448 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.178383 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 18:20:39.178448 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.178400 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 18:20:39.178648 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.178632 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 18:20:39.178708 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.178648 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 18:20:39.178814 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.178795 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 
18:20:39.179004 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.178877 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 18:20:39.186833 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.186812 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 18:20:39.187488 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.187428 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:20:39.229764 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.229715 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.229764 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.229761 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.229993 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.229839 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8ab89ed0-19d8-40ee-8801-b4f260443290-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.229993 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.229867 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ab89ed0-19d8-40ee-8801-b4f260443290-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.229993 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.229893 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5chm\" (UniqueName: \"kubernetes.io/projected/8ab89ed0-19d8-40ee-8801-b4f260443290-kube-api-access-q5chm\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.229993 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.229930 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.229993 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.229961 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/8ab89ed0-19d8-40ee-8801-b4f260443290-config-out\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.229993 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.229987 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-web-config\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.230319 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.230140 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8ab89ed0-19d8-40ee-8801-b4f260443290-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.230319 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.230174 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.230319 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.230214 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-config-volume\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.230319 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.230242 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.230319 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.230303 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ab89ed0-19d8-40ee-8801-b4f260443290-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.331449 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.331419 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ab89ed0-19d8-40ee-8801-b4f260443290-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.331563 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.331492 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.331563 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.331524 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.331666 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.331553 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8ab89ed0-19d8-40ee-8801-b4f260443290-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.331666 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.331599 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ab89ed0-19d8-40ee-8801-b4f260443290-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.331666 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.331621 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5chm\" (UniqueName: \"kubernetes.io/projected/8ab89ed0-19d8-40ee-8801-b4f260443290-kube-api-access-q5chm\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.331809 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.331666 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.331809 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.331700 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8ab89ed0-19d8-40ee-8801-b4f260443290-config-out\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.331809 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.331729 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-web-config\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.331809 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.331766 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8ab89ed0-19d8-40ee-8801-b4f260443290-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.331809 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.331792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.332040 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.331832 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-config-volume\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.332040 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.331861 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.332660 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:39.332635 2570 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 18:20:39.332774 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:39.332707 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-main-tls podName:8ab89ed0-19d8-40ee-8801-b4f260443290 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:39.832687389 +0000 UTC m=+173.612151701 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "8ab89ed0-19d8-40ee-8801-b4f260443290") : secret "alertmanager-main-tls" not found Apr 16 18:20:39.335192 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.334404 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8ab89ed0-19d8-40ee-8801-b4f260443290-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.336341 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.336115 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ab89ed0-19d8-40ee-8801-b4f260443290-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.336341 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.336155 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.336584 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.336539 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ab89ed0-19d8-40ee-8801-b4f260443290-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.337851 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.337829 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8ab89ed0-19d8-40ee-8801-b4f260443290-config-out\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.337948 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.337870 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-config-volume\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.338458 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.338416 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.339252 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.339228 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-web-config\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.339517 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.339470 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.340234 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.340203 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8ab89ed0-19d8-40ee-8801-b4f260443290-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.340916 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.340897 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.342940 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.342918 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5chm\" (UniqueName: \"kubernetes.io/projected/8ab89ed0-19d8-40ee-8801-b4f260443290-kube-api-access-q5chm\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.396780 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.396399 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" 
event={"ID":"0432dce1-45c7-4680-9444-34c004ae03cb","Type":"ContainerStarted","Data":"0c7f0d6ecf167d1fd884a927db8a61ccedd4789525c86d4bb463322250609d4f"} Apr 16 18:20:39.396780 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.396443 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" event={"ID":"0432dce1-45c7-4680-9444-34c004ae03cb","Type":"ContainerStarted","Data":"3b01a3dac2114889f54b8abd715959433d3260d7d716c08db5854669209cf9e3"} Apr 16 18:20:39.396780 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.396457 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" event={"ID":"0432dce1-45c7-4680-9444-34c004ae03cb","Type":"ContainerStarted","Data":"97891187ba7622c2fafb7b3b0e8b9d612292e5c9a90a98362d9c2743cbcc2c3e"} Apr 16 18:20:39.398046 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.398021 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7cfsz" event={"ID":"027aad74-c11c-4a49-8925-52c728463d0f","Type":"ContainerStarted","Data":"3e7d957def7a657c5cdc93fee6482117d29097eb5962d62972fca222cea8cf0f"} Apr 16 18:20:39.398175 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.398076 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7cfsz" event={"ID":"027aad74-c11c-4a49-8925-52c728463d0f","Type":"ContainerStarted","Data":"7fcbd563907895d9f544002f21d2dc6b45e951bcf02d7099039ebf59fb32eeac"} Apr 16 18:20:39.835834 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.835801 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:39.838353 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:39.838326 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:40.090822 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.090726 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:20:40.170027 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.169993 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5979894575-mh9cx"] Apr 16 18:20:40.175672 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.174800 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.178385 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.178356 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 18:20:40.178576 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.178542 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-wt5h5\"" Apr 16 18:20:40.178679 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.178617 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 18:20:40.178679 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.178644 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 18:20:40.180534 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.180302 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 18:20:40.180534 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.180357 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-jbitih90ji6b\"" Apr 16 18:20:40.180534 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.180472 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 18:20:40.186229 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.186210 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5979894575-mh9cx"] Apr 16 18:20:40.218404 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.218375 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:20:40.221753 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:20:40.221725 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ab89ed0_19d8_40ee_8801_b4f260443290.slice/crio-8a06e17d2423023d8a81131ba03ff03839ab7d31e799e5d90cd81ae0c76f1930 WatchSource:0}: Error finding container 8a06e17d2423023d8a81131ba03ff03839ab7d31e799e5d90cd81ae0c76f1930: Status 404 returned error can't find the container with id 8a06e17d2423023d8a81131ba03ff03839ab7d31e799e5d90cd81ae0c76f1930 Apr 16 18:20:40.239575 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.239550 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.239699 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.239586 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpvpc\" (UniqueName: \"kubernetes.io/projected/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-kube-api-access-vpvpc\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.239699 
ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.239624 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-metrics-client-ca\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.239777 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.239722 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-thanos-querier-tls\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.239812 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.239775 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.239853 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.239809 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.239889 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.239866 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.239926 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.239894 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-grpc-tls\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.341006 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.340920 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.341006 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.340962 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-grpc-tls\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.341006 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.340986 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.341268 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.341016 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpvpc\" (UniqueName: \"kubernetes.io/projected/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-kube-api-access-vpvpc\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.341268 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.341093 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-metrics-client-ca\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.341268 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.341141 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-thanos-querier-tls\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.341268 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.341177 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.341451 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.341403 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.342034 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.342004 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-metrics-client-ca\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.344077 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.344035 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.344190 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.344035 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.344305 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.344284 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.344356 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.344296 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-thanos-querier-tls\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.344356 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.344338 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-grpc-tls\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.344525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.344509 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.350825 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.350797 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpvpc\" (UniqueName: \"kubernetes.io/projected/620fdae5-a00a-4876-8ec2-3b53d29cdfcb-kube-api-access-vpvpc\") pod \"thanos-querier-5979894575-mh9cx\" (UID: \"620fdae5-a00a-4876-8ec2-3b53d29cdfcb\") " pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.369005 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.368983 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-brklz" Apr 16 18:20:40.402898 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.402866 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" event={"ID":"0432dce1-45c7-4680-9444-34c004ae03cb","Type":"ContainerStarted","Data":"479bc68ba0ddbd59821a9bf35354e3fc04b2889247e06685e304ad2f3db03a7f"} Apr 16 
18:20:40.403916 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.403891 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ab89ed0-19d8-40ee-8801-b4f260443290","Type":"ContainerStarted","Data":"8a06e17d2423023d8a81131ba03ff03839ab7d31e799e5d90cd81ae0c76f1930"} Apr 16 18:20:40.405027 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.405004 2570 generic.go:358] "Generic (PLEG): container finished" podID="027aad74-c11c-4a49-8925-52c728463d0f" containerID="3e7d957def7a657c5cdc93fee6482117d29097eb5962d62972fca222cea8cf0f" exitCode=0 Apr 16 18:20:40.405138 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.405069 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7cfsz" event={"ID":"027aad74-c11c-4a49-8925-52c728463d0f","Type":"ContainerDied","Data":"3e7d957def7a657c5cdc93fee6482117d29097eb5962d62972fca222cea8cf0f"} Apr 16 18:20:40.424790 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.424734 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-9f44m" podStartSLOduration=1.433263101 podStartE2EDuration="2.424714193s" podCreationTimestamp="2026-04-16 18:20:38 +0000 UTC" firstStartedPulling="2026-04-16 18:20:38.663012185 +0000 UTC m=+172.442476509" lastFinishedPulling="2026-04-16 18:20:39.65446329 +0000 UTC m=+173.433927601" observedRunningTime="2026-04-16 18:20:40.422449495 +0000 UTC m=+174.201913826" watchObservedRunningTime="2026-04-16 18:20:40.424714193 +0000 UTC m=+174.204178525" Apr 16 18:20:40.487912 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.487890 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" Apr 16 18:20:40.623462 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:40.623363 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5979894575-mh9cx"] Apr 16 18:20:40.628083 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:20:40.628032 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod620fdae5_a00a_4876_8ec2_3b53d29cdfcb.slice/crio-099bc47af5228a3123f18102984a0d04d07a67de6c34a07a827a2ae1d85487b9 WatchSource:0}: Error finding container 099bc47af5228a3123f18102984a0d04d07a67de6c34a07a827a2ae1d85487b9: Status 404 returned error can't find the container with id 099bc47af5228a3123f18102984a0d04d07a67de6c34a07a827a2ae1d85487b9 Apr 16 18:20:41.409326 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:41.409249 2570 generic.go:358] "Generic (PLEG): container finished" podID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerID="e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce" exitCode=0 Apr 16 18:20:41.409726 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:41.409326 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ab89ed0-19d8-40ee-8801-b4f260443290","Type":"ContainerDied","Data":"e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce"} Apr 16 18:20:41.411356 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:41.411327 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7cfsz" event={"ID":"027aad74-c11c-4a49-8925-52c728463d0f","Type":"ContainerStarted","Data":"91f732da9ba69ad55a4b424a2f699c14266997f5c2947eaef175d2fc07615479"} Apr 16 18:20:41.411457 ip-10-0-128-74 
kubenswrapper[2570]: I0416 18:20:41.411368 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7cfsz" event={"ID":"027aad74-c11c-4a49-8925-52c728463d0f","Type":"ContainerStarted","Data":"108a9dc94fd5a2f46ee432e538d82827ecfdf86e77843c14c2b4bc2efc936dcb"}
Apr 16 18:20:41.412474 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:41.412454 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" event={"ID":"620fdae5-a00a-4876-8ec2-3b53d29cdfcb","Type":"ContainerStarted","Data":"099bc47af5228a3123f18102984a0d04d07a67de6c34a07a827a2ae1d85487b9"}
Apr 16 18:20:42.858458 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:42.858387 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7cfsz" podStartSLOduration=4.039553445 podStartE2EDuration="4.858369434s" podCreationTimestamp="2026-04-16 18:20:38 +0000 UTC" firstStartedPulling="2026-04-16 18:20:38.442206041 +0000 UTC m=+172.221670365" lastFinishedPulling="2026-04-16 18:20:39.261022024 +0000 UTC m=+173.040486354" observedRunningTime="2026-04-16 18:20:41.462601407 +0000 UTC m=+175.242065729" watchObservedRunningTime="2026-04-16 18:20:42.858369434 +0000 UTC m=+176.637833767"
Apr 16 18:20:42.859926 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:42.859900 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-ckt6k"]
Apr 16 18:20:42.862574 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:42.862547 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-ckt6k"
Apr 16 18:20:42.865941 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:42.865917 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 18:20:42.866081 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:42.865925 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-ms29m\""
Apr 16 18:20:42.873285 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:42.873260 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-ckt6k"]
Apr 16 18:20:42.965390 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:42.965357 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9db2e2f6-0f7a-432a-b3c2-72b5f4e3be36-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-ckt6k\" (UID: \"9db2e2f6-0f7a-432a-b3c2-72b5f4e3be36\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-ckt6k"
Apr 16 18:20:43.065883 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.065841 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9db2e2f6-0f7a-432a-b3c2-72b5f4e3be36-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-ckt6k\" (UID: \"9db2e2f6-0f7a-432a-b3c2-72b5f4e3be36\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-ckt6k"
Apr 16 18:20:43.066064 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:43.066011 2570 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 16 18:20:43.066131 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:20:43.066102 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9db2e2f6-0f7a-432a-b3c2-72b5f4e3be36-monitoring-plugin-cert podName:9db2e2f6-0f7a-432a-b3c2-72b5f4e3be36 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:43.566079972 +0000 UTC m=+177.345544288 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/9db2e2f6-0f7a-432a-b3c2-72b5f4e3be36-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-ckt6k" (UID: "9db2e2f6-0f7a-432a-b3c2-72b5f4e3be36") : secret "monitoring-plugin-cert" not found
Apr 16 18:20:43.338939 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.338912 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"]
Apr 16 18:20:43.342338 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.342216 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.345370 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.345347 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 18:20:43.345537 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.345356 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-rq5nq\""
Apr 16 18:20:43.345612 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.345377 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 18:20:43.345612 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.345421 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 18:20:43.345711 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.345423 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 18:20:43.345711 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.345437 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 18:20:43.350411 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.350350 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 18:20:43.354779 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.354737 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"]
Apr 16 18:20:43.370220 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.370172 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ae7b62ca-8697-4148-bd62-b26981e1514f-telemeter-client-tls\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.370345 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.370230 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae7b62ca-8697-4148-bd62-b26981e1514f-serving-certs-ca-bundle\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.370345 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.370268 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmvbd\" (UniqueName: \"kubernetes.io/projected/ae7b62ca-8697-4148-bd62-b26981e1514f-kube-api-access-xmvbd\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.370345 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.370341 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae7b62ca-8697-4148-bd62-b26981e1514f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.370538 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.370370 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ae7b62ca-8697-4148-bd62-b26981e1514f-federate-client-tls\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.370538 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.370445 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae7b62ca-8697-4148-bd62-b26981e1514f-metrics-client-ca\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.370538 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.370485 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae7b62ca-8697-4148-bd62-b26981e1514f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.370661 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.370567 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ae7b62ca-8697-4148-bd62-b26981e1514f-secret-telemeter-client\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.420855 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.420826 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ab89ed0-19d8-40ee-8801-b4f260443290","Type":"ContainerStarted","Data":"319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5"}
Apr 16 18:20:43.420957 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.420864 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ab89ed0-19d8-40ee-8801-b4f260443290","Type":"ContainerStarted","Data":"190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c"}
Apr 16 18:20:43.420957 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.420878 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ab89ed0-19d8-40ee-8801-b4f260443290","Type":"ContainerStarted","Data":"c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8"}
Apr 16 18:20:43.422869 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.422831 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" event={"ID":"620fdae5-a00a-4876-8ec2-3b53d29cdfcb","Type":"ContainerStarted","Data":"264b709ad5e042d0b25f585f4533e70eb73e2442652755740eb771d4d8bd7f81"}
Apr 16 18:20:43.422869 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.422860 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" event={"ID":"620fdae5-a00a-4876-8ec2-3b53d29cdfcb","Type":"ContainerStarted","Data":"5604ce29f7574331ea1e5a7cd9721efed1f0cf3c1c5a68f53fecb294d5119234"}
Apr 16 18:20:43.422992 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.422875 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" event={"ID":"620fdae5-a00a-4876-8ec2-3b53d29cdfcb","Type":"ContainerStarted","Data":"e66b7d12d775b50a1751c0ba4d78905e16d057b68314193761551bc7993f3818"}
Apr 16 18:20:43.472401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.471755 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae7b62ca-8697-4148-bd62-b26981e1514f-serving-certs-ca-bundle\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.472401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.471800 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmvbd\" (UniqueName: \"kubernetes.io/projected/ae7b62ca-8697-4148-bd62-b26981e1514f-kube-api-access-xmvbd\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.472401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.471844 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae7b62ca-8697-4148-bd62-b26981e1514f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.472401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.471872 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ae7b62ca-8697-4148-bd62-b26981e1514f-federate-client-tls\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.472401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.471910 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae7b62ca-8697-4148-bd62-b26981e1514f-metrics-client-ca\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.472401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.471941 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae7b62ca-8697-4148-bd62-b26981e1514f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.472401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.471994 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ae7b62ca-8697-4148-bd62-b26981e1514f-secret-telemeter-client\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.472401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.472047 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ae7b62ca-8697-4148-bd62-b26981e1514f-telemeter-client-tls\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.473125 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.473003 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae7b62ca-8697-4148-bd62-b26981e1514f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.473230 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.473187 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae7b62ca-8697-4148-bd62-b26981e1514f-serving-certs-ca-bundle\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.473452 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.473433 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae7b62ca-8697-4148-bd62-b26981e1514f-metrics-client-ca\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.475002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.474887 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ae7b62ca-8697-4148-bd62-b26981e1514f-telemeter-client-tls\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.475871 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.475804 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ae7b62ca-8697-4148-bd62-b26981e1514f-secret-telemeter-client\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.475952 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.475939 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ae7b62ca-8697-4148-bd62-b26981e1514f-federate-client-tls\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.476225 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.476205 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae7b62ca-8697-4148-bd62-b26981e1514f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.481066 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.481033 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmvbd\" (UniqueName: \"kubernetes.io/projected/ae7b62ca-8697-4148-bd62-b26981e1514f-kube-api-access-xmvbd\") pod \"telemeter-client-fdc9b6c58-ll8c7\" (UID: \"ae7b62ca-8697-4148-bd62-b26981e1514f\") " pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.573182 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.573150 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9db2e2f6-0f7a-432a-b3c2-72b5f4e3be36-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-ckt6k\" (UID: \"9db2e2f6-0f7a-432a-b3c2-72b5f4e3be36\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-ckt6k"
Apr 16 18:20:43.575441 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.575421 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9db2e2f6-0f7a-432a-b3c2-72b5f4e3be36-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-ckt6k\" (UID: \"9db2e2f6-0f7a-432a-b3c2-72b5f4e3be36\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-ckt6k"
Apr 16 18:20:43.661975 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.661944 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"
Apr 16 18:20:43.774217 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.773958 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-ckt6k"
Apr 16 18:20:43.798327 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.798151 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7"]
Apr 16 18:20:43.918628 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:43.918596 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-ckt6k"]
Apr 16 18:20:43.921589 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:20:43.921524 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9db2e2f6_0f7a_432a_b3c2_72b5f4e3be36.slice/crio-fdc32bd35944b81d2b4841efcc6a061b5633f2ab0c123206a4630e99604c6c91 WatchSource:0}: Error finding container fdc32bd35944b81d2b4841efcc6a061b5633f2ab0c123206a4630e99604c6c91: Status 404 returned error can't find the container with id fdc32bd35944b81d2b4841efcc6a061b5633f2ab0c123206a4630e99604c6c91
Apr 16 18:20:44.429742 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:44.429618 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ab89ed0-19d8-40ee-8801-b4f260443290","Type":"ContainerStarted","Data":"783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb"}
Apr 16 18:20:44.429742 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:44.429659 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ab89ed0-19d8-40ee-8801-b4f260443290","Type":"ContainerStarted","Data":"526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1"}
Apr 16 18:20:44.429742 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:44.429674 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ab89ed0-19d8-40ee-8801-b4f260443290","Type":"ContainerStarted","Data":"81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de"}
Apr 16 18:20:44.432884 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:44.432855 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" event={"ID":"620fdae5-a00a-4876-8ec2-3b53d29cdfcb","Type":"ContainerStarted","Data":"0cc5a6f96b604652a1f6228dd061c674273be5eebef0be8ab09bd36a975d7add"}
Apr 16 18:20:44.433014 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:44.432890 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" event={"ID":"620fdae5-a00a-4876-8ec2-3b53d29cdfcb","Type":"ContainerStarted","Data":"c606c0e70776d7f0e9c46e596a05ada56aa862a59493be32a46c1e9bba9ab6b5"}
Apr 16 18:20:44.433014 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:44.432905 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" event={"ID":"620fdae5-a00a-4876-8ec2-3b53d29cdfcb","Type":"ContainerStarted","Data":"d930a283bd43442e6b241d3a76954073d14b06ca2579786e956e9b969b54e372"}
Apr 16 18:20:44.433148 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:44.433029 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5979894575-mh9cx"
Apr 16 18:20:44.434618 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:44.434582 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7" event={"ID":"ae7b62ca-8697-4148-bd62-b26981e1514f","Type":"ContainerStarted","Data":"2cfe23f9746e3ef207d5bc3800c9ded579d4477bbf7028b6592b5b98038ca99f"}
Apr 16 18:20:44.435761 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:44.435729 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-ckt6k" event={"ID":"9db2e2f6-0f7a-432a-b3c2-72b5f4e3be36","Type":"ContainerStarted","Data":"fdc32bd35944b81d2b4841efcc6a061b5633f2ab0c123206a4630e99604c6c91"}
Apr 16 18:20:44.462100 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:44.462034 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.597729324 podStartE2EDuration="5.462018378s" podCreationTimestamp="2026-04-16 18:20:39 +0000 UTC" firstStartedPulling="2026-04-16 18:20:40.223691086 +0000 UTC m=+174.003155395" lastFinishedPulling="2026-04-16 18:20:44.087980135 +0000 UTC m=+177.867444449" observedRunningTime="2026-04-16 18:20:44.460286911 +0000 UTC m=+178.239751244" watchObservedRunningTime="2026-04-16 18:20:44.462018378 +0000 UTC m=+178.241482712"
Apr 16 18:20:44.497184 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:44.495445 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5979894575-mh9cx" podStartSLOduration=1.039050851 podStartE2EDuration="4.495428231s" podCreationTimestamp="2026-04-16 18:20:40 +0000 UTC" firstStartedPulling="2026-04-16 18:20:40.630249609 +0000 UTC m=+174.409713922" lastFinishedPulling="2026-04-16 18:20:44.086626976 +0000 UTC m=+177.866091302" observedRunningTime="2026-04-16 18:20:44.494733299 +0000 UTC m=+178.274197631" watchObservedRunningTime="2026-04-16 18:20:44.495428231 +0000 UTC m=+178.274892562"
Apr 16 18:20:46.444284 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:46.444242 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7" event={"ID":"ae7b62ca-8697-4148-bd62-b26981e1514f","Type":"ContainerStarted","Data":"b2696a041a0e8c4b719b2937c6a893aa18db79ca3cc078b1f0c206b927d08e53"}
Apr 16 18:20:46.444284 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:46.444280 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7" event={"ID":"ae7b62ca-8697-4148-bd62-b26981e1514f","Type":"ContainerStarted","Data":"a1a102712983b61d31dc4564e3209582abbcc98fcd323446f612a0a2614f6182"}
Apr 16 18:20:46.444284 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:46.444289 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7" event={"ID":"ae7b62ca-8697-4148-bd62-b26981e1514f","Type":"ContainerStarted","Data":"21fa9b53b4aaafe241734234486db6c9c14f26acff90a3d0cbb6ed37bb577ff4"}
Apr 16 18:20:46.445532 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:46.445502 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-ckt6k" event={"ID":"9db2e2f6-0f7a-432a-b3c2-72b5f4e3be36","Type":"ContainerStarted","Data":"ffc72245ff28c5a3a5363fe3b3db3e94e6c7612c58e85898d15249c34dbcd43b"}
Apr 16 18:20:46.445739 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:46.445725 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-ckt6k"
Apr 16 18:20:46.450362 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:46.450344 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-ckt6k"
Apr 16 18:20:46.472447 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:46.472402 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-fdc9b6c58-ll8c7" podStartSLOduration=1.735212344 podStartE2EDuration="3.472391223s" podCreationTimestamp="2026-04-16 18:20:43 +0000 UTC" firstStartedPulling="2026-04-16 18:20:43.804471189 +0000 UTC m=+177.583935498" lastFinishedPulling="2026-04-16 18:20:45.541650063 +0000 UTC m=+179.321114377" observedRunningTime="2026-04-16 18:20:46.471286931 +0000 UTC m=+180.250751291" watchObservedRunningTime="2026-04-16 18:20:46.472391223 +0000 UTC m=+180.251855554"
Apr 16 18:20:46.488039 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:46.487997 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-ckt6k" podStartSLOduration=2.873581753 podStartE2EDuration="4.487984831s" podCreationTimestamp="2026-04-16 18:20:42 +0000 UTC" firstStartedPulling="2026-04-16 18:20:43.923744736 +0000 UTC m=+177.703209045" lastFinishedPulling="2026-04-16 18:20:45.538147797 +0000 UTC m=+179.317612123" observedRunningTime="2026-04-16 18:20:46.48752067 +0000 UTC m=+180.266985002" watchObservedRunningTime="2026-04-16 18:20:46.487984831 +0000 UTC m=+180.267449162"
Apr 16 18:20:50.445499 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:50.445470 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5979894575-mh9cx"
Apr 16 18:20:57.886736 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:57.886698 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-p2sjs"]
Apr 16 18:20:57.888878 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:57.888862 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-p2sjs"
Apr 16 18:20:57.892732 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:57.892713 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 18:20:57.893137 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:57.893122 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 18:20:57.897152 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:57.897122 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-sgcf2\""
Apr 16 18:20:57.905728 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:57.905705 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-p2sjs"]
Apr 16 18:20:57.997639 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:57.997606 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drzlc\" (UniqueName: \"kubernetes.io/projected/b1579b0c-9f23-4da7-b7d8-a42454fa0e06-kube-api-access-drzlc\") pod \"downloads-586b57c7b4-p2sjs\" (UID: \"b1579b0c-9f23-4da7-b7d8-a42454fa0e06\") " pod="openshift-console/downloads-586b57c7b4-p2sjs"
Apr 16 18:20:58.098050 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:58.097971 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drzlc\" (UniqueName: \"kubernetes.io/projected/b1579b0c-9f23-4da7-b7d8-a42454fa0e06-kube-api-access-drzlc\") pod \"downloads-586b57c7b4-p2sjs\" (UID: \"b1579b0c-9f23-4da7-b7d8-a42454fa0e06\") " pod="openshift-console/downloads-586b57c7b4-p2sjs"
Apr 16 18:20:58.107850 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:58.107823 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drzlc\" (UniqueName: \"kubernetes.io/projected/b1579b0c-9f23-4da7-b7d8-a42454fa0e06-kube-api-access-drzlc\") pod \"downloads-586b57c7b4-p2sjs\" (UID: \"b1579b0c-9f23-4da7-b7d8-a42454fa0e06\") " pod="openshift-console/downloads-586b57c7b4-p2sjs"
Apr 16 18:20:58.197416 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:58.197339 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-p2sjs"
Apr 16 18:20:58.316106 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:58.316078 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-p2sjs"]
Apr 16 18:20:58.318786 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:20:58.318759 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1579b0c_9f23_4da7_b7d8_a42454fa0e06.slice/crio-801c135d22748c2507ca0164017bb2eddd73eee715a788ea92443a38535d66b0 WatchSource:0}: Error finding container 801c135d22748c2507ca0164017bb2eddd73eee715a788ea92443a38535d66b0: Status 404 returned error can't find the container with id 801c135d22748c2507ca0164017bb2eddd73eee715a788ea92443a38535d66b0
Apr 16 18:20:58.480704 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:20:58.480621 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-p2sjs" event={"ID":"b1579b0c-9f23-4da7-b7d8-a42454fa0e06","Type":"ContainerStarted","Data":"801c135d22748c2507ca0164017bb2eddd73eee715a788ea92443a38535d66b0"}
Apr 16 18:21:04.327997 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.327961 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-797664449f-k4ssj"]
Apr 16 18:21:04.333272 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.333243 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.335999 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.335905 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 18:21:04.335999 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.335921 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 18:21:04.336214 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.336043 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 18:21:04.337609 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.337386 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qgprb\""
Apr 16 18:21:04.337609 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.337495 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 18:21:04.337609 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.337514 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 18:21:04.347046 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.347025 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-797664449f-k4ssj"]
Apr 16 18:21:04.458824 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.458789 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx59h\" (UniqueName: \"kubernetes.io/projected/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-kube-api-access-bx59h\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.458996 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.458834 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-config\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.458996 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.458944 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-serving-cert\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.459134 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.459012 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-oauth-serving-cert\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.459134 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.459076 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-oauth-config\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.459134 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.459119 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-service-ca\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.559951 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.559914 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-oauth-serving-cert\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.560150 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.559978 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-oauth-config\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.560150 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.560005 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-service-ca\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.560150 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.560047 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bx59h\" (UniqueName: \"kubernetes.io/projected/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-kube-api-access-bx59h\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.560150 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.560105 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-config\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.560361 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.560161 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-serving-cert\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.560878 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.560824 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-oauth-serving-cert\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.561013 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.560933 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-service-ca\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.561524 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.561505 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-config\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.563238 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.563214 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-oauth-config\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.563332 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.563222 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-serving-cert\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.569667 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.569645 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx59h\" (UniqueName: \"kubernetes.io/projected/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-kube-api-access-bx59h\") pod \"console-797664449f-k4ssj\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") " pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.644913 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.644833 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:04.779817 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:04.779787 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-797664449f-k4ssj"]
Apr 16 18:21:04.782499 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:21:04.782466 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8ac4aa8_1c9b_49e1_8829_d18685a168d1.slice/crio-aacfd19307f66ca0a5658b3b3bb84a876303ca7f02446b8bfd1f7eb6ece189ee WatchSource:0}: Error finding container aacfd19307f66ca0a5658b3b3bb84a876303ca7f02446b8bfd1f7eb6ece189ee: Status 404 returned error can't find the container with id aacfd19307f66ca0a5658b3b3bb84a876303ca7f02446b8bfd1f7eb6ece189ee
Apr 16 18:21:05.511331 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:05.511276 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-797664449f-k4ssj" event={"ID":"e8ac4aa8-1c9b-49e1-8829-d18685a168d1","Type":"ContainerStarted","Data":"aacfd19307f66ca0a5658b3b3bb84a876303ca7f02446b8bfd1f7eb6ece189ee"}
Apr 16 18:21:08.522679 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:08.522646 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-797664449f-k4ssj" event={"ID":"e8ac4aa8-1c9b-49e1-8829-d18685a168d1","Type":"ContainerStarted","Data":"b710ce72f5680c0ea470f8e0eb3e0c4345f376dc467609ae194aebf3fc258db4"}
Apr 16 18:21:08.548817 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:08.548765 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-797664449f-k4ssj" podStartSLOduration=1.5952908670000001 podStartE2EDuration="4.548749987s" podCreationTimestamp="2026-04-16 18:21:04 +0000 UTC" firstStartedPulling="2026-04-16 18:21:04.784640662 +0000 UTC m=+198.564104972" lastFinishedPulling="2026-04-16 18:21:07.738099779 +0000 UTC m=+201.517564092" observedRunningTime="2026-04-16 18:21:08.546368078 +0000 UTC m=+202.325832410" watchObservedRunningTime="2026-04-16 18:21:08.548749987 +0000 UTC m=+202.328214318"
Apr 16 18:21:14.645860 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:14.645816 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:14.646348 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:14.645874 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:14.651598 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:14.651568 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:15.177286 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.177251 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7685656dcb-skmjz"]
Apr 16 18:21:15.213861 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.213831 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7685656dcb-skmjz"]
Apr 16 18:21:15.214042 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.213965 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.222665 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.222472 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 18:21:15.355412 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.355374 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbn5k\" (UniqueName: \"kubernetes.io/projected/7163fad7-77d3-43e7-9464-6752911355f3-kube-api-access-gbn5k\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.355600 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.355439 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7163fad7-77d3-43e7-9464-6752911355f3-console-oauth-config\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.355600 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.355500 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-console-config\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.355600 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.355565 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7163fad7-77d3-43e7-9464-6752911355f3-console-serving-cert\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.355774 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.355614 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-trusted-ca-bundle\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.355774 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.355637 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-oauth-serving-cert\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.355774 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.355669 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-service-ca\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.456647 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.456557 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbn5k\" (UniqueName: \"kubernetes.io/projected/7163fad7-77d3-43e7-9464-6752911355f3-kube-api-access-gbn5k\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.456647 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.456618 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7163fad7-77d3-43e7-9464-6752911355f3-console-oauth-config\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.456874 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.456677 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-console-config\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.456874 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.456808 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7163fad7-77d3-43e7-9464-6752911355f3-console-serving-cert\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.456874 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.456843 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-trusted-ca-bundle\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.456874 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.456867 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-oauth-serving-cert\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.457115 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.456907 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-service-ca\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.457561 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.457528 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-console-config\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.457709 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.457681 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-oauth-serving-cert\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.457792 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.457764 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-service-ca\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.458675 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.458600 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-trusted-ca-bundle\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.459707 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.459685 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7163fad7-77d3-43e7-9464-6752911355f3-console-oauth-config\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.459878 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.459852 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7163fad7-77d3-43e7-9464-6752911355f3-console-serving-cert\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.465778 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.465758 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbn5k\" (UniqueName: \"kubernetes.io/projected/7163fad7-77d3-43e7-9464-6752911355f3-kube-api-access-gbn5k\") pod \"console-7685656dcb-skmjz\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.526972 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.526932 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:15.555872 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.555833 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:15.877369 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:15.877338 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7685656dcb-skmjz"]
Apr 16 18:21:15.985802 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:21:15.985764 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7163fad7_77d3_43e7_9464_6752911355f3.slice/crio-961229caead4176c3fb35f4ba3df80715816d39e48b3d68c7d761adbeb44af7f WatchSource:0}: Error finding container 961229caead4176c3fb35f4ba3df80715816d39e48b3d68c7d761adbeb44af7f: Status 404 returned error can't find the container with id 961229caead4176c3fb35f4ba3df80715816d39e48b3d68c7d761adbeb44af7f
Apr 16 18:21:16.556251 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:16.556214 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7685656dcb-skmjz" event={"ID":"7163fad7-77d3-43e7-9464-6752911355f3","Type":"ContainerStarted","Data":"c9c436947802125098df24c74155416e8875110c4ded3dff1a5e5f0c4dbfda00"}
Apr 16 18:21:16.556496 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:16.556473 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7685656dcb-skmjz" event={"ID":"7163fad7-77d3-43e7-9464-6752911355f3","Type":"ContainerStarted","Data":"961229caead4176c3fb35f4ba3df80715816d39e48b3d68c7d761adbeb44af7f"}
Apr 16 18:21:16.557733 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:16.557707 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-p2sjs" event={"ID":"b1579b0c-9f23-4da7-b7d8-a42454fa0e06","Type":"ContainerStarted","Data":"71193be82cb988447f4d7eda879a5b3738a88ee8ed8fdb647c4057ad6ede8d33"}
Apr 16 18:21:16.578686 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:16.578638 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7685656dcb-skmjz" podStartSLOduration=1.578623344 podStartE2EDuration="1.578623344s" podCreationTimestamp="2026-04-16 18:21:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:21:16.577032343 +0000 UTC m=+210.356496674" watchObservedRunningTime="2026-04-16 18:21:16.578623344 +0000 UTC m=+210.358087653"
Apr 16 18:21:16.596180 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:16.596109 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-p2sjs" podStartSLOduration=1.861955911 podStartE2EDuration="19.596092039s" podCreationTimestamp="2026-04-16 18:20:57 +0000 UTC" firstStartedPulling="2026-04-16 18:20:58.320558231 +0000 UTC m=+192.100022540" lastFinishedPulling="2026-04-16 18:21:16.054694353 +0000 UTC m=+209.834158668" observedRunningTime="2026-04-16 18:21:16.595875583 +0000 UTC m=+210.375339915" watchObservedRunningTime="2026-04-16 18:21:16.596092039 +0000 UTC m=+210.375556370"
Apr 16 18:21:17.561104 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:17.561072 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-p2sjs"
Apr 16 18:21:17.577305 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:17.577279 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-p2sjs"
Apr 16 18:21:21.575026 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:21.574989 2570 generic.go:358] "Generic (PLEG): container finished" podID="5fa9d23f-acec-46e2-b6bc-3203fdd2764d" containerID="f5c15869671c33eaa5ca68d0a51f672b5322b43713ca964db3a10d9b76004ed4" exitCode=0
Apr 16 18:21:21.575539 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:21.575084 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td" event={"ID":"5fa9d23f-acec-46e2-b6bc-3203fdd2764d","Type":"ContainerDied","Data":"f5c15869671c33eaa5ca68d0a51f672b5322b43713ca964db3a10d9b76004ed4"}
Apr 16 18:21:21.575539 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:21.575489 2570 scope.go:117] "RemoveContainer" containerID="f5c15869671c33eaa5ca68d0a51f672b5322b43713ca964db3a10d9b76004ed4"
Apr 16 18:21:22.581097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:22.581043 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wk6td" event={"ID":"5fa9d23f-acec-46e2-b6bc-3203fdd2764d","Type":"ContainerStarted","Data":"f6a01a76ac002548f90c49ddee05ee074a541fe98a8af6b6cf0723ec38ef80c1"}
Apr 16 18:21:25.527593 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:25.527558 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:25.527593 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:25.527598 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:25.532156 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:25.532132 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:25.593981 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:25.593955 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7685656dcb-skmjz"
Apr 16 18:21:25.650723 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:25.650692 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-797664449f-k4ssj"]
Apr 16 18:21:30.613121 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:30.613033 2570 generic.go:358] "Generic (PLEG): container finished" podID="39dcd2dc-e628-49b2-bd5e-aef8fe6aa083" containerID="5aed8d8b9cbe72b34395a90a94041455cc450413cb644a7969308b0fd0116468" exitCode=0
Apr 16 18:21:30.613645 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:30.613101 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5" event={"ID":"39dcd2dc-e628-49b2-bd5e-aef8fe6aa083","Type":"ContainerDied","Data":"5aed8d8b9cbe72b34395a90a94041455cc450413cb644a7969308b0fd0116468"}
Apr 16 18:21:30.614163 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:30.614143 2570 scope.go:117] "RemoveContainer" containerID="5aed8d8b9cbe72b34395a90a94041455cc450413cb644a7969308b0fd0116468"
Apr 16 18:21:31.618119 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:31.618086 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-nq9t5" event={"ID":"39dcd2dc-e628-49b2-bd5e-aef8fe6aa083","Type":"ContainerStarted","Data":"824445bcd58c3983ad3c5571af10b92f3dde5b2c74380b0d66a7eee9e757312f"}
Apr 16 18:21:50.677121 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:50.677066 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-797664449f-k4ssj" podUID="e8ac4aa8-1c9b-49e1-8829-d18685a168d1" containerName="console" containerID="cri-o://b710ce72f5680c0ea470f8e0eb3e0c4345f376dc467609ae194aebf3fc258db4" gracePeriod=15
Apr 16 18:21:50.939397 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:50.939373 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-797664449f-k4ssj_e8ac4aa8-1c9b-49e1-8829-d18685a168d1/console/0.log"
Apr 16 18:21:50.939526 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:50.939443 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-797664449f-k4ssj"
Apr 16 18:21:51.082509 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.082475 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-oauth-serving-cert\") pod \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") "
Apr 16 18:21:51.082706 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.082539 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-oauth-config\") pod \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") "
Apr 16 18:21:51.082706 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.082657 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-config\") pod \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") "
Apr 16 18:21:51.082706 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.082700 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-serving-cert\") pod \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") "
Apr 16 18:21:51.082868 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.082790 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx59h\" (UniqueName: \"kubernetes.io/projected/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-kube-api-access-bx59h\") pod \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") "
Apr 16 18:21:51.082868 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.082817 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-service-ca\") pod \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\" (UID: \"e8ac4aa8-1c9b-49e1-8829-d18685a168d1\") "
Apr 16 18:21:51.082957 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.082831 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e8ac4aa8-1c9b-49e1-8829-d18685a168d1" (UID: "e8ac4aa8-1c9b-49e1-8829-d18685a168d1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:21:51.083046 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.083017 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-config" (OuterVolumeSpecName: "console-config") pod "e8ac4aa8-1c9b-49e1-8829-d18685a168d1" (UID: "e8ac4aa8-1c9b-49e1-8829-d18685a168d1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:21:51.083179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.083128 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-oauth-serving-cert\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\""
Apr 16 18:21:51.083179 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.083146 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-config\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\""
Apr 16 18:21:51.083257 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.083239 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-service-ca" (OuterVolumeSpecName: "service-ca") pod "e8ac4aa8-1c9b-49e1-8829-d18685a168d1" (UID: "e8ac4aa8-1c9b-49e1-8829-d18685a168d1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:21:51.084796 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.084772 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-kube-api-access-bx59h" (OuterVolumeSpecName: "kube-api-access-bx59h") pod "e8ac4aa8-1c9b-49e1-8829-d18685a168d1" (UID: "e8ac4aa8-1c9b-49e1-8829-d18685a168d1"). InnerVolumeSpecName "kube-api-access-bx59h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:21:51.084882 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.084803 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e8ac4aa8-1c9b-49e1-8829-d18685a168d1" (UID: "e8ac4aa8-1c9b-49e1-8829-d18685a168d1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:21:51.084882 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.084859 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e8ac4aa8-1c9b-49e1-8829-d18685a168d1" (UID: "e8ac4aa8-1c9b-49e1-8829-d18685a168d1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:21:51.183717 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.183642 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bx59h\" (UniqueName: \"kubernetes.io/projected/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-kube-api-access-bx59h\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\""
Apr 16 18:21:51.183717 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.183666 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-service-ca\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\""
Apr 16 18:21:51.183717 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.183675 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-oauth-config\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\""
Apr 16 18:21:51.183717 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.183684 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ac4aa8-1c9b-49e1-8829-d18685a168d1-console-serving-cert\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\""
Apr 16 18:21:51.679229 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.679199 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-797664449f-k4ssj_e8ac4aa8-1c9b-49e1-8829-d18685a168d1/console/0.log"
Apr 16 18:21:51.679671 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.679242 2570 generic.go:358] "Generic (PLEG): container finished" podID="e8ac4aa8-1c9b-49e1-8829-d18685a168d1" containerID="b710ce72f5680c0ea470f8e0eb3e0c4345f376dc467609ae194aebf3fc258db4" exitCode=2
Apr 16 18:21:51.679671 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.679308 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-797664449f-k4ssj" Apr 16 18:21:51.679671 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.679313 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-797664449f-k4ssj" event={"ID":"e8ac4aa8-1c9b-49e1-8829-d18685a168d1","Type":"ContainerDied","Data":"b710ce72f5680c0ea470f8e0eb3e0c4345f376dc467609ae194aebf3fc258db4"} Apr 16 18:21:51.679671 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.679350 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-797664449f-k4ssj" event={"ID":"e8ac4aa8-1c9b-49e1-8829-d18685a168d1","Type":"ContainerDied","Data":"aacfd19307f66ca0a5658b3b3bb84a876303ca7f02446b8bfd1f7eb6ece189ee"} Apr 16 18:21:51.679671 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.679372 2570 scope.go:117] "RemoveContainer" containerID="b710ce72f5680c0ea470f8e0eb3e0c4345f376dc467609ae194aebf3fc258db4" Apr 16 18:21:51.688392 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.688370 2570 scope.go:117] "RemoveContainer" containerID="b710ce72f5680c0ea470f8e0eb3e0c4345f376dc467609ae194aebf3fc258db4" Apr 16 18:21:51.688663 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:21:51.688644 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b710ce72f5680c0ea470f8e0eb3e0c4345f376dc467609ae194aebf3fc258db4\": container with ID starting with b710ce72f5680c0ea470f8e0eb3e0c4345f376dc467609ae194aebf3fc258db4 not found: ID does not exist" containerID="b710ce72f5680c0ea470f8e0eb3e0c4345f376dc467609ae194aebf3fc258db4" Apr 16 18:21:51.688717 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.688673 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b710ce72f5680c0ea470f8e0eb3e0c4345f376dc467609ae194aebf3fc258db4"} err="failed to get container status \"b710ce72f5680c0ea470f8e0eb3e0c4345f376dc467609ae194aebf3fc258db4\": rpc error: code = NotFound desc = could not find container \"b710ce72f5680c0ea470f8e0eb3e0c4345f376dc467609ae194aebf3fc258db4\": container with ID starting with b710ce72f5680c0ea470f8e0eb3e0c4345f376dc467609ae194aebf3fc258db4 not found: ID does not exist" Apr 16 18:21:51.703577 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.703550 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-797664449f-k4ssj"] Apr 16 18:21:51.708142 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:51.708121 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-797664449f-k4ssj"] Apr 16 18:21:52.838388 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:52.838353 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8ac4aa8-1c9b-49e1-8829-d18685a168d1" path="/var/lib/kubelet/pods/e8ac4aa8-1c9b-49e1-8829-d18685a168d1/volumes" Apr 16 18:21:57.882426 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:57.882353 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59bdd647b4-kpczf"] Apr 16 18:21:57.882875 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:57.882761 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8ac4aa8-1c9b-49e1-8829-d18685a168d1" containerName="console" Apr 16 18:21:57.882875 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:57.882775 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ac4aa8-1c9b-49e1-8829-d18685a168d1" containerName="console" Apr 16 18:21:57.882875 ip-10-0-128-74 kubenswrapper[2570]: I0416 
18:21:57.882844 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8ac4aa8-1c9b-49e1-8829-d18685a168d1" containerName="console" Apr 16 18:21:57.896910 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:57.896886 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59bdd647b4-kpczf"] Apr 16 18:21:57.897021 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:57.896995 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.045452 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.045408 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8kqc\" (UniqueName: \"kubernetes.io/projected/bddcb801-74e3-4f5a-b095-39274ec92ac0-kube-api-access-b8kqc\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.045618 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.045487 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-oauth-config\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.045618 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.045556 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-service-ca\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.045618 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.045592 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-serving-cert\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.045618 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.045616 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-config\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.045824 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.045699 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-oauth-serving-cert\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.045824 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.045755 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-trusted-ca-bundle\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " 
pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.146271 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.146191 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8kqc\" (UniqueName: \"kubernetes.io/projected/bddcb801-74e3-4f5a-b095-39274ec92ac0-kube-api-access-b8kqc\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.146271 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.146253 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-oauth-config\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.146464 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.146295 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-service-ca\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.146464 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.146329 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-serving-cert\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.146464 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.146351 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-config\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.146464 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.146383 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-oauth-serving-cert\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.146464 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.146422 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-trusted-ca-bundle\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.147323 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.147292 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-service-ca\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.147441 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.147316 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-config\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.147441 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.147355 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-trusted-ca-bundle\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.148835 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.148814 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-serving-cert\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.148835 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.148827 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-oauth-config\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.154858 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.154838 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8kqc\" (UniqueName: \"kubernetes.io/projected/bddcb801-74e3-4f5a-b095-39274ec92ac0-kube-api-access-b8kqc\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.158180 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.158164 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-oauth-serving-cert\") pod \"console-59bdd647b4-kpczf\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.207126 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.207094 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:21:58.329084 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.329029 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59bdd647b4-kpczf"] Apr 16 18:21:58.331342 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:21:58.331318 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbddcb801_74e3_4f5a_b095_39274ec92ac0.slice/crio-9ea6db0e913fdb16bfc72461bb6cc74a6b0a00960c2475acf2866c81d399d921 WatchSource:0}: Error finding container 9ea6db0e913fdb16bfc72461bb6cc74a6b0a00960c2475acf2866c81d399d921: Status 404 returned error can't find the container with id 9ea6db0e913fdb16bfc72461bb6cc74a6b0a00960c2475acf2866c81d399d921 Apr 16 18:21:58.531823 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.531792 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:21:58.532224 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.532202 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="alertmanager" containerID="cri-o://c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8" gracePeriod=120 Apr 16 18:21:58.532317 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.532272 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="kube-rbac-proxy-metric" containerID="cri-o://81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de" gracePeriod=120 Apr 16 18:21:58.532428 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.532311 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="prom-label-proxy" containerID="cri-o://526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1" gracePeriod=120 Apr 16 18:21:58.532428 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.532314 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="config-reloader" containerID="cri-o://190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c" gracePeriod=120 Apr 16 18:21:58.532428 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.532327 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="kube-rbac-proxy" containerID="cri-o://783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb" gracePeriod=120 Apr 16 18:21:58.532428 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.532290 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="kube-rbac-proxy-web" containerID="cri-o://319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5" gracePeriod=120 Apr 16 18:21:58.549918 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.549888 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs\") pod \"network-metrics-daemon-dvxrp\" (UID: \"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e\") " pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:21:58.552034 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.552015 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edeb92c2-9fa4-40ae-bb1a-a24372d25c5e-metrics-certs\") pod \"network-metrics-daemon-dvxrp\" (UID: \"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e\") " pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:21:58.637619 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.637586 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-k7xtv\"" Apr 16 18:21:58.645860 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.645823 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dvxrp" Apr 16 18:21:58.705041 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.703342 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59bdd647b4-kpczf" event={"ID":"bddcb801-74e3-4f5a-b095-39274ec92ac0","Type":"ContainerStarted","Data":"38d866ac2bd69f23bfafb6f6ee4ac358ca414f89d791fbe6409e99b62eb6c066"} Apr 16 18:21:58.705041 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.703389 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59bdd647b4-kpczf" event={"ID":"bddcb801-74e3-4f5a-b095-39274ec92ac0","Type":"ContainerStarted","Data":"9ea6db0e913fdb16bfc72461bb6cc74a6b0a00960c2475acf2866c81d399d921"} Apr 16 18:21:58.708041 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.708010 2570 generic.go:358] "Generic (PLEG): container finished" podID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerID="526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1" exitCode=0 Apr 16 18:21:58.708041 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.708041 2570 generic.go:358] "Generic (PLEG): container finished" podID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerID="783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb" exitCode=0 Apr 16 18:21:58.708215 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.708076 2570 generic.go:358] "Generic (PLEG): container finished" podID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerID="190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c" exitCode=0 Apr 16 18:21:58.708215 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.708086 2570 generic.go:358] "Generic (PLEG): container finished" podID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerID="c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8" exitCode=0 Apr 16 18:21:58.708215 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.708106 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ab89ed0-19d8-40ee-8801-b4f260443290","Type":"ContainerDied","Data":"526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1"} Apr 16 18:21:58.708215 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.708150 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ab89ed0-19d8-40ee-8801-b4f260443290","Type":"ContainerDied","Data":"783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb"} Apr 16 18:21:58.708215 ip-10-0-128-74 kubenswrapper[2570]: I0416 
18:21:58.708164 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ab89ed0-19d8-40ee-8801-b4f260443290","Type":"ContainerDied","Data":"190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c"} Apr 16 18:21:58.708215 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.708176 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ab89ed0-19d8-40ee-8801-b4f260443290","Type":"ContainerDied","Data":"c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8"} Apr 16 18:21:58.724833 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.724730 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59bdd647b4-kpczf" podStartSLOduration=1.7247098410000001 podStartE2EDuration="1.724709841s" podCreationTimestamp="2026-04-16 18:21:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:21:58.723670897 +0000 UTC m=+252.503135231" watchObservedRunningTime="2026-04-16 18:21:58.724709841 +0000 UTC m=+252.504174173" Apr 16 18:21:58.769045 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:58.768990 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dvxrp"] Apr 16 18:21:58.771259 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:21:58.771231 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedeb92c2_9fa4_40ae_bb1a_a24372d25c5e.slice/crio-10b07bcab71271bdf88d7a0917650d829e10644f301a69f520ce582f1323af14 WatchSource:0}: Error finding container 10b07bcab71271bdf88d7a0917650d829e10644f301a69f520ce582f1323af14: Status 404 returned error can't find the container with id 10b07bcab71271bdf88d7a0917650d829e10644f301a69f520ce582f1323af14 Apr 16 18:21:59.712574 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:59.712541 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dvxrp" event={"ID":"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e","Type":"ContainerStarted","Data":"10b07bcab71271bdf88d7a0917650d829e10644f301a69f520ce582f1323af14"} Apr 16 18:21:59.966777 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:21:59.966755 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:22:00.063823 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.063796 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-main-tls\") pod \"8ab89ed0-19d8-40ee-8801-b4f260443290\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " Apr 16 18:22:00.063985 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.063843 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8ab89ed0-19d8-40ee-8801-b4f260443290-alertmanager-main-db\") pod \"8ab89ed0-19d8-40ee-8801-b4f260443290\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " Apr 16 18:22:00.063985 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.063875 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy-metric\") pod \"8ab89ed0-19d8-40ee-8801-b4f260443290\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " Apr 16 18:22:00.063985 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.063908 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ab89ed0-19d8-40ee-8801-b4f260443290-metrics-client-ca\") pod \"8ab89ed0-19d8-40ee-8801-b4f260443290\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " Apr 16 18:22:00.063985 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.063929 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-web-config\") pod \"8ab89ed0-19d8-40ee-8801-b4f260443290\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " Apr 16 18:22:00.063985 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.063953 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-config-volume\") pod \"8ab89ed0-19d8-40ee-8801-b4f260443290\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " Apr 16 18:22:00.063985 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.063982 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ab89ed0-19d8-40ee-8801-b4f260443290-alertmanager-trusted-ca-bundle\") pod \"8ab89ed0-19d8-40ee-8801-b4f260443290\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " Apr 16 18:22:00.064313 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.064024 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8ab89ed0-19d8-40ee-8801-b4f260443290-config-out\") pod \"8ab89ed0-19d8-40ee-8801-b4f260443290\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " Apr 16 18:22:00.064313 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.064050 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-cluster-tls-config\") pod \"8ab89ed0-19d8-40ee-8801-b4f260443290\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " Apr 16 18:22:00.064313 
ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.064118 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy\") pod \"8ab89ed0-19d8-40ee-8801-b4f260443290\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " Apr 16 18:22:00.064313 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.064162 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy-web\") pod \"8ab89ed0-19d8-40ee-8801-b4f260443290\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " Apr 16 18:22:00.064313 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.064200 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5chm\" (UniqueName: \"kubernetes.io/projected/8ab89ed0-19d8-40ee-8801-b4f260443290-kube-api-access-q5chm\") pod \"8ab89ed0-19d8-40ee-8801-b4f260443290\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " Apr 16 18:22:00.064313 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.064248 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8ab89ed0-19d8-40ee-8801-b4f260443290-tls-assets\") pod \"8ab89ed0-19d8-40ee-8801-b4f260443290\" (UID: \"8ab89ed0-19d8-40ee-8801-b4f260443290\") " Apr 16 18:22:00.064579 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.064422 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ab89ed0-19d8-40ee-8801-b4f260443290-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "8ab89ed0-19d8-40ee-8801-b4f260443290" (UID: "8ab89ed0-19d8-40ee-8801-b4f260443290"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:22:00.064579 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.064438 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ab89ed0-19d8-40ee-8801-b4f260443290-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "8ab89ed0-19d8-40ee-8801-b4f260443290" (UID: "8ab89ed0-19d8-40ee-8801-b4f260443290"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:22:00.064579 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.064511 2570 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ab89ed0-19d8-40ee-8801-b4f260443290-metrics-client-ca\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:00.064579 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.064529 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ab89ed0-19d8-40ee-8801-b4f260443290-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:00.064579 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.064506 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab89ed0-19d8-40ee-8801-b4f260443290-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "8ab89ed0-19d8-40ee-8801-b4f260443290" (UID: "8ab89ed0-19d8-40ee-8801-b4f260443290"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:22:00.066933 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.066903 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "8ab89ed0-19d8-40ee-8801-b4f260443290" (UID: "8ab89ed0-19d8-40ee-8801-b4f260443290"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:00.067263 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.067226 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "8ab89ed0-19d8-40ee-8801-b4f260443290" (UID: "8ab89ed0-19d8-40ee-8801-b4f260443290"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:00.067382 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.067279 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-config-volume" (OuterVolumeSpecName: "config-volume") pod "8ab89ed0-19d8-40ee-8801-b4f260443290" (UID: "8ab89ed0-19d8-40ee-8801-b4f260443290"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:00.067382 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.067300 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "8ab89ed0-19d8-40ee-8801-b4f260443290" (UID: "8ab89ed0-19d8-40ee-8801-b4f260443290"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:00.067509 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.067486 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab89ed0-19d8-40ee-8801-b4f260443290-config-out" (OuterVolumeSpecName: "config-out") pod "8ab89ed0-19d8-40ee-8801-b4f260443290" (UID: "8ab89ed0-19d8-40ee-8801-b4f260443290"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:22:00.067702 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.067673 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "8ab89ed0-19d8-40ee-8801-b4f260443290" (UID: "8ab89ed0-19d8-40ee-8801-b4f260443290"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:00.068266 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.068244 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab89ed0-19d8-40ee-8801-b4f260443290-kube-api-access-q5chm" (OuterVolumeSpecName: "kube-api-access-q5chm") pod "8ab89ed0-19d8-40ee-8801-b4f260443290" (UID: "8ab89ed0-19d8-40ee-8801-b4f260443290"). InnerVolumeSpecName "kube-api-access-q5chm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:22:00.068525 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.068508 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab89ed0-19d8-40ee-8801-b4f260443290-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8ab89ed0-19d8-40ee-8801-b4f260443290" (UID: "8ab89ed0-19d8-40ee-8801-b4f260443290"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:22:00.071221 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.071201 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "8ab89ed0-19d8-40ee-8801-b4f260443290" (UID: "8ab89ed0-19d8-40ee-8801-b4f260443290"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:00.077522 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.077501 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-web-config" (OuterVolumeSpecName: "web-config") pod "8ab89ed0-19d8-40ee-8801-b4f260443290" (UID: "8ab89ed0-19d8-40ee-8801-b4f260443290"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:00.165870 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.165840 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:00.165870 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.165867 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q5chm\" (UniqueName: \"kubernetes.io/projected/8ab89ed0-19d8-40ee-8801-b4f260443290-kube-api-access-q5chm\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:00.166010 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.165877 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8ab89ed0-19d8-40ee-8801-b4f260443290-tls-assets\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:00.166010 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.165887 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-main-tls\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:00.166010 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.165896 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8ab89ed0-19d8-40ee-8801-b4f260443290-alertmanager-main-db\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:00.166010 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.165906 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:00.166010 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.165916 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-web-config\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:00.166010 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.165925 2570 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-config-volume\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:00.166010 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.165932 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8ab89ed0-19d8-40ee-8801-b4f260443290-config-out\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:00.166010 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.165939 2570 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-cluster-tls-config\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:00.166010 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.165949 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/8ab89ed0-19d8-40ee-8801-b4f260443290-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:00.721844 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.721758 2570 generic.go:358] "Generic (PLEG): container finished" podID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerID="81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de" exitCode=0 Apr 16 18:22:00.721844 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.721784 2570 generic.go:358] "Generic (PLEG): container finished" podID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerID="319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5" exitCode=0 Apr 16 18:22:00.722306 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.721836 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ab89ed0-19d8-40ee-8801-b4f260443290","Type":"ContainerDied","Data":"81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de"} Apr 16 18:22:00.722306 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.721862 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:22:00.722306 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.721874 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ab89ed0-19d8-40ee-8801-b4f260443290","Type":"ContainerDied","Data":"319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5"} Apr 16 18:22:00.722306 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.721890 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ab89ed0-19d8-40ee-8801-b4f260443290","Type":"ContainerDied","Data":"8a06e17d2423023d8a81131ba03ff03839ab7d31e799e5d90cd81ae0c76f1930"} Apr 16 18:22:00.722306 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.721911 2570 scope.go:117] "RemoveContainer" containerID="526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1" Apr 16 18:22:00.723620 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.723588 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dvxrp" event={"ID":"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e","Type":"ContainerStarted","Data":"538182d1995bd53b794bc1a8e7aab976414a822500375fe58081f9af9e93055e"} Apr 16 18:22:00.723620 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.723616 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dvxrp" event={"ID":"edeb92c2-9fa4-40ae-bb1a-a24372d25c5e","Type":"ContainerStarted","Data":"454c22e918b01cb3ce115a7b5f98a21b5cdd66dc7397af068de6b6533993234d"} Apr 16 18:22:00.729726 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.729687 2570 scope.go:117] "RemoveContainer" containerID="81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de" Apr 16 18:22:00.736371 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.736355 2570 scope.go:117] "RemoveContainer" containerID="783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb" Apr 16 18:22:00.742704 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.742688 2570 scope.go:117] "RemoveContainer" containerID="319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5" Apr 16 18:22:00.747678 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.747612 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-dvxrp" podStartSLOduration=253.345625865 podStartE2EDuration="4m14.747597142s" podCreationTimestamp="2026-04-16 18:17:46 +0000 UTC" firstStartedPulling="2026-04-16 18:21:58.773040393 +0000 UTC m=+252.552504702" lastFinishedPulling="2026-04-16 18:22:00.175011653 +0000 UTC m=+253.954475979" observedRunningTime="2026-04-16 18:22:00.74738166 +0000 UTC m=+254.526845991" watchObservedRunningTime="2026-04-16 18:22:00.747597142 +0000 UTC m=+254.527061471" Apr 16 18:22:00.750250 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.750235 2570 scope.go:117] "RemoveContainer" containerID="190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c" Apr 16 18:22:00.756823 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.756807 2570 scope.go:117] "RemoveContainer" containerID="c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8" Apr 16 18:22:00.763404 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.763388 2570 scope.go:117] "RemoveContainer" containerID="e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce" Apr 16 18:22:00.766859 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.766837 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:22:00.770317 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.770299 2570 scope.go:117] "RemoveContainer" containerID="526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1" Apr 16 18:22:00.770740 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:22:00.770719 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1\": container with ID starting with 526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1 not found: ID does not exist" containerID="526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1" Apr 16 18:22:00.770817 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.770747 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1"} err="failed to get container status \"526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1\": rpc error: code = NotFound desc = could not find container \"526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1\": container with ID starting with 526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1 not found: ID does not exist" Apr 16 18:22:00.770817 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.770764 2570 scope.go:117] "RemoveContainer" containerID="81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de" Apr 16 18:22:00.771109 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:22:00.771084 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de\": container with ID starting with 81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de not found: ID does not exist" containerID="81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de" Apr 16 18:22:00.771191 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.771117 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de"} err="failed to get container status 
\"81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de\": rpc error: code = NotFound desc = could not find container \"81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de\": container with ID starting with 81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de not found: ID does not exist" Apr 16 18:22:00.771191 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.771139 2570 scope.go:117] "RemoveContainer" containerID="783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb" Apr 16 18:22:00.771432 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:22:00.771413 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb\": container with ID starting with 783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb not found: ID does not exist" containerID="783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb" Apr 16 18:22:00.771515 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.771437 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb"} err="failed to get container status \"783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb\": rpc error: code = NotFound desc = could not find container \"783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb\": container with ID starting with 783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb not found: ID does not exist" Apr 16 18:22:00.771515 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.771456 2570 scope.go:117] "RemoveContainer" containerID="319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5" Apr 16 18:22:00.771740 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:22:00.771706 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5\": container with ID starting with 319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5 not found: ID does not exist" containerID="319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5" Apr 16 18:22:00.771782 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.771742 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5"} err="failed to get container status \"319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5\": rpc error: code = NotFound desc = could not find container \"319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5\": container with ID starting with 319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5 not found: ID does not exist" Apr 16 18:22:00.771782 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.771761 2570 scope.go:117] "RemoveContainer" containerID="190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c" Apr 16 18:22:00.772015 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:22:00.771996 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c\": container with ID starting with 190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c not found: ID does not exist" 
containerID="190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c" Apr 16 18:22:00.772081 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.772019 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c"} err="failed to get container status \"190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c\": rpc error: code = NotFound desc = could not find container \"190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c\": container with ID starting with 190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c not found: ID does not exist" Apr 16 18:22:00.772081 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.772036 2570 scope.go:117] "RemoveContainer" containerID="c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8" Apr 16 18:22:00.772302 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:22:00.772278 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8\": container with ID starting with c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8 not found: ID does not exist" containerID="c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8" Apr 16 18:22:00.772356 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.772309 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8"} err="failed to get container status \"c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8\": rpc error: code = NotFound desc = could not find container \"c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8\": container with ID starting with c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8 not found: ID does not exist" Apr 16 18:22:00.772356 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.772331 2570 scope.go:117] "RemoveContainer" containerID="e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce" Apr 16 18:22:00.772548 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:22:00.772526 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce\": container with ID starting with e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce not found: ID does not exist" containerID="e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce" Apr 16 18:22:00.772655 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.772551 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce"} err="failed to get container status \"e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce\": rpc error: code = NotFound desc = could not find container \"e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce\": container with ID starting with e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce not found: ID does not exist" Apr 16 18:22:00.772655 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.772564 2570 scope.go:117] "RemoveContainer" containerID="526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1" Apr 16 18:22:00.772766 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.772755 2570 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1"} err="failed to get container status \"526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1\": rpc error: code = NotFound desc = could not find container \"526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1\": container with ID starting with 526f23457ae5bab84fa5fac4c79b2ea9e5e19a13f11980207e5e55f8c34f32d1 not found: ID does not exist" Apr 16 18:22:00.772823 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.772769 2570 scope.go:117] "RemoveContainer" containerID="81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de" Apr 16 18:22:00.772989 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.772972 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de"} err="failed to get container status \"81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de\": rpc error: code = NotFound desc = could not find container \"81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de\": container with ID starting with 81b31194764983a0f901dfe62708fad5922d9cc388a5ec48335ef2689a1316de not found: ID does not exist" Apr 16 18:22:00.773048 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.772988 2570 scope.go:117] "RemoveContainer" containerID="783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb" Apr 16 18:22:00.773262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.773236 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb"} err="failed to get container status \"783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb\": rpc error: code = NotFound desc = could not find container \"783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb\": container with ID starting with 783f0b135c1caca2c0162413dc0a840aeb08399403dce12b11416e28f07750cb not found: ID does not exist" Apr 16 18:22:00.773262 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.773254 2570 scope.go:117] "RemoveContainer" containerID="319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5" Apr 16 18:22:00.773364 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.773281 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:22:00.773455 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.773436 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5"} err="failed to get container status \"319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5\": rpc error: code = NotFound desc = could not find container \"319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5\": container with ID starting with 319edb12731bf36ad43c402ad2b73bc8c12c48e0d3eab7b723db65847503b6f5 not found: ID does not exist" Apr 16 18:22:00.773504 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.773455 2570 scope.go:117] "RemoveContainer" containerID="190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c" Apr 16 18:22:00.773662 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.773644 2570 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c"} err="failed to get container status \"190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c\": rpc error: code = NotFound desc = could not find container \"190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c\": container with ID starting with 190e64aa29bc222b1ed71399e8122a9e55e317f6afd09f4cbb6b6336561b303c not found: ID does not exist" Apr 16 18:22:00.773722 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.773666 2570 scope.go:117] "RemoveContainer" containerID="c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8" Apr 16 18:22:00.773881 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.773861 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8"} err="failed to get container status \"c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8\": rpc error: code = NotFound desc = could not find container \"c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8\": container with ID starting with c2c593efad5e19e4b123cca108095f634964c94c54ef40200248568b938865c8 not found: ID does not exist" Apr 16 18:22:00.773923 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.773882 2570 scope.go:117] "RemoveContainer" containerID="e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce" Apr 16 18:22:00.774114 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.774097 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce"} err="failed to get container status \"e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce\": rpc error: code = NotFound desc = could not find container \"e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce\": container with ID starting with e81917fa7a9f3094ed95cde36c2b2c72302c33aca94db1023a6ec235ce4100ce not found: ID does not exist" Apr 16 18:22:00.812561 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.812532 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:22:00.812875 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.812861 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="kube-rbac-proxy-web" Apr 16 18:22:00.812922 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.812876 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="kube-rbac-proxy-web" Apr 16 18:22:00.812922 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.812889 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="alertmanager" Apr 16 18:22:00.812922 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.812894 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="alertmanager" Apr 16 18:22:00.812922 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.812900 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="kube-rbac-proxy-metric" Apr 16 18:22:00.812922 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.812905 2570 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="kube-rbac-proxy-metric" Apr 16 18:22:00.812922 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.812916 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="kube-rbac-proxy" Apr 16 18:22:00.812922 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.812920 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="kube-rbac-proxy" Apr 16 18:22:00.813166 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.812929 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="prom-label-proxy" Apr 16 18:22:00.813166 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.812934 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="prom-label-proxy" Apr 16 18:22:00.813166 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.812944 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="init-config-reloader" Apr 16 18:22:00.813166 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.812949 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="init-config-reloader" Apr 16 18:22:00.813166 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.812957 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="config-reloader" Apr 16 18:22:00.813166 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.812962 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="config-reloader" Apr 16 18:22:00.813166 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.813005 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="kube-rbac-proxy-metric" Apr 16 18:22:00.813166 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.813013 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="prom-label-proxy" Apr 16 18:22:00.813166 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.813020 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="alertmanager" Apr 16 18:22:00.813166 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.813028 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="config-reloader" Apr 16 18:22:00.813166 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.813033 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="kube-rbac-proxy" Apr 16 18:22:00.813166 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.813039 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" containerName="kube-rbac-proxy-web" Apr 16 18:22:00.817960 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.817945 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:22:00.820680 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.820650 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 18:22:00.820801 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.820650 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 18:22:00.820801 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.820649 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-6tmsv\"" Apr 16 18:22:00.820941 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.820928 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 18:22:00.821071 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.821019 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 18:22:00.821071 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.821027 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 18:22:00.821176 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.821077 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 18:22:00.821176 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.821135 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 18:22:00.821514 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.821500 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 18:22:00.831961 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.831269 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 18:22:00.838359 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.838336 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab89ed0-19d8-40ee-8801-b4f260443290" path="/var/lib/kubelet/pods/8ab89ed0-19d8-40ee-8801-b4f260443290/volumes" Apr 16 18:22:00.838829 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.838816 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:22:00.973597 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.973499 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:22:00.973597 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.973599 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50d4252d-1296-4c50-8912-925ab6d41f3c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:22:00.973814 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.973793 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50d4252d-1296-4c50-8912-925ab6d41f3c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:22:00.973891 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.973825 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/50d4252d-1296-4c50-8912-925ab6d41f3c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:22:00.973891 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.973845 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:22:00.973891 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.973869 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/50d4252d-1296-4c50-8912-925ab6d41f3c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:22:00.974070 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.973973 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:22:00.974148 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.974043 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-config-volume\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:22:00.974202 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.974145 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:22:00.974253 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.974202 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/50d4252d-1296-4c50-8912-925ab6d41f3c-config-out\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:22:00.974327 ip-10-0-128-74 
Apr 16 18:22:00.974385 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.974350 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:00.974385 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:00.974371 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pq4f\" (UniqueName: \"kubernetes.io/projected/50d4252d-1296-4c50-8912-925ab6d41f3c-kube-api-access-9pq4f\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.074790 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.074750 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.074790 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.074794 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pq4f\" (UniqueName: \"kubernetes.io/projected/50d4252d-1296-4c50-8912-925ab6d41f3c-kube-api-access-9pq4f\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.074987 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.074817 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.074987 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.074841 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50d4252d-1296-4c50-8912-925ab6d41f3c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.074987 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.074870 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50d4252d-1296-4c50-8912-925ab6d41f3c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.074987 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.074887 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/50d4252d-1296-4c50-8912-925ab6d41f3c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.074987 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.074901 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.074987 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.074918 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/50d4252d-1296-4c50-8912-925ab6d41f3c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.074987 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.074972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.075379 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.075010 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-config-volume\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.075379 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.075048 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.075379 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.075110 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/50d4252d-1296-4c50-8912-925ab6d41f3c-config-out\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.075379 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.075131 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-web-config\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.075775 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.075747 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50d4252d-1296-4c50-8912-925ab6d41f3c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.075851 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.075778 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50d4252d-1296-4c50-8912-925ab6d41f3c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.076075 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.076035 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/50d4252d-1296-4c50-8912-925ab6d41f3c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.078129 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.077889 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-web-config\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.078129 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.078000 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.078129 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.078072 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-config-volume\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.078364 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.078325 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.078436 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.078412 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.078486 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.078467 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.078593 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.078577 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/50d4252d-1296-4c50-8912-925ab6d41f3c-config-out\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.078625 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.078606 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/50d4252d-1296-4c50-8912-925ab6d41f3c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.079692 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.079672 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/50d4252d-1296-4c50-8912-925ab6d41f3c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.084513 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.084494 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pq4f\" (UniqueName: \"kubernetes.io/projected/50d4252d-1296-4c50-8912-925ab6d41f3c-kube-api-access-9pq4f\") pod \"alertmanager-main-0\" (UID: \"50d4252d-1296-4c50-8912-925ab6d41f3c\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.132131 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.132099 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:22:01.281018 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.280988 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:22:01.282307 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:22:01.282282 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50d4252d_1296_4c50_8912_925ab6d41f3c.slice/crio-dfe4f7f589f296d2a65890dd9fd024f6e4e7728231c67947bf88d64a4a077139 WatchSource:0}: Error finding container dfe4f7f589f296d2a65890dd9fd024f6e4e7728231c67947bf88d64a4a077139: Status 404 returned error can't find the container with id dfe4f7f589f296d2a65890dd9fd024f6e4e7728231c67947bf88d64a4a077139
Apr 16 18:22:01.728271 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.728192 2570 generic.go:358] "Generic (PLEG): container finished" podID="50d4252d-1296-4c50-8912-925ab6d41f3c" containerID="26a623614e0e52586253ada4bf4bcad6ed6902c6e3695ca8355aae7ad6cad2db" exitCode=0
Apr 16 18:22:01.728619 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.728278 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"50d4252d-1296-4c50-8912-925ab6d41f3c","Type":"ContainerDied","Data":"26a623614e0e52586253ada4bf4bcad6ed6902c6e3695ca8355aae7ad6cad2db"}
Apr 16 18:22:01.728619 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:01.728311 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"50d4252d-1296-4c50-8912-925ab6d41f3c","Type":"ContainerStarted","Data":"dfe4f7f589f296d2a65890dd9fd024f6e4e7728231c67947bf88d64a4a077139"}
Apr 16 18:22:02.735173 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:02.735138 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"50d4252d-1296-4c50-8912-925ab6d41f3c","Type":"ContainerStarted","Data":"3d69c5c825280710e5e78e6f8ebd24b170035ef4df2dfb2486c917f2e91b201e"}
event={"ID":"50d4252d-1296-4c50-8912-925ab6d41f3c","Type":"ContainerStarted","Data":"3d69c5c825280710e5e78e6f8ebd24b170035ef4df2dfb2486c917f2e91b201e"} Apr 16 18:22:02.735173 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:02.735171 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"50d4252d-1296-4c50-8912-925ab6d41f3c","Type":"ContainerStarted","Data":"4d3c92869b3910be844c3c6acdb9a5d5fbda8b03d56fe82df1c967ac36f9724a"} Apr 16 18:22:02.735173 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:02.735182 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"50d4252d-1296-4c50-8912-925ab6d41f3c","Type":"ContainerStarted","Data":"fd861d65d9a8842da6dad3e687e03774f5bd0575cd7952d18db2f4097b30fbf2"} Apr 16 18:22:02.735746 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:02.735191 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"50d4252d-1296-4c50-8912-925ab6d41f3c","Type":"ContainerStarted","Data":"1f3be9d21528ac01beb13fbcebbab020a303a1184a943f1d8bba0fd1d11bd359"} Apr 16 18:22:02.735746 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:02.735199 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"50d4252d-1296-4c50-8912-925ab6d41f3c","Type":"ContainerStarted","Data":"af85c69956553b2edeb142f974c3515bb6e9c96cfea89c2833149a12f4dc58cb"} Apr 16 18:22:02.735746 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:02.735207 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"50d4252d-1296-4c50-8912-925ab6d41f3c","Type":"ContainerStarted","Data":"85b1f827aadd72c3c44e27a3099297d5d89278a7ff9ced417a76af253c29ce14"} Apr 16 18:22:02.790645 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:02.790598 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.790583552 podStartE2EDuration="2.790583552s" podCreationTimestamp="2026-04-16 18:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:22:02.785537941 +0000 UTC m=+256.565002282" watchObservedRunningTime="2026-04-16 18:22:02.790583552 +0000 UTC m=+256.570047883" Apr 16 18:22:08.207701 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:08.207664 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:22:08.207701 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:08.207709 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:22:08.212136 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:08.212113 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:22:08.757137 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:08.757112 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:22:08.819337 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:08.819299 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7685656dcb-skmjz"] Apr 16 18:22:31.313724 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.313681 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/global-pull-secret-syncer-l5lq6"] Apr 16 18:22:31.317120 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.317100 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l5lq6" Apr 16 18:22:31.319662 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.319647 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:22:31.322890 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.322859 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-l5lq6"] Apr 16 18:22:31.410887 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.410850 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/90529082-bc95-49c6-a8ff-05a611805241-kubelet-config\") pod \"global-pull-secret-syncer-l5lq6\" (UID: \"90529082-bc95-49c6-a8ff-05a611805241\") " pod="kube-system/global-pull-secret-syncer-l5lq6" Apr 16 18:22:31.410887 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.410895 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/90529082-bc95-49c6-a8ff-05a611805241-dbus\") pod \"global-pull-secret-syncer-l5lq6\" (UID: \"90529082-bc95-49c6-a8ff-05a611805241\") " pod="kube-system/global-pull-secret-syncer-l5lq6" Apr 16 18:22:31.411163 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.411027 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90529082-bc95-49c6-a8ff-05a611805241-original-pull-secret\") pod \"global-pull-secret-syncer-l5lq6\" (UID: \"90529082-bc95-49c6-a8ff-05a611805241\") " pod="kube-system/global-pull-secret-syncer-l5lq6" Apr 16 18:22:31.512446 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.512404 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90529082-bc95-49c6-a8ff-05a611805241-original-pull-secret\") pod \"global-pull-secret-syncer-l5lq6\" (UID: \"90529082-bc95-49c6-a8ff-05a611805241\") " pod="kube-system/global-pull-secret-syncer-l5lq6" Apr 16 18:22:31.512446 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.512456 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/90529082-bc95-49c6-a8ff-05a611805241-kubelet-config\") pod \"global-pull-secret-syncer-l5lq6\" (UID: \"90529082-bc95-49c6-a8ff-05a611805241\") " pod="kube-system/global-pull-secret-syncer-l5lq6" Apr 16 18:22:31.512685 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.512476 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/90529082-bc95-49c6-a8ff-05a611805241-dbus\") pod \"global-pull-secret-syncer-l5lq6\" (UID: \"90529082-bc95-49c6-a8ff-05a611805241\") " pod="kube-system/global-pull-secret-syncer-l5lq6" Apr 16 18:22:31.512685 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.512570 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/90529082-bc95-49c6-a8ff-05a611805241-kubelet-config\") pod \"global-pull-secret-syncer-l5lq6\" (UID: \"90529082-bc95-49c6-a8ff-05a611805241\") " 
pod="kube-system/global-pull-secret-syncer-l5lq6" Apr 16 18:22:31.512685 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.512613 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/90529082-bc95-49c6-a8ff-05a611805241-dbus\") pod \"global-pull-secret-syncer-l5lq6\" (UID: \"90529082-bc95-49c6-a8ff-05a611805241\") " pod="kube-system/global-pull-secret-syncer-l5lq6" Apr 16 18:22:31.514694 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.514665 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90529082-bc95-49c6-a8ff-05a611805241-original-pull-secret\") pod \"global-pull-secret-syncer-l5lq6\" (UID: \"90529082-bc95-49c6-a8ff-05a611805241\") " pod="kube-system/global-pull-secret-syncer-l5lq6" Apr 16 18:22:31.626929 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.626826 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l5lq6" Apr 16 18:22:31.770296 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.770266 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-l5lq6"] Apr 16 18:22:31.773536 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:22:31.773506 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90529082_bc95_49c6_a8ff_05a611805241.slice/crio-7a675bf7ee1a02bbb257a8e6d7860f755a623045752621bc535e98194b2578f3 WatchSource:0}: Error finding container 7a675bf7ee1a02bbb257a8e6d7860f755a623045752621bc535e98194b2578f3: Status 404 returned error can't find the container with id 7a675bf7ee1a02bbb257a8e6d7860f755a623045752621bc535e98194b2578f3 Apr 16 18:22:31.824129 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:31.824078 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-l5lq6" event={"ID":"90529082-bc95-49c6-a8ff-05a611805241","Type":"ContainerStarted","Data":"7a675bf7ee1a02bbb257a8e6d7860f755a623045752621bc535e98194b2578f3"} Apr 16 18:22:33.839840 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:33.839770 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7685656dcb-skmjz" podUID="7163fad7-77d3-43e7-9464-6752911355f3" containerName="console" containerID="cri-o://c9c436947802125098df24c74155416e8875110c4ded3dff1a5e5f0c4dbfda00" gracePeriod=15 Apr 16 18:22:34.093830 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.093771 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7685656dcb-skmjz_7163fad7-77d3-43e7-9464-6752911355f3/console/0.log" Apr 16 18:22:34.093951 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.093835 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7685656dcb-skmjz" Apr 16 18:22:34.134025 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.133990 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-service-ca\") pod \"7163fad7-77d3-43e7-9464-6752911355f3\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " Apr 16 18:22:34.134230 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.134077 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbn5k\" (UniqueName: \"kubernetes.io/projected/7163fad7-77d3-43e7-9464-6752911355f3-kube-api-access-gbn5k\") pod \"7163fad7-77d3-43e7-9464-6752911355f3\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " Apr 16 18:22:34.134230 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.134178 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-console-config\") pod \"7163fad7-77d3-43e7-9464-6752911355f3\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " Apr 16 18:22:34.134350 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.134240 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7163fad7-77d3-43e7-9464-6752911355f3-console-oauth-config\") pod \"7163fad7-77d3-43e7-9464-6752911355f3\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " Apr 16 18:22:34.134350 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.134278 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7163fad7-77d3-43e7-9464-6752911355f3-console-serving-cert\") pod \"7163fad7-77d3-43e7-9464-6752911355f3\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " Apr 16 18:22:34.134350 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.134328 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-trusted-ca-bundle\") pod \"7163fad7-77d3-43e7-9464-6752911355f3\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " Apr 16 18:22:34.134487 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.134354 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-oauth-serving-cert\") pod \"7163fad7-77d3-43e7-9464-6752911355f3\" (UID: \"7163fad7-77d3-43e7-9464-6752911355f3\") " Apr 16 18:22:34.134550 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.134513 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-service-ca" (OuterVolumeSpecName: "service-ca") pod "7163fad7-77d3-43e7-9464-6752911355f3" (UID: "7163fad7-77d3-43e7-9464-6752911355f3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:22:34.134776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.134676 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-service-ca\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:34.134874 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.134820 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7163fad7-77d3-43e7-9464-6752911355f3" (UID: "7163fad7-77d3-43e7-9464-6752911355f3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:22:34.134874 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.134832 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-console-config" (OuterVolumeSpecName: "console-config") pod "7163fad7-77d3-43e7-9464-6752911355f3" (UID: "7163fad7-77d3-43e7-9464-6752911355f3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:22:34.134973 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.134870 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7163fad7-77d3-43e7-9464-6752911355f3" (UID: "7163fad7-77d3-43e7-9464-6752911355f3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:22:34.136880 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.136833 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7163fad7-77d3-43e7-9464-6752911355f3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7163fad7-77d3-43e7-9464-6752911355f3" (UID: "7163fad7-77d3-43e7-9464-6752911355f3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:34.137093 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.137047 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7163fad7-77d3-43e7-9464-6752911355f3-kube-api-access-gbn5k" (OuterVolumeSpecName: "kube-api-access-gbn5k") pod "7163fad7-77d3-43e7-9464-6752911355f3" (UID: "7163fad7-77d3-43e7-9464-6752911355f3"). InnerVolumeSpecName "kube-api-access-gbn5k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:22:34.137201 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.137111 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7163fad7-77d3-43e7-9464-6752911355f3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7163fad7-77d3-43e7-9464-6752911355f3" (UID: "7163fad7-77d3-43e7-9464-6752911355f3"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:34.235396 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.235362 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gbn5k\" (UniqueName: \"kubernetes.io/projected/7163fad7-77d3-43e7-9464-6752911355f3-kube-api-access-gbn5k\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:34.235396 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.235399 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-console-config\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:34.235611 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.235415 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7163fad7-77d3-43e7-9464-6752911355f3-console-oauth-config\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:34.235611 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.235429 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7163fad7-77d3-43e7-9464-6752911355f3-console-serving-cert\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:34.235611 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.235442 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-trusted-ca-bundle\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:34.235611 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.235458 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7163fad7-77d3-43e7-9464-6752911355f3-oauth-serving-cert\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:22:34.837579 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.837536 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7685656dcb-skmjz_7163fad7-77d3-43e7-9464-6752911355f3/console/0.log" Apr 16 18:22:34.837766 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.837594 2570 generic.go:358] "Generic (PLEG): container finished" podID="7163fad7-77d3-43e7-9464-6752911355f3" containerID="c9c436947802125098df24c74155416e8875110c4ded3dff1a5e5f0c4dbfda00" exitCode=2 Apr 16 18:22:34.837766 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.837717 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7685656dcb-skmjz" Apr 16 18:22:34.838544 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.838510 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7685656dcb-skmjz" event={"ID":"7163fad7-77d3-43e7-9464-6752911355f3","Type":"ContainerDied","Data":"c9c436947802125098df24c74155416e8875110c4ded3dff1a5e5f0c4dbfda00"} Apr 16 18:22:34.838658 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.838552 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7685656dcb-skmjz" event={"ID":"7163fad7-77d3-43e7-9464-6752911355f3","Type":"ContainerDied","Data":"961229caead4176c3fb35f4ba3df80715816d39e48b3d68c7d761adbeb44af7f"} Apr 16 18:22:34.838658 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.838581 2570 scope.go:117] "RemoveContainer" containerID="c9c436947802125098df24c74155416e8875110c4ded3dff1a5e5f0c4dbfda00" Apr 16 18:22:34.865989 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.865957 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7685656dcb-skmjz"] Apr 16 18:22:34.873294 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:34.873262 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7685656dcb-skmjz"] Apr 16 18:22:35.765071 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:35.764892 2570 scope.go:117] "RemoveContainer" containerID="c9c436947802125098df24c74155416e8875110c4ded3dff1a5e5f0c4dbfda00" Apr 16 18:22:35.765337 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:22:35.765316 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9c436947802125098df24c74155416e8875110c4ded3dff1a5e5f0c4dbfda00\": container with ID starting with c9c436947802125098df24c74155416e8875110c4ded3dff1a5e5f0c4dbfda00 not found: ID does not exist" containerID="c9c436947802125098df24c74155416e8875110c4ded3dff1a5e5f0c4dbfda00" Apr 16 18:22:35.765437 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:35.765349 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c436947802125098df24c74155416e8875110c4ded3dff1a5e5f0c4dbfda00"} err="failed to get container status \"c9c436947802125098df24c74155416e8875110c4ded3dff1a5e5f0c4dbfda00\": rpc error: code = NotFound desc = could not find container \"c9c436947802125098df24c74155416e8875110c4ded3dff1a5e5f0c4dbfda00\": container with ID starting with c9c436947802125098df24c74155416e8875110c4ded3dff1a5e5f0c4dbfda00 not found: ID does not exist" Apr 16 18:22:36.838102 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:36.838044 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7163fad7-77d3-43e7-9464-6752911355f3" path="/var/lib/kubelet/pods/7163fad7-77d3-43e7-9464-6752911355f3/volumes" Apr 16 18:22:36.846240 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:36.846206 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-l5lq6" event={"ID":"90529082-bc95-49c6-a8ff-05a611805241","Type":"ContainerStarted","Data":"6b3b06653b9eaf8d8366cc70fb6a0bbb2f192697c50e856bceebdb1ab6f1c458"} Apr 16 18:22:36.869475 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:36.869426 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-l5lq6" podStartSLOduration=1.8232427979999999 podStartE2EDuration="5.869410684s" podCreationTimestamp="2026-04-16 18:22:31 +0000 UTC" 
firstStartedPulling="2026-04-16 18:22:31.775151967 +0000 UTC m=+285.554616276" lastFinishedPulling="2026-04-16 18:22:35.821319851 +0000 UTC m=+289.600784162" observedRunningTime="2026-04-16 18:22:36.868482212 +0000 UTC m=+290.647946557" watchObservedRunningTime="2026-04-16 18:22:36.869410684 +0000 UTC m=+290.648875049" Apr 16 18:22:46.707535 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:46.707507 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:22:46.709979 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:46.709962 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:22:46.711469 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:46.711438 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:22:51.450714 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.450669 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj"] Apr 16 18:22:51.453156 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.451029 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7163fad7-77d3-43e7-9464-6752911355f3" containerName="console" Apr 16 18:22:51.453156 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.451041 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="7163fad7-77d3-43e7-9464-6752911355f3" containerName="console" Apr 16 18:22:51.453156 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.451131 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="7163fad7-77d3-43e7-9464-6752911355f3" containerName="console" Apr 16 18:22:51.454064 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.454039 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" Apr 16 18:22:51.457158 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.457136 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:22:51.457280 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.457142 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:22:51.458163 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.458149 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-zx84v\"" Apr 16 18:22:51.464822 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.464802 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj"] Apr 16 18:22:51.583816 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.583757 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9rrv\" (UniqueName: \"kubernetes.io/projected/2b73cfe8-fd5f-4016-b0cb-32bce8603520-kube-api-access-t9rrv\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj\" (UID: \"2b73cfe8-fd5f-4016-b0cb-32bce8603520\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" Apr 16 18:22:51.584013 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.583838 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b73cfe8-fd5f-4016-b0cb-32bce8603520-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj\" (UID: \"2b73cfe8-fd5f-4016-b0cb-32bce8603520\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" Apr 16 18:22:51.584013 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.583937 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b73cfe8-fd5f-4016-b0cb-32bce8603520-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj\" (UID: \"2b73cfe8-fd5f-4016-b0cb-32bce8603520\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" Apr 16 18:22:51.684520 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.684483 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9rrv\" (UniqueName: \"kubernetes.io/projected/2b73cfe8-fd5f-4016-b0cb-32bce8603520-kube-api-access-t9rrv\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj\" (UID: \"2b73cfe8-fd5f-4016-b0cb-32bce8603520\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" Apr 16 18:22:51.684652 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.684536 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b73cfe8-fd5f-4016-b0cb-32bce8603520-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj\" (UID: \"2b73cfe8-fd5f-4016-b0cb-32bce8603520\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" Apr 16 18:22:51.684652 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.684583 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b73cfe8-fd5f-4016-b0cb-32bce8603520-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj\" (UID: \"2b73cfe8-fd5f-4016-b0cb-32bce8603520\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" Apr 16 18:22:51.684944 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.684922 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b73cfe8-fd5f-4016-b0cb-32bce8603520-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj\" (UID: \"2b73cfe8-fd5f-4016-b0cb-32bce8603520\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" Apr 16 18:22:51.685005 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.684956 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b73cfe8-fd5f-4016-b0cb-32bce8603520-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj\" (UID: \"2b73cfe8-fd5f-4016-b0cb-32bce8603520\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" Apr 16 18:22:51.694986 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.694962 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9rrv\" (UniqueName: \"kubernetes.io/projected/2b73cfe8-fd5f-4016-b0cb-32bce8603520-kube-api-access-t9rrv\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj\" (UID: \"2b73cfe8-fd5f-4016-b0cb-32bce8603520\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" Apr 16 18:22:51.763341 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.763247 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" Apr 16 18:22:51.889740 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.888644 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj"] Apr 16 18:22:51.893081 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:22:51.893026 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b73cfe8_fd5f_4016_b0cb_32bce8603520.slice/crio-576567e9d27463172ef8c25769b63886e36f8d7f5c0064bab779540381253ad9 WatchSource:0}: Error finding container 576567e9d27463172ef8c25769b63886e36f8d7f5c0064bab779540381253ad9: Status 404 returned error can't find the container with id 576567e9d27463172ef8c25769b63886e36f8d7f5c0064bab779540381253ad9 Apr 16 18:22:51.894738 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:51.894720 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:22:52.897461 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:52.897423 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" event={"ID":"2b73cfe8-fd5f-4016-b0cb-32bce8603520","Type":"ContainerStarted","Data":"576567e9d27463172ef8c25769b63886e36f8d7f5c0064bab779540381253ad9"} Apr 16 18:22:56.911469 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:56.911430 2570 generic.go:358] "Generic (PLEG): container finished" podID="2b73cfe8-fd5f-4016-b0cb-32bce8603520" containerID="93c4d05119219d5f6c284df1b2c4060f7d913bf23330c553617a2b4b9955ad02" exitCode=0 Apr 16 18:22:56.911826 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:56.911505 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" event={"ID":"2b73cfe8-fd5f-4016-b0cb-32bce8603520","Type":"ContainerDied","Data":"93c4d05119219d5f6c284df1b2c4060f7d913bf23330c553617a2b4b9955ad02"} Apr 16 18:22:58.921022 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:58.920987 2570 generic.go:358] "Generic (PLEG): container finished" podID="2b73cfe8-fd5f-4016-b0cb-32bce8603520" containerID="4ba7ff88caf15e198b557b37807001e22de482cd95c3b19db2897d6b0d9df948" exitCode=0 Apr 16 18:22:58.921404 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:22:58.921041 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" event={"ID":"2b73cfe8-fd5f-4016-b0cb-32bce8603520","Type":"ContainerDied","Data":"4ba7ff88caf15e198b557b37807001e22de482cd95c3b19db2897d6b0d9df948"} Apr 16 18:23:05.947713 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:05.947679 2570 generic.go:358] "Generic (PLEG): container finished" podID="2b73cfe8-fd5f-4016-b0cb-32bce8603520" containerID="be5521f11ab1cec30e03112cdc15daa9f5b22ad61a3263df29c5179e3dae1ceb" exitCode=0 Apr 16 18:23:05.948092 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:05.947719 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" event={"ID":"2b73cfe8-fd5f-4016-b0cb-32bce8603520","Type":"ContainerDied","Data":"be5521f11ab1cec30e03112cdc15daa9f5b22ad61a3263df29c5179e3dae1ceb"} Apr 16 18:23:07.071570 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:07.071546 2570 util.go:48] "No ready sandbox for pod can be found. 
Apr 16 18:23:07.126263 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:07.126233 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b73cfe8-fd5f-4016-b0cb-32bce8603520-bundle\") pod \"2b73cfe8-fd5f-4016-b0cb-32bce8603520\" (UID: \"2b73cfe8-fd5f-4016-b0cb-32bce8603520\") "
Apr 16 18:23:07.126412 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:07.126274 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b73cfe8-fd5f-4016-b0cb-32bce8603520-util\") pod \"2b73cfe8-fd5f-4016-b0cb-32bce8603520\" (UID: \"2b73cfe8-fd5f-4016-b0cb-32bce8603520\") "
Apr 16 18:23:07.126412 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:07.126322 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9rrv\" (UniqueName: \"kubernetes.io/projected/2b73cfe8-fd5f-4016-b0cb-32bce8603520-kube-api-access-t9rrv\") pod \"2b73cfe8-fd5f-4016-b0cb-32bce8603520\" (UID: \"2b73cfe8-fd5f-4016-b0cb-32bce8603520\") "
Apr 16 18:23:07.126831 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:07.126808 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b73cfe8-fd5f-4016-b0cb-32bce8603520-bundle" (OuterVolumeSpecName: "bundle") pod "2b73cfe8-fd5f-4016-b0cb-32bce8603520" (UID: "2b73cfe8-fd5f-4016-b0cb-32bce8603520"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:23:07.128553 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:07.128526 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b73cfe8-fd5f-4016-b0cb-32bce8603520-kube-api-access-t9rrv" (OuterVolumeSpecName: "kube-api-access-t9rrv") pod "2b73cfe8-fd5f-4016-b0cb-32bce8603520" (UID: "2b73cfe8-fd5f-4016-b0cb-32bce8603520"). InnerVolumeSpecName "kube-api-access-t9rrv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:23:07.130940 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:07.130912 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b73cfe8-fd5f-4016-b0cb-32bce8603520-util" (OuterVolumeSpecName: "util") pod "2b73cfe8-fd5f-4016-b0cb-32bce8603520" (UID: "2b73cfe8-fd5f-4016-b0cb-32bce8603520"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:23:07.227433 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:07.227333 2570 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b73cfe8-fd5f-4016-b0cb-32bce8603520-bundle\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\""
Apr 16 18:23:07.227433 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:07.227378 2570 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b73cfe8-fd5f-4016-b0cb-32bce8603520-util\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\""
Apr 16 18:23:07.227433 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:07.227388 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t9rrv\" (UniqueName: \"kubernetes.io/projected/2b73cfe8-fd5f-4016-b0cb-32bce8603520-kube-api-access-t9rrv\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\""
Apr 16 18:23:07.955043 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:07.955009 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj" event={"ID":"2b73cfe8-fd5f-4016-b0cb-32bce8603520","Type":"ContainerDied","Data":"576567e9d27463172ef8c25769b63886e36f8d7f5c0064bab779540381253ad9"}
Apr 16 18:23:07.955043 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:07.955040 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="576567e9d27463172ef8c25769b63886e36f8d7f5c0064bab779540381253ad9"
Apr 16 18:23:07.955265 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:07.955080 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29czdvdj"
Apr 16 18:23:13.440826 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.440792 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc"]
Apr 16 18:23:13.441277 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.441134 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b73cfe8-fd5f-4016-b0cb-32bce8603520" containerName="pull"
Apr 16 18:23:13.441277 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.441145 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b73cfe8-fd5f-4016-b0cb-32bce8603520" containerName="pull"
Apr 16 18:23:13.441277 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.441158 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b73cfe8-fd5f-4016-b0cb-32bce8603520" containerName="util"
Apr 16 18:23:13.441277 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.441163 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b73cfe8-fd5f-4016-b0cb-32bce8603520" containerName="util"
Apr 16 18:23:13.441277 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.441169 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b73cfe8-fd5f-4016-b0cb-32bce8603520" containerName="extract"
Apr 16 18:23:13.441277 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.441175 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b73cfe8-fd5f-4016-b0cb-32bce8603520" containerName="extract"
Apr 16 18:23:13.441277 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.441218 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b73cfe8-fd5f-4016-b0cb-32bce8603520" containerName="extract"
Apr 16 18:23:13.443659 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.443641 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc"
Apr 16 18:23:13.446617 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.446599 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-mnpp5\""
Apr 16 18:23:13.453992 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.453972 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 16 18:23:13.454083 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.454017 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 16 18:23:13.455808 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.455792 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 16 18:23:13.460836 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.460816 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc"]
Apr 16 18:23:13.479462 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.479435 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twq7l\" (UniqueName: \"kubernetes.io/projected/f26a21db-ce9a-42f9-9c77-396944c71896-kube-api-access-twq7l\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc\" (UID: \"f26a21db-ce9a-42f9-9c77-396944c71896\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc"
Apr 16 18:23:13.479593 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.479500 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/f26a21db-ce9a-42f9-9c77-396944c71896-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc\" (UID: \"f26a21db-ce9a-42f9-9c77-396944c71896\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc"
Apr 16 18:23:13.580770 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.580735 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twq7l\" (UniqueName: \"kubernetes.io/projected/f26a21db-ce9a-42f9-9c77-396944c71896-kube-api-access-twq7l\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc\" (UID: \"f26a21db-ce9a-42f9-9c77-396944c71896\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc"
Apr 16 18:23:13.580923 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.580824 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/f26a21db-ce9a-42f9-9c77-396944c71896-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc\" (UID: \"f26a21db-ce9a-42f9-9c77-396944c71896\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc"
Apr 16 18:23:13.583208 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.583176 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/f26a21db-ce9a-42f9-9c77-396944c71896-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc\" (UID: \"f26a21db-ce9a-42f9-9c77-396944c71896\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc"
Apr 16 18:23:13.591442 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.591420 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twq7l\" (UniqueName: \"kubernetes.io/projected/f26a21db-ce9a-42f9-9c77-396944c71896-kube-api-access-twq7l\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc\" (UID: \"f26a21db-ce9a-42f9-9c77-396944c71896\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc"
Apr 16 18:23:13.753764 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.753678 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc"
Apr 16 18:23:13.887010 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.885311 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc"]
Apr 16 18:23:13.889394 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:23:13.889363 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf26a21db_ce9a_42f9_9c77_396944c71896.slice/crio-4ab662a2d227fec7454a41871582be69e24e6a9c4cb316c749e290030627abdf WatchSource:0}: Error finding container 4ab662a2d227fec7454a41871582be69e24e6a9c4cb316c749e290030627abdf: Status 404 returned error can't find the container with id 4ab662a2d227fec7454a41871582be69e24e6a9c4cb316c749e290030627abdf
Apr 16 18:23:13.973080 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:13.973017 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc" event={"ID":"f26a21db-ce9a-42f9-9c77-396944c71896","Type":"ContainerStarted","Data":"4ab662a2d227fec7454a41871582be69e24e6a9c4cb316c749e290030627abdf"}
Apr 16 18:23:17.911615 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:17.911579 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-hc2gp"]
Apr 16 18:23:17.915434 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:17.915412 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:17.917962 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:17.917931 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 16 18:23:17.918114 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:17.917941 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 16 18:23:17.918114 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:17.918012 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-pffcp\""
Apr 16 18:23:17.926730 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:17.926694 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-hc2gp"]
Apr 16 18:23:17.990538 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:17.990500 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc" event={"ID":"f26a21db-ce9a-42f9-9c77-396944c71896","Type":"ContainerStarted","Data":"85a7b4f13d9d702549f59d17796068210d85537340da10dafe2828b180df7292"}
Apr 16 18:23:17.991326 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:17.991307 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc"
Apr 16 18:23:18.013716 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.013662 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc" podStartSLOduration=1.65366022 podStartE2EDuration="5.013644548s" podCreationTimestamp="2026-04-16 18:23:13 +0000 UTC" firstStartedPulling="2026-04-16 18:23:13.891372965 +0000 UTC m=+327.670837278" lastFinishedPulling="2026-04-16 18:23:17.251357282 +0000 UTC m=+331.030821606" observedRunningTime="2026-04-16 18:23:18.011272814 +0000 UTC m=+331.790737157" watchObservedRunningTime="2026-04-16 18:23:18.013644548 +0000 UTC m=+331.793108878"
Apr 16 18:23:18.022900 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.022872 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skgzw\" (UniqueName: \"kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-kube-api-access-skgzw\") pod \"keda-operator-ffbb595cb-hc2gp\" (UID: \"6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101\") " pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:18.023086 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.022912 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-cabundle0\") pod \"keda-operator-ffbb595cb-hc2gp\" (UID: \"6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101\") " pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:18.023086 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.023031 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-certificates\") pod \"keda-operator-ffbb595cb-hc2gp\" (UID: \"6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101\") " pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:18.124110 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.124068 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-certificates\") pod \"keda-operator-ffbb595cb-hc2gp\" (UID: \"6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101\") " pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:18.124311 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.124126 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skgzw\" (UniqueName: \"kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-kube-api-access-skgzw\") pod \"keda-operator-ffbb595cb-hc2gp\" (UID: \"6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101\") " pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:18.124311 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.124162 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-cabundle0\") pod \"keda-operator-ffbb595cb-hc2gp\" (UID: \"6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101\") " pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:18.124311 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.124206 2570 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 16 18:23:18.124311 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.124226 2570 secret.go:281] references non-existent secret key: ca.crt
Apr 16 18:23:18.124311 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.124235 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 18:23:18.124311 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.124251 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-hc2gp: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 18:23:18.124311 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.124311 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-certificates podName:6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101 nodeName:}" failed. No retries permitted until 2026-04-16 18:23:18.624293169 +0000 UTC m=+332.403757479 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-certificates") pod "keda-operator-ffbb595cb-hc2gp" (UID: "6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 18:23:18.124809 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.124790 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-cabundle0\") pod \"keda-operator-ffbb595cb-hc2gp\" (UID: \"6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101\") " pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:18.135539 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.135507 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skgzw\" (UniqueName: \"kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-kube-api-access-skgzw\") pod \"keda-operator-ffbb595cb-hc2gp\" (UID: \"6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101\") " pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:18.217182 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.217095 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"]
Apr 16 18:23:18.220832 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.220809 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:18.223182 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.223157 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 16 18:23:18.229274 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.229245 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"]
Apr 16 18:23:18.326714 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.326672 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d2k2c\" (UID: \"d677b5e7-afd2-4e4e-a786-13bb8bfea6a5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:18.326918 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.326763 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-d2k2c\" (UID: \"d677b5e7-afd2-4e4e-a786-13bb8bfea6a5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:18.326918 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.326789 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqlh7\" (UniqueName: \"kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-kube-api-access-dqlh7\") pod \"keda-metrics-apiserver-7c9f485588-d2k2c\" (UID: \"d677b5e7-afd2-4e4e-a786-13bb8bfea6a5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:18.428133 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.428099 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d2k2c\" (UID: \"d677b5e7-afd2-4e4e-a786-13bb8bfea6a5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:18.428280 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.428178 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-d2k2c\" (UID: \"d677b5e7-afd2-4e4e-a786-13bb8bfea6a5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:18.428280 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.428201 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqlh7\" (UniqueName: \"kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-kube-api-access-dqlh7\") pod \"keda-metrics-apiserver-7c9f485588-d2k2c\" (UID: \"d677b5e7-afd2-4e4e-a786-13bb8bfea6a5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:18.428280 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.428241 2570 secret.go:281] references non-existent secret key: tls.crt
Apr 16 18:23:18.428280 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.428268 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 18:23:18.428441 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.428291 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c: references non-existent secret key: tls.crt
Apr 16 18:23:18.428441 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.428357 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-certificates podName:d677b5e7-afd2-4e4e-a786-13bb8bfea6a5 nodeName:}" failed. No retries permitted until 2026-04-16 18:23:18.928339939 +0000 UTC m=+332.707804266 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-certificates") pod "keda-metrics-apiserver-7c9f485588-d2k2c" (UID: "d677b5e7-afd2-4e4e-a786-13bb8bfea6a5") : references non-existent secret key: tls.crt
Apr 16 18:23:18.428560 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.428542 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-d2k2c\" (UID: \"d677b5e7-afd2-4e4e-a786-13bb8bfea6a5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:18.437532 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.437502 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqlh7\" (UniqueName: \"kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-kube-api-access-dqlh7\") pod \"keda-metrics-apiserver-7c9f485588-d2k2c\" (UID: \"d677b5e7-afd2-4e4e-a786-13bb8bfea6a5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:18.629830 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.629785 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-certificates\") pod \"keda-operator-ffbb595cb-hc2gp\" (UID: \"6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101\") " pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:18.630016 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.629837 2570 secret.go:281] references non-existent secret key: ca.crt
Apr 16 18:23:18.630016 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.629858 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 18:23:18.630016 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.629867 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-hc2gp: references non-existent secret key: ca.crt
Apr 16 18:23:18.630016 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.629916 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-certificates podName:6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101 nodeName:}" failed. No retries permitted until 2026-04-16 18:23:19.629903104 +0000 UTC m=+333.409367414 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-certificates") pod "keda-operator-ffbb595cb-hc2gp" (UID: "6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101") : references non-existent secret key: ca.crt
Apr 16 18:23:18.932590 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:18.932490 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d2k2c\" (UID: \"d677b5e7-afd2-4e4e-a786-13bb8bfea6a5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:18.933014 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.932665 2570 secret.go:281] references non-existent secret key: tls.crt
Apr 16 18:23:18.933014 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.932682 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 18:23:18.933014 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.932702 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c: references non-existent secret key: tls.crt
Apr 16 18:23:18.933014 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:18.932767 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-certificates podName:d677b5e7-afd2-4e4e-a786-13bb8bfea6a5 nodeName:}" failed. No retries permitted until 2026-04-16 18:23:19.932749082 +0000 UTC m=+333.712213397 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-certificates") pod "keda-metrics-apiserver-7c9f485588-d2k2c" (UID: "d677b5e7-afd2-4e4e-a786-13bb8bfea6a5") : references non-existent secret key: tls.crt
Apr 16 18:23:19.639376 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:19.639344 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-certificates\") pod \"keda-operator-ffbb595cb-hc2gp\" (UID: \"6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101\") " pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:19.639566 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:19.639454 2570 secret.go:281] references non-existent secret key: ca.crt
Apr 16 18:23:19.639566 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:19.639467 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 18:23:19.639566 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:19.639475 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-hc2gp: references non-existent secret key: ca.crt
Apr 16 18:23:19.639566 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:19.639521 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-certificates podName:6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101 nodeName:}" failed. No retries permitted until 2026-04-16 18:23:21.639509532 +0000 UTC m=+335.418973840 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-certificates") pod "keda-operator-ffbb595cb-hc2gp" (UID: "6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101") : references non-existent secret key: ca.crt
Apr 16 18:23:19.941880 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:19.941788 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d2k2c\" (UID: \"d677b5e7-afd2-4e4e-a786-13bb8bfea6a5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:19.942357 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:19.941939 2570 secret.go:281] references non-existent secret key: tls.crt
Apr 16 18:23:19.942357 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:19.941962 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 18:23:19.942357 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:19.941981 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c: references non-existent secret key: tls.crt
Apr 16 18:23:19.942357 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:19.942038 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-certificates podName:d677b5e7-afd2-4e4e-a786-13bb8bfea6a5 nodeName:}" failed. No retries permitted until 2026-04-16 18:23:21.942019651 +0000 UTC m=+335.721483961 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-certificates") pod "keda-metrics-apiserver-7c9f485588-d2k2c" (UID: "d677b5e7-afd2-4e4e-a786-13bb8bfea6a5") : references non-existent secret key: tls.crt
Apr 16 18:23:21.658100 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:21.658045 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-certificates\") pod \"keda-operator-ffbb595cb-hc2gp\" (UID: \"6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101\") " pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:21.658502 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:21.658202 2570 secret.go:281] references non-existent secret key: ca.crt
Apr 16 18:23:21.658502 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:21.658222 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 18:23:21.658502 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:21.658231 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-hc2gp: references non-existent secret key: ca.crt
Apr 16 18:23:21.658502 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:21.658284 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-certificates podName:6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101 nodeName:}" failed. No retries permitted until 2026-04-16 18:23:25.658269416 +0000 UTC m=+339.437733725 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-certificates") pod "keda-operator-ffbb595cb-hc2gp" (UID: "6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101") : references non-existent secret key: ca.crt
Apr 16 18:23:21.960330 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:21.960241 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d2k2c\" (UID: \"d677b5e7-afd2-4e4e-a786-13bb8bfea6a5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:21.960478 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:21.960385 2570 secret.go:281] references non-existent secret key: tls.crt
Apr 16 18:23:21.960478 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:21.960402 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 18:23:21.960478 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:21.960419 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c: references non-existent secret key: tls.crt
Apr 16 18:23:21.960478 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:23:21.960469 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-certificates podName:d677b5e7-afd2-4e4e-a786-13bb8bfea6a5 nodeName:}" failed. No retries permitted until 2026-04-16 18:23:25.96045589 +0000 UTC m=+339.739920198 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-certificates") pod "keda-metrics-apiserver-7c9f485588-d2k2c" (UID: "d677b5e7-afd2-4e4e-a786-13bb8bfea6a5") : references non-existent secret key: tls.crt
Apr 16 18:23:25.693120 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:25.693080 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-certificates\") pod \"keda-operator-ffbb595cb-hc2gp\" (UID: \"6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101\") " pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:25.695397 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:25.695379 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101-certificates\") pod \"keda-operator-ffbb595cb-hc2gp\" (UID: \"6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101\") " pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:25.726274 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:25.726244 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:25.850569 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:25.850529 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-hc2gp"]
Apr 16 18:23:25.852779 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:23:25.852744 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dfb8d1b_aa2c_4cce_a216_9ef4aa98c101.slice/crio-030754b5b0fc6f2f6a2cf41c55e3bf52b2516ac74e9c8931dd1270c34362bd17 WatchSource:0}: Error finding container 030754b5b0fc6f2f6a2cf41c55e3bf52b2516ac74e9c8931dd1270c34362bd17: Status 404 returned error can't find the container with id 030754b5b0fc6f2f6a2cf41c55e3bf52b2516ac74e9c8931dd1270c34362bd17
Apr 16 18:23:25.996455 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:25.996371 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d2k2c\" (UID: \"d677b5e7-afd2-4e4e-a786-13bb8bfea6a5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:25.999022 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:25.998996 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d677b5e7-afd2-4e4e-a786-13bb8bfea6a5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-d2k2c\" (UID: \"d677b5e7-afd2-4e4e-a786-13bb8bfea6a5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:26.016286 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:26.016256 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-hc2gp" event={"ID":"6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101","Type":"ContainerStarted","Data":"030754b5b0fc6f2f6a2cf41c55e3bf52b2516ac74e9c8931dd1270c34362bd17"}
Apr 16 18:23:26.032992 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:26.032964 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:26.155081 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:26.155037 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"]
Apr 16 18:23:26.157208 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:23:26.157184 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd677b5e7_afd2_4e4e_a786_13bb8bfea6a5.slice/crio-2c6e7c46fc84db0af5385b2df0c165b056eda926cb44c316d6ac7cbb5cab8c9b WatchSource:0}: Error finding container 2c6e7c46fc84db0af5385b2df0c165b056eda926cb44c316d6ac7cbb5cab8c9b: Status 404 returned error can't find the container with id 2c6e7c46fc84db0af5385b2df0c165b056eda926cb44c316d6ac7cbb5cab8c9b
Apr 16 18:23:27.020825 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:27.020785 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c" event={"ID":"d677b5e7-afd2-4e4e-a786-13bb8bfea6a5","Type":"ContainerStarted","Data":"2c6e7c46fc84db0af5385b2df0c165b056eda926cb44c316d6ac7cbb5cab8c9b"}
Apr 16 18:23:30.033426 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:30.033381 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-hc2gp" event={"ID":"6dfb8d1b-aa2c-4cce-a216-9ef4aa98c101","Type":"ContainerStarted","Data":"1fac8a7df39e35bd06b992b627f64f1849733c670e450733308a352c0e6fa0bd"}
Apr 16 18:23:30.033886 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:30.033500 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:23:30.055493 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:30.055437 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-hc2gp" podStartSLOduration=9.367486551 podStartE2EDuration="13.055416796s" podCreationTimestamp="2026-04-16 18:23:17 +0000 UTC" firstStartedPulling="2026-04-16 18:23:25.853992152 +0000 UTC m=+339.633456462" lastFinishedPulling="2026-04-16 18:23:29.541922384 +0000 UTC m=+343.321386707" observedRunningTime="2026-04-16 18:23:30.053592799 +0000 UTC m=+343.833057131" watchObservedRunningTime="2026-04-16 18:23:30.055416796 +0000 UTC m=+343.834881128"
Apr 16 18:23:32.042275 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:32.042233 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c" event={"ID":"d677b5e7-afd2-4e4e-a786-13bb8bfea6a5","Type":"ContainerStarted","Data":"dbc2f742aed9230983b5145f5a836c25117ed1f9184f897b5661825542fc18ca"}
Apr 16 18:23:32.042672 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:32.042371 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:32.060515 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:32.060459 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c" podStartSLOduration=8.865444378 podStartE2EDuration="14.060442823s" podCreationTimestamp="2026-04-16 18:23:18 +0000 UTC" firstStartedPulling="2026-04-16 18:23:26.158464719 +0000 UTC m=+339.937929028" lastFinishedPulling="2026-04-16 18:23:31.353463163 +0000 UTC m=+345.132927473" observedRunningTime="2026-04-16 18:23:32.059663578 +0000 UTC m=+345.839127908" watchObservedRunningTime="2026-04-16 18:23:32.060442823 +0000 UTC m=+345.839907154"
Apr 16 18:23:39.999197 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:39.999162 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-nwcfc"
Apr 16 18:23:43.049920 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:43.049892 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-d2k2c"
Apr 16 18:23:51.039194 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:23:51.039161 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-hc2gp"
Apr 16 18:24:26.964858 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:26.964819 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-vdng7"]
Apr 16 18:24:26.973590 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:26.973569 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vdng7"
Apr 16 18:24:26.976472 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:26.976447 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 18:24:26.976600 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:26.976452 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 18:24:26.976600 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:26.976519 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-6qnhw\""
Apr 16 18:24:26.977399 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:26.977379 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 16 18:24:26.985559 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:26.985529 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-vdng7"]
Apr 16 18:24:27.000822 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.000793 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-dhh97"]
Apr 16 18:24:27.004362 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.004342 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-dhh97"
Apr 16 18:24:27.006990 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.006971 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-wg9bc\""
Apr 16 18:24:27.007270 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.007252 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 18:24:27.013739 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.013719 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-dhh97"]
Apr 16 18:24:27.111600 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.111566 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8d4w\" (UniqueName: \"kubernetes.io/projected/173854b1-651f-42a8-9473-ea555cff0ced-kube-api-access-f8d4w\") pod \"seaweedfs-86cc847c5c-dhh97\" (UID: \"173854b1-651f-42a8-9473-ea555cff0ced\") " pod="kserve/seaweedfs-86cc847c5c-dhh97"
Apr 16 18:24:27.111768 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.111634 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/119dc07f-5f1f-4102-87c6-4a6a342dea03-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vdng7\" (UID: \"119dc07f-5f1f-4102-87c6-4a6a342dea03\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vdng7"
Apr 16 18:24:27.111768 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.111671 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8dh2\" (UniqueName: \"kubernetes.io/projected/119dc07f-5f1f-4102-87c6-4a6a342dea03-kube-api-access-p8dh2\") pod \"llmisvc-controller-manager-68cc5db7c4-vdng7\" (UID: \"119dc07f-5f1f-4102-87c6-4a6a342dea03\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vdng7"
Apr 16 18:24:27.111768 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.111695 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/173854b1-651f-42a8-9473-ea555cff0ced-data\") pod \"seaweedfs-86cc847c5c-dhh97\" (UID: \"173854b1-651f-42a8-9473-ea555cff0ced\") " pod="kserve/seaweedfs-86cc847c5c-dhh97"
Apr 16 18:24:27.212907 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.212872 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8dh2\" (UniqueName: \"kubernetes.io/projected/119dc07f-5f1f-4102-87c6-4a6a342dea03-kube-api-access-p8dh2\") pod \"llmisvc-controller-manager-68cc5db7c4-vdng7\" (UID: \"119dc07f-5f1f-4102-87c6-4a6a342dea03\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vdng7"
Apr 16 18:24:27.212907 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.212908 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/173854b1-651f-42a8-9473-ea555cff0ced-data\") pod \"seaweedfs-86cc847c5c-dhh97\" (UID: \"173854b1-651f-42a8-9473-ea555cff0ced\") " pod="kserve/seaweedfs-86cc847c5c-dhh97"
Apr 16 18:24:27.213127 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.212977 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8d4w\" (UniqueName: \"kubernetes.io/projected/173854b1-651f-42a8-9473-ea555cff0ced-kube-api-access-f8d4w\") pod \"seaweedfs-86cc847c5c-dhh97\" (UID: \"173854b1-651f-42a8-9473-ea555cff0ced\") " pod="kserve/seaweedfs-86cc847c5c-dhh97"
Apr 16 18:24:27.213127 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.212995 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/119dc07f-5f1f-4102-87c6-4a6a342dea03-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vdng7\" (UID: \"119dc07f-5f1f-4102-87c6-4a6a342dea03\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vdng7"
Apr 16 18:24:27.213384 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.213365 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/173854b1-651f-42a8-9473-ea555cff0ced-data\") pod \"seaweedfs-86cc847c5c-dhh97\" (UID: \"173854b1-651f-42a8-9473-ea555cff0ced\") " pod="kserve/seaweedfs-86cc847c5c-dhh97"
Apr 16 18:24:27.215482 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.215439 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/119dc07f-5f1f-4102-87c6-4a6a342dea03-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vdng7\" (UID: \"119dc07f-5f1f-4102-87c6-4a6a342dea03\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vdng7"
Apr 16 18:24:27.227171 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.226708 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8d4w\" (UniqueName: \"kubernetes.io/projected/173854b1-651f-42a8-9473-ea555cff0ced-kube-api-access-f8d4w\") pod \"seaweedfs-86cc847c5c-dhh97\" (UID: \"173854b1-651f-42a8-9473-ea555cff0ced\") " pod="kserve/seaweedfs-86cc847c5c-dhh97"
Apr 16 18:24:27.227171 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.227305 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8dh2\" (UniqueName: \"kubernetes.io/projected/119dc07f-5f1f-4102-87c6-4a6a342dea03-kube-api-access-p8dh2\") pod \"llmisvc-controller-manager-68cc5db7c4-vdng7\" (UID: \"119dc07f-5f1f-4102-87c6-4a6a342dea03\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vdng7"
Apr 16 18:24:27.285857 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.285819 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vdng7"
Apr 16 18:24:27.313666 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.313638 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-dhh97"
Apr 16 18:24:27.418820 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.417915 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-vdng7"]
Apr 16 18:24:27.422070 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:24:27.422029 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod119dc07f_5f1f_4102_87c6_4a6a342dea03.slice/crio-f3a035563376337f6e7b772297ef2e1ff9f0dea5fb95522b23898f455d8445d6 WatchSource:0}: Error finding container f3a035563376337f6e7b772297ef2e1ff9f0dea5fb95522b23898f455d8445d6: Status 404 returned error can't find the container with id f3a035563376337f6e7b772297ef2e1ff9f0dea5fb95522b23898f455d8445d6
Apr 16 18:24:27.451825 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:27.451800 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-dhh97"]
Apr 16 18:24:27.454017 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:24:27.453992 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod173854b1_651f_42a8_9473_ea555cff0ced.slice/crio-1a8830beece63da924c1281c86bd1ca7c034d41b496122d368237d494b4a0db0 WatchSource:0}: Error finding container 1a8830beece63da924c1281c86bd1ca7c034d41b496122d368237d494b4a0db0: Status 404 returned error can't find the container with id 1a8830beece63da924c1281c86bd1ca7c034d41b496122d368237d494b4a0db0
Apr 16 18:24:28.230538 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:28.230488 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vdng7" event={"ID":"119dc07f-5f1f-4102-87c6-4a6a342dea03","Type":"ContainerStarted","Data":"f3a035563376337f6e7b772297ef2e1ff9f0dea5fb95522b23898f455d8445d6"}
Apr 16 18:24:28.232301 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:28.232257 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-dhh97" event={"ID":"173854b1-651f-42a8-9473-ea555cff0ced","Type":"ContainerStarted","Data":"1a8830beece63da924c1281c86bd1ca7c034d41b496122d368237d494b4a0db0"}
Apr 16 18:24:31.243971 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:31.243935 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vdng7" event={"ID":"119dc07f-5f1f-4102-87c6-4a6a342dea03","Type":"ContainerStarted","Data":"852d33d6b8d364eea3522d4e71ff8f73e4679d0e98b17b4b52b9221790707acb"}
Apr 16 18:24:31.244427 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:31.244012 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vdng7"
Apr 16 18:24:31.245288 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:31.245268 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-dhh97" event={"ID":"173854b1-651f-42a8-9473-ea555cff0ced","Type":"ContainerStarted","Data":"9752e6aa0c793ae74c8d6b09d2cc71e5174708d94c298828657c1ad0e0aa5f3d"}
Apr 16 18:24:31.245427 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:31.245379 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-dhh97"
Apr 16 18:24:31.267163 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:31.267123 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vdng7" podStartSLOduration=2.101363433 podStartE2EDuration="5.267111713s" podCreationTimestamp="2026-04-16 18:24:26 +0000 UTC" firstStartedPulling="2026-04-16 18:24:27.423434023 +0000 UTC m=+401.202898331" lastFinishedPulling="2026-04-16 18:24:30.589182299 +0000 UTC m=+404.368646611" observedRunningTime="2026-04-16 18:24:31.265238232 +0000 UTC m=+405.044702563" watchObservedRunningTime="2026-04-16 18:24:31.267111713 +0000 UTC m=+405.046576043"
Apr 16 18:24:31.306861 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:31.306808 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-dhh97" podStartSLOduration=2.118549159 podStartE2EDuration="5.306794907s" podCreationTimestamp="2026-04-16 18:24:26 +0000 UTC" firstStartedPulling="2026-04-16 18:24:27.455404511 +0000 UTC m=+401.234868823" lastFinishedPulling="2026-04-16 18:24:30.643650258 +0000 UTC m=+404.423114571" observedRunningTime="2026-04-16 18:24:31.305765804 +0000 UTC m=+405.085230135" watchObservedRunningTime="2026-04-16 18:24:31.306794907 +0000 UTC m=+405.086259238"
Apr 16 18:24:37.250637 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:24:37.250605 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-dhh97"
Apr 16 18:25:02.251347 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:02.251267 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vdng7"
Apr 16 18:25:37.017344 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:37.017307 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-4vnfs"]
Apr 16 18:25:37.020809 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:37.020790 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-4vnfs"
Apr 16 18:25:37.023445 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:37.023425 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 16 18:25:37.023542 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:37.023471 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-ml6gm\""
Apr 16 18:25:37.033105 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:37.033081 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-4vnfs"]
Apr 16 18:25:37.197408 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:37.197374 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6rcp\" (UniqueName: \"kubernetes.io/projected/2d0e858e-6c95-497e-b9ec-1101bf4152d5-kube-api-access-r6rcp\") pod \"model-serving-api-86f7b4b499-4vnfs\" (UID: \"2d0e858e-6c95-497e-b9ec-1101bf4152d5\") " pod="kserve/model-serving-api-86f7b4b499-4vnfs"
Apr 16 18:25:37.197575 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:37.197427 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2d0e858e-6c95-497e-b9ec-1101bf4152d5-tls-certs\") pod \"model-serving-api-86f7b4b499-4vnfs\" (UID: \"2d0e858e-6c95-497e-b9ec-1101bf4152d5\") " pod="kserve/model-serving-api-86f7b4b499-4vnfs"
Apr 16 18:25:37.298324 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:37.298233 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6rcp\" (UniqueName: \"kubernetes.io/projected/2d0e858e-6c95-497e-b9ec-1101bf4152d5-kube-api-access-r6rcp\")
pod \"model-serving-api-86f7b4b499-4vnfs\" (UID: \"2d0e858e-6c95-497e-b9ec-1101bf4152d5\") " pod="kserve/model-serving-api-86f7b4b499-4vnfs" Apr 16 18:25:37.298324 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:37.298297 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2d0e858e-6c95-497e-b9ec-1101bf4152d5-tls-certs\") pod \"model-serving-api-86f7b4b499-4vnfs\" (UID: \"2d0e858e-6c95-497e-b9ec-1101bf4152d5\") " pod="kserve/model-serving-api-86f7b4b499-4vnfs" Apr 16 18:25:37.300590 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:37.300563 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2d0e858e-6c95-497e-b9ec-1101bf4152d5-tls-certs\") pod \"model-serving-api-86f7b4b499-4vnfs\" (UID: \"2d0e858e-6c95-497e-b9ec-1101bf4152d5\") " pod="kserve/model-serving-api-86f7b4b499-4vnfs" Apr 16 18:25:37.307539 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:37.307518 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6rcp\" (UniqueName: \"kubernetes.io/projected/2d0e858e-6c95-497e-b9ec-1101bf4152d5-kube-api-access-r6rcp\") pod \"model-serving-api-86f7b4b499-4vnfs\" (UID: \"2d0e858e-6c95-497e-b9ec-1101bf4152d5\") " pod="kserve/model-serving-api-86f7b4b499-4vnfs" Apr 16 18:25:37.332513 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:37.332490 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-4vnfs" Apr 16 18:25:37.455663 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:37.455585 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-4vnfs"] Apr 16 18:25:37.457861 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:25:37.457836 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d0e858e_6c95_497e_b9ec_1101bf4152d5.slice/crio-d87aa77022f62d2dc11a6f5b55a2dc366b88fc9e986c7bb2fda842770a0721a8 WatchSource:0}: Error finding container d87aa77022f62d2dc11a6f5b55a2dc366b88fc9e986c7bb2fda842770a0721a8: Status 404 returned error can't find the container with id d87aa77022f62d2dc11a6f5b55a2dc366b88fc9e986c7bb2fda842770a0721a8 Apr 16 18:25:37.477304 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:37.477276 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-4vnfs" event={"ID":"2d0e858e-6c95-497e-b9ec-1101bf4152d5","Type":"ContainerStarted","Data":"d87aa77022f62d2dc11a6f5b55a2dc366b88fc9e986c7bb2fda842770a0721a8"} Apr 16 18:25:40.490850 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:40.490809 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-4vnfs" event={"ID":"2d0e858e-6c95-497e-b9ec-1101bf4152d5","Type":"ContainerStarted","Data":"8ed1ce7edffc56bcc8e81604ef5c50f212177754810116e803de2feffc350df9"} Apr 16 18:25:40.491283 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:40.490955 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-4vnfs" Apr 16 18:25:40.514146 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:40.514099 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-4vnfs" podStartSLOduration=1.185779974 podStartE2EDuration="3.514085118s" podCreationTimestamp="2026-04-16 18:25:37 +0000 UTC" firstStartedPulling="2026-04-16 
18:25:37.459612029 +0000 UTC m=+471.239076338" lastFinishedPulling="2026-04-16 18:25:39.787917168 +0000 UTC m=+473.567381482" observedRunningTime="2026-04-16 18:25:40.513127893 +0000 UTC m=+474.292592247" watchObservedRunningTime="2026-04-16 18:25:40.514085118 +0000 UTC m=+474.293549446" Apr 16 18:25:51.499411 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:25:51.499379 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-4vnfs" Apr 16 18:26:06.673625 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.673595 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-654d547795-qzsm4"] Apr 16 18:26:06.677304 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.677287 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.686638 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.686610 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-654d547795-qzsm4"] Apr 16 18:26:06.747232 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.747188 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl5bz\" (UniqueName: \"kubernetes.io/projected/00fb3776-7d26-471c-aeee-7e3b6923b9a9-kube-api-access-xl5bz\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.747232 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.747235 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00fb3776-7d26-471c-aeee-7e3b6923b9a9-console-oauth-config\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.747481 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.747264 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00fb3776-7d26-471c-aeee-7e3b6923b9a9-console-config\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.747481 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.747321 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00fb3776-7d26-471c-aeee-7e3b6923b9a9-oauth-serving-cert\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.747481 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.747359 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00fb3776-7d26-471c-aeee-7e3b6923b9a9-service-ca\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.747481 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.747387 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00fb3776-7d26-471c-aeee-7e3b6923b9a9-trusted-ca-bundle\") pod \"console-654d547795-qzsm4\" 
(UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.747481 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.747430 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00fb3776-7d26-471c-aeee-7e3b6923b9a9-console-serving-cert\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.847854 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.847821 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00fb3776-7d26-471c-aeee-7e3b6923b9a9-console-serving-cert\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.848025 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.847886 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl5bz\" (UniqueName: \"kubernetes.io/projected/00fb3776-7d26-471c-aeee-7e3b6923b9a9-kube-api-access-xl5bz\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.848025 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.847913 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00fb3776-7d26-471c-aeee-7e3b6923b9a9-console-oauth-config\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.848025 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.847934 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00fb3776-7d26-471c-aeee-7e3b6923b9a9-console-config\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.848025 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.847958 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00fb3776-7d26-471c-aeee-7e3b6923b9a9-oauth-serving-cert\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.848025 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.847993 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00fb3776-7d26-471c-aeee-7e3b6923b9a9-service-ca\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.848298 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.848042 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00fb3776-7d26-471c-aeee-7e3b6923b9a9-trusted-ca-bundle\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.848759 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.848731 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00fb3776-7d26-471c-aeee-7e3b6923b9a9-oauth-serving-cert\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.848880 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.848858 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00fb3776-7d26-471c-aeee-7e3b6923b9a9-console-config\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.848950 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.848860 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00fb3776-7d26-471c-aeee-7e3b6923b9a9-service-ca\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.849022 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.849003 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00fb3776-7d26-471c-aeee-7e3b6923b9a9-trusted-ca-bundle\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.850949 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.850929 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00fb3776-7d26-471c-aeee-7e3b6923b9a9-console-oauth-config\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.850949 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.850942 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00fb3776-7d26-471c-aeee-7e3b6923b9a9-console-serving-cert\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.857332 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.857305 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl5bz\" (UniqueName: \"kubernetes.io/projected/00fb3776-7d26-471c-aeee-7e3b6923b9a9-kube-api-access-xl5bz\") pod \"console-654d547795-qzsm4\" (UID: \"00fb3776-7d26-471c-aeee-7e3b6923b9a9\") " pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:06.987396 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:06.987305 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:07.121997 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:07.121863 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-654d547795-qzsm4"] Apr 16 18:26:07.124817 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:26:07.124786 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00fb3776_7d26_471c_aeee_7e3b6923b9a9.slice/crio-12450667fceffe575a8b4b3f324c8845717a5cad52f5987a617bc9271a7cf086 WatchSource:0}: Error finding container 12450667fceffe575a8b4b3f324c8845717a5cad52f5987a617bc9271a7cf086: Status 404 returned error can't find the container with id 12450667fceffe575a8b4b3f324c8845717a5cad52f5987a617bc9271a7cf086 Apr 16 18:26:07.592348 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:07.592313 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-654d547795-qzsm4" event={"ID":"00fb3776-7d26-471c-aeee-7e3b6923b9a9","Type":"ContainerStarted","Data":"dae6c7c0d8ce78ece33b2f6167ef726d88c96b224e4be5a720919f6bc7b0efb8"} Apr 16 18:26:07.592348 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:07.592351 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-654d547795-qzsm4" event={"ID":"00fb3776-7d26-471c-aeee-7e3b6923b9a9","Type":"ContainerStarted","Data":"12450667fceffe575a8b4b3f324c8845717a5cad52f5987a617bc9271a7cf086"} Apr 16 18:26:07.614086 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:07.614020 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-654d547795-qzsm4" podStartSLOduration=1.614005874 podStartE2EDuration="1.614005874s" podCreationTimestamp="2026-04-16 18:26:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:26:07.612039013 +0000 UTC m=+501.391503355" watchObservedRunningTime="2026-04-16 18:26:07.614005874 +0000 UTC m=+501.393470204" Apr 16 18:26:09.955999 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:09.955962 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk"] Apr 16 18:26:09.959594 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:09.959579 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" Apr 16 18:26:09.962102 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:09.962079 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-kt2gs\"" Apr 16 18:26:09.967263 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:09.967242 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk"] Apr 16 18:26:09.972403 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:09.972380 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" Apr 16 18:26:10.129868 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:10.129825 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk"] Apr 16 18:26:10.133197 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:26:10.133129 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd291430_1c3f_47fa_bd8c_99941ebf9d84.slice/crio-781a06b869f2c78f00ba7d043bcbce10f84e493897b7ecae84aa2fb318fb02e4 WatchSource:0}: Error finding container 781a06b869f2c78f00ba7d043bcbce10f84e493897b7ecae84aa2fb318fb02e4: Status 404 returned error can't find the container with id 781a06b869f2c78f00ba7d043bcbce10f84e493897b7ecae84aa2fb318fb02e4 Apr 16 18:26:10.606509 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:10.606459 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" event={"ID":"dd291430-1c3f-47fa-bd8c-99941ebf9d84","Type":"ContainerStarted","Data":"781a06b869f2c78f00ba7d043bcbce10f84e493897b7ecae84aa2fb318fb02e4"} Apr 16 18:26:16.987543 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:16.987480 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:16.987543 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:16.987528 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:16.993153 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:16.993125 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:17.641517 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:17.641486 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-654d547795-qzsm4" Apr 16 18:26:17.727086 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:17.725220 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59bdd647b4-kpczf"] Apr 16 18:26:22.656827 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:22.656788 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" event={"ID":"dd291430-1c3f-47fa-bd8c-99941ebf9d84","Type":"ContainerStarted","Data":"a6927c9c6182234d41597336774fd131d8b88641ea0da68f0d6f8da1a6368d40"} Apr 16 18:26:22.657269 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:22.656997 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" Apr 16 18:26:22.658285 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:22.658248 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" podUID="dd291430-1c3f-47fa-bd8c-99941ebf9d84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:26:22.675757 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:22.675700 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" podStartSLOduration=1.710607298 podStartE2EDuration="13.675687104s" podCreationTimestamp="2026-04-16 18:26:09 +0000 UTC" 
firstStartedPulling="2026-04-16 18:26:10.135266856 +0000 UTC m=+503.914731172" lastFinishedPulling="2026-04-16 18:26:22.100346658 +0000 UTC m=+515.879810978" observedRunningTime="2026-04-16 18:26:22.674231981 +0000 UTC m=+516.453696312" watchObservedRunningTime="2026-04-16 18:26:22.675687104 +0000 UTC m=+516.455151435" Apr 16 18:26:23.660956 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:23.660921 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" podUID="dd291430-1c3f-47fa-bd8c-99941ebf9d84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:26:33.661812 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:33.661720 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" podUID="dd291430-1c3f-47fa-bd8c-99941ebf9d84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:26:42.753276 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:42.753238 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-59bdd647b4-kpczf" podUID="bddcb801-74e3-4f5a-b095-39274ec92ac0" containerName="console" containerID="cri-o://38d866ac2bd69f23bfafb6f6ee4ac358ca414f89d791fbe6409e99b62eb6c066" gracePeriod=15 Apr 16 18:26:42.998999 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:42.998977 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59bdd647b4-kpczf_bddcb801-74e3-4f5a-b095-39274ec92ac0/console/0.log" Apr 16 18:26:42.999138 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:42.999037 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:26:43.084173 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.084141 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-oauth-config\") pod \"bddcb801-74e3-4f5a-b095-39274ec92ac0\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " Apr 16 18:26:43.084358 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.084183 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8kqc\" (UniqueName: \"kubernetes.io/projected/bddcb801-74e3-4f5a-b095-39274ec92ac0-kube-api-access-b8kqc\") pod \"bddcb801-74e3-4f5a-b095-39274ec92ac0\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " Apr 16 18:26:43.084358 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.084201 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-oauth-serving-cert\") pod \"bddcb801-74e3-4f5a-b095-39274ec92ac0\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " Apr 16 18:26:43.084358 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.084233 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-service-ca\") pod \"bddcb801-74e3-4f5a-b095-39274ec92ac0\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " Apr 16 18:26:43.084358 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.084267 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-trusted-ca-bundle\") pod \"bddcb801-74e3-4f5a-b095-39274ec92ac0\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " Apr 16 18:26:43.084358 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.084298 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-serving-cert\") pod \"bddcb801-74e3-4f5a-b095-39274ec92ac0\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " Apr 16 18:26:43.084358 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.084314 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-config\") pod \"bddcb801-74e3-4f5a-b095-39274ec92ac0\" (UID: \"bddcb801-74e3-4f5a-b095-39274ec92ac0\") " Apr 16 18:26:43.084726 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.084614 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bddcb801-74e3-4f5a-b095-39274ec92ac0" (UID: "bddcb801-74e3-4f5a-b095-39274ec92ac0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:26:43.084791 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.084748 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-service-ca" (OuterVolumeSpecName: "service-ca") pod "bddcb801-74e3-4f5a-b095-39274ec92ac0" (UID: "bddcb801-74e3-4f5a-b095-39274ec92ac0"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:26:43.084870 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.084848 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bddcb801-74e3-4f5a-b095-39274ec92ac0" (UID: "bddcb801-74e3-4f5a-b095-39274ec92ac0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:26:43.084924 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.084842 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-config" (OuterVolumeSpecName: "console-config") pod "bddcb801-74e3-4f5a-b095-39274ec92ac0" (UID: "bddcb801-74e3-4f5a-b095-39274ec92ac0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:26:43.086595 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.086563 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bddcb801-74e3-4f5a-b095-39274ec92ac0-kube-api-access-b8kqc" (OuterVolumeSpecName: "kube-api-access-b8kqc") pod "bddcb801-74e3-4f5a-b095-39274ec92ac0" (UID: "bddcb801-74e3-4f5a-b095-39274ec92ac0"). InnerVolumeSpecName "kube-api-access-b8kqc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:26:43.086595 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.086567 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bddcb801-74e3-4f5a-b095-39274ec92ac0" (UID: "bddcb801-74e3-4f5a-b095-39274ec92ac0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:26:43.086739 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.086623 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bddcb801-74e3-4f5a-b095-39274ec92ac0" (UID: "bddcb801-74e3-4f5a-b095-39274ec92ac0"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:26:43.185552 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.185514 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-trusted-ca-bundle\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:26:43.185552 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.185548 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-serving-cert\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:26:43.185759 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.185589 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-config\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:26:43.185759 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.185602 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bddcb801-74e3-4f5a-b095-39274ec92ac0-console-oauth-config\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:26:43.185759 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.185617 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b8kqc\" (UniqueName: \"kubernetes.io/projected/bddcb801-74e3-4f5a-b095-39274ec92ac0-kube-api-access-b8kqc\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:26:43.185759 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.185630 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-oauth-serving-cert\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:26:43.185759 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.185642 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bddcb801-74e3-4f5a-b095-39274ec92ac0-service-ca\") on node \"ip-10-0-128-74.ec2.internal\" DevicePath \"\"" Apr 16 18:26:43.661501 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.661459 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" podUID="dd291430-1c3f-47fa-bd8c-99941ebf9d84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:26:43.736120 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.736093 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59bdd647b4-kpczf_bddcb801-74e3-4f5a-b095-39274ec92ac0/console/0.log" Apr 16 18:26:43.736276 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.736137 2570 generic.go:358] "Generic (PLEG): container finished" podID="bddcb801-74e3-4f5a-b095-39274ec92ac0" containerID="38d866ac2bd69f23bfafb6f6ee4ac358ca414f89d791fbe6409e99b62eb6c066" exitCode=2 Apr 16 18:26:43.736276 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.736208 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59bdd647b4-kpczf" event={"ID":"bddcb801-74e3-4f5a-b095-39274ec92ac0","Type":"ContainerDied","Data":"38d866ac2bd69f23bfafb6f6ee4ac358ca414f89d791fbe6409e99b62eb6c066"} Apr 16 18:26:43.736276 ip-10-0-128-74 kubenswrapper[2570]: I0416 
18:26:43.736236 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59bdd647b4-kpczf" event={"ID":"bddcb801-74e3-4f5a-b095-39274ec92ac0","Type":"ContainerDied","Data":"9ea6db0e913fdb16bfc72461bb6cc74a6b0a00960c2475acf2866c81d399d921"} Apr 16 18:26:43.736276 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.736250 2570 scope.go:117] "RemoveContainer" containerID="38d866ac2bd69f23bfafb6f6ee4ac358ca414f89d791fbe6409e99b62eb6c066" Apr 16 18:26:43.736422 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.736216 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59bdd647b4-kpczf" Apr 16 18:26:43.745689 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.745671 2570 scope.go:117] "RemoveContainer" containerID="38d866ac2bd69f23bfafb6f6ee4ac358ca414f89d791fbe6409e99b62eb6c066" Apr 16 18:26:43.745956 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:26:43.745938 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38d866ac2bd69f23bfafb6f6ee4ac358ca414f89d791fbe6409e99b62eb6c066\": container with ID starting with 38d866ac2bd69f23bfafb6f6ee4ac358ca414f89d791fbe6409e99b62eb6c066 not found: ID does not exist" containerID="38d866ac2bd69f23bfafb6f6ee4ac358ca414f89d791fbe6409e99b62eb6c066" Apr 16 18:26:43.746002 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.745966 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38d866ac2bd69f23bfafb6f6ee4ac358ca414f89d791fbe6409e99b62eb6c066"} err="failed to get container status \"38d866ac2bd69f23bfafb6f6ee4ac358ca414f89d791fbe6409e99b62eb6c066\": rpc error: code = NotFound desc = could not find container \"38d866ac2bd69f23bfafb6f6ee4ac358ca414f89d791fbe6409e99b62eb6c066\": container with ID starting with 38d866ac2bd69f23bfafb6f6ee4ac358ca414f89d791fbe6409e99b62eb6c066 not found: ID does not exist" Apr 16 18:26:43.761009 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.760984 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59bdd647b4-kpczf"] Apr 16 18:26:43.765231 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:43.765208 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59bdd647b4-kpczf"] Apr 16 18:26:44.838424 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:44.838394 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bddcb801-74e3-4f5a-b095-39274ec92ac0" path="/var/lib/kubelet/pods/bddcb801-74e3-4f5a-b095-39274ec92ac0/volumes" Apr 16 18:26:53.661337 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:26:53.661290 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" podUID="dd291430-1c3f-47fa-bd8c-99941ebf9d84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:27:03.662003 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:03.661960 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" podUID="dd291430-1c3f-47fa-bd8c-99941ebf9d84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:27:13.663194 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:13.663156 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" Apr 16 18:27:40.115481 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:40.115437 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk"] Apr 16 18:27:40.116047 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:40.115691 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" podUID="dd291430-1c3f-47fa-bd8c-99941ebf9d84" containerName="kserve-container" containerID="cri-o://a6927c9c6182234d41597336774fd131d8b88641ea0da68f0d6f8da1a6368d40" gracePeriod=30 Apr 16 18:27:40.165842 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:40.165805 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv"] Apr 16 18:27:40.166204 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:40.166191 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bddcb801-74e3-4f5a-b095-39274ec92ac0" containerName="console" Apr 16 18:27:40.166204 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:40.166204 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="bddcb801-74e3-4f5a-b095-39274ec92ac0" containerName="console" Apr 16 18:27:40.166330 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:40.166269 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="bddcb801-74e3-4f5a-b095-39274ec92ac0" containerName="console" Apr 16 18:27:40.169536 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:40.169517 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" Apr 16 18:27:40.175589 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:40.175560 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv"] Apr 16 18:27:40.181345 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:40.181325 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" Apr 16 18:27:40.324970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:40.324938 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv"] Apr 16 18:27:40.327508 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:27:40.327476 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbefbd30e_91ed_4fac_a5aa_111f3b56f22c.slice/crio-4ebd0cb53bef17a27771364b69fce93bd400d32f3822da3df56dee783fa60454 WatchSource:0}: Error finding container 4ebd0cb53bef17a27771364b69fce93bd400d32f3822da3df56dee783fa60454: Status 404 returned error can't find the container with id 4ebd0cb53bef17a27771364b69fce93bd400d32f3822da3df56dee783fa60454 Apr 16 18:27:40.932097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:40.932035 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" event={"ID":"befbd30e-91ed-4fac-a5aa-111f3b56f22c","Type":"ContainerStarted","Data":"8eae2c6ea13e5f42a82cd928df76e61da8420f594eb180622e7f57ae6d4af308"} Apr 16 18:27:40.932097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:40.932100 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" event={"ID":"befbd30e-91ed-4fac-a5aa-111f3b56f22c","Type":"ContainerStarted","Data":"4ebd0cb53bef17a27771364b69fce93bd400d32f3822da3df56dee783fa60454"} Apr 16 18:27:40.951082 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:40.951000 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" podStartSLOduration=0.950982469 podStartE2EDuration="950.982469ms" podCreationTimestamp="2026-04-16 18:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:27:40.950174312 +0000 UTC m=+594.729638642" watchObservedRunningTime="2026-04-16 18:27:40.950982469 +0000 UTC m=+594.730446799" Apr 16 18:27:41.937107 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:41.937075 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" Apr 16 18:27:41.938740 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:41.938708 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" podUID="befbd30e-91ed-4fac-a5aa-111f3b56f22c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:27:42.941215 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:42.941178 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" podUID="befbd30e-91ed-4fac-a5aa-111f3b56f22c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:27:43.367473 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:43.367450 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" Apr 16 18:27:43.944783 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:43.944751 2570 generic.go:358] "Generic (PLEG): container finished" podID="dd291430-1c3f-47fa-bd8c-99941ebf9d84" containerID="a6927c9c6182234d41597336774fd131d8b88641ea0da68f0d6f8da1a6368d40" exitCode=0 Apr 16 18:27:43.945243 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:43.944812 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" Apr 16 18:27:43.945243 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:43.944825 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" event={"ID":"dd291430-1c3f-47fa-bd8c-99941ebf9d84","Type":"ContainerDied","Data":"a6927c9c6182234d41597336774fd131d8b88641ea0da68f0d6f8da1a6368d40"} Apr 16 18:27:43.945243 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:43.944850 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk" event={"ID":"dd291430-1c3f-47fa-bd8c-99941ebf9d84","Type":"ContainerDied","Data":"781a06b869f2c78f00ba7d043bcbce10f84e493897b7ecae84aa2fb318fb02e4"} Apr 16 18:27:43.945243 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:43.944867 2570 scope.go:117] "RemoveContainer" containerID="a6927c9c6182234d41597336774fd131d8b88641ea0da68f0d6f8da1a6368d40" Apr 16 18:27:43.953104 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:43.953084 2570 scope.go:117] "RemoveContainer" containerID="a6927c9c6182234d41597336774fd131d8b88641ea0da68f0d6f8da1a6368d40" Apr 16 18:27:43.953388 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:27:43.953369 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6927c9c6182234d41597336774fd131d8b88641ea0da68f0d6f8da1a6368d40\": container with ID starting with a6927c9c6182234d41597336774fd131d8b88641ea0da68f0d6f8da1a6368d40 not found: ID does not exist" containerID="a6927c9c6182234d41597336774fd131d8b88641ea0da68f0d6f8da1a6368d40" Apr 16 18:27:43.953451 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:43.953400 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6927c9c6182234d41597336774fd131d8b88641ea0da68f0d6f8da1a6368d40"} err="failed to get container status \"a6927c9c6182234d41597336774fd131d8b88641ea0da68f0d6f8da1a6368d40\": rpc error: code = NotFound desc = could not find container \"a6927c9c6182234d41597336774fd131d8b88641ea0da68f0d6f8da1a6368d40\": container with ID starting with a6927c9c6182234d41597336774fd131d8b88641ea0da68f0d6f8da1a6368d40 not found: ID does not exist" Apr 16 18:27:43.967419 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:43.967389 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk"] Apr 16 18:27:43.971523 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:43.971501 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3596a-predictor-79bf5f4878-g9cdk"] Apr 16 18:27:44.839160 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:44.839127 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd291430-1c3f-47fa-bd8c-99941ebf9d84" path="/var/lib/kubelet/pods/dd291430-1c3f-47fa-bd8c-99941ebf9d84/volumes" Apr 16 18:27:46.742366 ip-10-0-128-74 
kubenswrapper[2570]: I0416 18:27:46.742340 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:27:46.742886 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:46.742340 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:27:52.942015 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:27:52.941977 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" podUID="befbd30e-91ed-4fac-a5aa-111f3b56f22c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:28:02.941632 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:02.941536 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" podUID="befbd30e-91ed-4fac-a5aa-111f3b56f22c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:28:12.941953 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:12.941903 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" podUID="befbd30e-91ed-4fac-a5aa-111f3b56f22c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:28:22.942016 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:22.941969 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" podUID="befbd30e-91ed-4fac-a5aa-111f3b56f22c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:28:29.939341 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:29.939310 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp"] Apr 16 18:28:29.939818 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:29.939709 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd291430-1c3f-47fa-bd8c-99941ebf9d84" containerName="kserve-container" Apr 16 18:28:29.939818 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:29.939722 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd291430-1c3f-47fa-bd8c-99941ebf9d84" containerName="kserve-container" Apr 16 18:28:29.939818 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:29.939785 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd291430-1c3f-47fa-bd8c-99941ebf9d84" containerName="kserve-container" Apr 16 18:28:29.942822 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:29.942801 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" Apr 16 18:28:29.951454 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:29.951426 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp"] Apr 16 18:28:29.956688 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:29.956667 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" Apr 16 18:28:30.092564 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:30.092539 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp"] Apr 16 18:28:30.093908 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:28:30.093876 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3bcdd96_015e_40eb_ac17_465d4d2befaa.slice/crio-29f247567855f11310ca92d21925281fd8094a111d6093dbc6c83e68bef1825c WatchSource:0}: Error finding container 29f247567855f11310ca92d21925281fd8094a111d6093dbc6c83e68bef1825c: Status 404 returned error can't find the container with id 29f247567855f11310ca92d21925281fd8094a111d6093dbc6c83e68bef1825c Apr 16 18:28:30.095801 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:30.095784 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:28:30.117360 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:30.117330 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" event={"ID":"d3bcdd96-015e-40eb-ac17-465d4d2befaa","Type":"ContainerStarted","Data":"29f247567855f11310ca92d21925281fd8094a111d6093dbc6c83e68bef1825c"} Apr 16 18:28:31.122117 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:31.122083 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" event={"ID":"d3bcdd96-015e-40eb-ac17-465d4d2befaa","Type":"ContainerStarted","Data":"1a2fcd436012139fae8bad08611db0b6cc92a20acc9174115525e4dbdfbd2131"} Apr 16 18:28:31.122584 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:31.122273 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" Apr 16 18:28:31.123394 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:31.123374 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" podUID="d3bcdd96-015e-40eb-ac17-465d4d2befaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 16 18:28:31.144762 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:31.144716 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" podStartSLOduration=2.14470318 podStartE2EDuration="2.14470318s" podCreationTimestamp="2026-04-16 18:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:28:31.14444113 +0000 UTC m=+644.923905461" watchObservedRunningTime="2026-04-16 18:28:31.14470318 +0000 UTC m=+644.924167511" Apr 16 18:28:32.125751 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:32.125716 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" podUID="d3bcdd96-015e-40eb-ac17-465d4d2befaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 16 18:28:32.942347 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:32.942316 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" Apr 16 18:28:42.126318 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:42.126269 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" podUID="d3bcdd96-015e-40eb-ac17-465d4d2befaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 16 18:28:52.125954 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:28:52.125915 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" podUID="d3bcdd96-015e-40eb-ac17-465d4d2befaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 16 18:29:02.126315 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:29:02.126268 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" podUID="d3bcdd96-015e-40eb-ac17-465d4d2befaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 16 18:29:12.126606 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:29:12.126552 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" podUID="d3bcdd96-015e-40eb-ac17-465d4d2befaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 16 18:29:22.127505 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:29:22.127473 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" Apr 16 18:32:46.777824 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:32:46.777797 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:32:46.778869 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:32:46.778844 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:37:05.069548 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:05.069348 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv"] Apr 16 18:37:05.070092 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:05.069668 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" podUID="befbd30e-91ed-4fac-a5aa-111f3b56f22c" containerName="kserve-container" containerID="cri-o://8eae2c6ea13e5f42a82cd928df76e61da8420f594eb180622e7f57ae6d4af308" gracePeriod=30 Apr 16 18:37:05.086616 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:05.086580 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2"] Apr 16 18:37:05.090068 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:05.090038 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" Apr 16 18:37:05.097813 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:05.097789 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2"] Apr 16 18:37:05.101119 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:05.101102 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" Apr 16 18:37:05.238326 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:05.238302 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2"] Apr 16 18:37:05.240714 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:37:05.240688 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0526f50e_2db5_427f_854b_5e708f881c7d.slice/crio-72763554ba3d0625adb31fe6e3ff36baf7bb311626ee4042a52516be362db15a WatchSource:0}: Error finding container 72763554ba3d0625adb31fe6e3ff36baf7bb311626ee4042a52516be362db15a: Status 404 returned error can't find the container with id 72763554ba3d0625adb31fe6e3ff36baf7bb311626ee4042a52516be362db15a Apr 16 18:37:05.245421 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:05.245397 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:37:05.876806 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:05.876775 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" event={"ID":"0526f50e-2db5-427f-854b-5e708f881c7d","Type":"ContainerStarted","Data":"16ee9ad51e4a3241618d295b13233e4b83faa19acc5551b38895d5ba442503f2"} Apr 16 18:37:05.876969 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:05.876814 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" event={"ID":"0526f50e-2db5-427f-854b-5e708f881c7d","Type":"ContainerStarted","Data":"72763554ba3d0625adb31fe6e3ff36baf7bb311626ee4042a52516be362db15a"} Apr 16 18:37:05.877037 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:05.877020 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" Apr 16 18:37:05.878312 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:05.878290 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" podUID="0526f50e-2db5-427f-854b-5e708f881c7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:37:05.895690 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:05.895647 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" podStartSLOduration=0.895634688 podStartE2EDuration="895.634688ms" podCreationTimestamp="2026-04-16 18:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:37:05.893681197 +0000 UTC m=+1159.673145527" watchObservedRunningTime="2026-04-16 18:37:05.895634688 +0000 UTC m=+1159.675099017" Apr 16 18:37:06.880665 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:06.880622 2570 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" podUID="0526f50e-2db5-427f-854b-5e708f881c7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:37:08.220949 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:08.220926 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" Apr 16 18:37:08.888469 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:08.888435 2570 generic.go:358] "Generic (PLEG): container finished" podID="befbd30e-91ed-4fac-a5aa-111f3b56f22c" containerID="8eae2c6ea13e5f42a82cd928df76e61da8420f594eb180622e7f57ae6d4af308" exitCode=0 Apr 16 18:37:08.888632 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:08.888496 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" event={"ID":"befbd30e-91ed-4fac-a5aa-111f3b56f22c","Type":"ContainerDied","Data":"8eae2c6ea13e5f42a82cd928df76e61da8420f594eb180622e7f57ae6d4af308"} Apr 16 18:37:08.888632 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:08.888503 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" Apr 16 18:37:08.888632 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:08.888529 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv" event={"ID":"befbd30e-91ed-4fac-a5aa-111f3b56f22c","Type":"ContainerDied","Data":"4ebd0cb53bef17a27771364b69fce93bd400d32f3822da3df56dee783fa60454"} Apr 16 18:37:08.888632 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:08.888548 2570 scope.go:117] "RemoveContainer" containerID="8eae2c6ea13e5f42a82cd928df76e61da8420f594eb180622e7f57ae6d4af308" Apr 16 18:37:08.896539 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:08.896518 2570 scope.go:117] "RemoveContainer" containerID="8eae2c6ea13e5f42a82cd928df76e61da8420f594eb180622e7f57ae6d4af308" Apr 16 18:37:08.896776 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:37:08.896756 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eae2c6ea13e5f42a82cd928df76e61da8420f594eb180622e7f57ae6d4af308\": container with ID starting with 8eae2c6ea13e5f42a82cd928df76e61da8420f594eb180622e7f57ae6d4af308 not found: ID does not exist" containerID="8eae2c6ea13e5f42a82cd928df76e61da8420f594eb180622e7f57ae6d4af308" Apr 16 18:37:08.896830 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:08.896784 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eae2c6ea13e5f42a82cd928df76e61da8420f594eb180622e7f57ae6d4af308"} err="failed to get container status \"8eae2c6ea13e5f42a82cd928df76e61da8420f594eb180622e7f57ae6d4af308\": rpc error: code = NotFound desc = could not find container \"8eae2c6ea13e5f42a82cd928df76e61da8420f594eb180622e7f57ae6d4af308\": container with ID starting with 8eae2c6ea13e5f42a82cd928df76e61da8420f594eb180622e7f57ae6d4af308 not found: ID does not exist" Apr 16 18:37:08.908545 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:08.908517 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv"] Apr 16 18:37:08.915876 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:08.915853 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-6138a-predictor-64d77c84bd-gcvtv"] Apr 16 18:37:10.839312 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:10.839283 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="befbd30e-91ed-4fac-a5aa-111f3b56f22c" path="/var/lib/kubelet/pods/befbd30e-91ed-4fac-a5aa-111f3b56f22c/volumes" Apr 16 18:37:16.881492 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:16.881441 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" podUID="0526f50e-2db5-427f-854b-5e708f881c7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:37:26.881666 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:26.881621 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" podUID="0526f50e-2db5-427f-854b-5e708f881c7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:37:36.881645 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:36.881608 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" podUID="0526f50e-2db5-427f-854b-5e708f881c7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:37:46.807765 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:46.807735 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:37:46.810174 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:46.810156 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:37:46.881232 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:46.881193 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" podUID="0526f50e-2db5-427f-854b-5e708f881c7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:37:54.769537 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:54.769502 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp"] Apr 16 18:37:54.769998 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:54.769836 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" podUID="d3bcdd96-015e-40eb-ac17-465d4d2befaa" containerName="kserve-container" containerID="cri-o://1a2fcd436012139fae8bad08611db0b6cc92a20acc9174115525e4dbdfbd2131" gracePeriod=30 Apr 16 18:37:54.797195 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:54.797160 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8"] Apr 16 18:37:54.797722 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:54.797705 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="befbd30e-91ed-4fac-a5aa-111f3b56f22c" containerName="kserve-container" Apr 16 18:37:54.797782 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:54.797725 2570 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="befbd30e-91ed-4fac-a5aa-111f3b56f22c" containerName="kserve-container" Apr 16 18:37:54.797843 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:54.797832 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="befbd30e-91ed-4fac-a5aa-111f3b56f22c" containerName="kserve-container" Apr 16 18:37:54.803013 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:54.802979 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" Apr 16 18:37:54.803246 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:54.803222 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8"] Apr 16 18:37:54.812979 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:54.812961 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" Apr 16 18:37:54.946388 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:54.946362 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8"] Apr 16 18:37:54.948875 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:37:54.948845 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86f26ab8_57ab_4d69_a4b6_452207b4d41f.slice/crio-00c26f30a5ef3b4cfd5a2df2a2a6c9c6b9c0ddf9ed5d455a081038056a8c642c WatchSource:0}: Error finding container 00c26f30a5ef3b4cfd5a2df2a2a6c9c6b9c0ddf9ed5d455a081038056a8c642c: Status 404 returned error can't find the container with id 00c26f30a5ef3b4cfd5a2df2a2a6c9c6b9c0ddf9ed5d455a081038056a8c642c Apr 16 18:37:55.047446 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:55.047419 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" event={"ID":"86f26ab8-57ab-4d69-a4b6-452207b4d41f","Type":"ContainerStarted","Data":"00c26f30a5ef3b4cfd5a2df2a2a6c9c6b9c0ddf9ed5d455a081038056a8c642c"} Apr 16 18:37:56.052072 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:56.052016 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" event={"ID":"86f26ab8-57ab-4d69-a4b6-452207b4d41f","Type":"ContainerStarted","Data":"198d11e1a69a9cad7a2e4a3904bb1a56639faeb90a96f2ce1b2c3001c718f6e1"} Apr 16 18:37:56.052450 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:56.052227 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" Apr 16 18:37:56.053507 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:56.053482 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" podUID="86f26ab8-57ab-4d69-a4b6-452207b4d41f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:37:56.069430 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:56.069390 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" podStartSLOduration=2.069379423 podStartE2EDuration="2.069379423s" podCreationTimestamp="2026-04-16 18:37:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
18:37:56.067981442 +0000 UTC m=+1209.847445774" watchObservedRunningTime="2026-04-16 18:37:56.069379423 +0000 UTC m=+1209.848843753" Apr 16 18:37:56.882254 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:56.882224 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" Apr 16 18:37:57.055321 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:57.055287 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" podUID="86f26ab8-57ab-4d69-a4b6-452207b4d41f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:37:57.816460 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:57.816434 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" Apr 16 18:37:58.059581 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:58.059550 2570 generic.go:358] "Generic (PLEG): container finished" podID="d3bcdd96-015e-40eb-ac17-465d4d2befaa" containerID="1a2fcd436012139fae8bad08611db0b6cc92a20acc9174115525e4dbdfbd2131" exitCode=0 Apr 16 18:37:58.060001 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:58.059611 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" Apr 16 18:37:58.060001 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:58.059643 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" event={"ID":"d3bcdd96-015e-40eb-ac17-465d4d2befaa","Type":"ContainerDied","Data":"1a2fcd436012139fae8bad08611db0b6cc92a20acc9174115525e4dbdfbd2131"} Apr 16 18:37:58.060001 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:58.059687 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp" event={"ID":"d3bcdd96-015e-40eb-ac17-465d4d2befaa","Type":"ContainerDied","Data":"29f247567855f11310ca92d21925281fd8094a111d6093dbc6c83e68bef1825c"} Apr 16 18:37:58.060001 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:58.059713 2570 scope.go:117] "RemoveContainer" containerID="1a2fcd436012139fae8bad08611db0b6cc92a20acc9174115525e4dbdfbd2131" Apr 16 18:37:58.068293 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:58.068275 2570 scope.go:117] "RemoveContainer" containerID="1a2fcd436012139fae8bad08611db0b6cc92a20acc9174115525e4dbdfbd2131" Apr 16 18:37:58.068525 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:37:58.068505 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a2fcd436012139fae8bad08611db0b6cc92a20acc9174115525e4dbdfbd2131\": container with ID starting with 1a2fcd436012139fae8bad08611db0b6cc92a20acc9174115525e4dbdfbd2131 not found: ID does not exist" containerID="1a2fcd436012139fae8bad08611db0b6cc92a20acc9174115525e4dbdfbd2131" Apr 16 18:37:58.068586 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:58.068537 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a2fcd436012139fae8bad08611db0b6cc92a20acc9174115525e4dbdfbd2131"} err="failed to get container status \"1a2fcd436012139fae8bad08611db0b6cc92a20acc9174115525e4dbdfbd2131\": rpc error: code = NotFound desc = could not find container \"1a2fcd436012139fae8bad08611db0b6cc92a20acc9174115525e4dbdfbd2131\": 
container with ID starting with 1a2fcd436012139fae8bad08611db0b6cc92a20acc9174115525e4dbdfbd2131 not found: ID does not exist" Apr 16 18:37:58.083878 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:58.083849 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp"] Apr 16 18:37:58.088192 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:58.088173 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b16e7-predictor-75968db7b-fhfcp"] Apr 16 18:37:58.839038 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:37:58.838998 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3bcdd96-015e-40eb-ac17-465d4d2befaa" path="/var/lib/kubelet/pods/d3bcdd96-015e-40eb-ac17-465d4d2befaa/volumes" Apr 16 18:38:07.055546 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:07.055502 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" podUID="86f26ab8-57ab-4d69-a4b6-452207b4d41f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:38:17.055996 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:17.055951 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" podUID="86f26ab8-57ab-4d69-a4b6-452207b4d41f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:38:25.311388 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:25.311350 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2"] Apr 16 18:38:25.311888 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:25.311609 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" podUID="0526f50e-2db5-427f-854b-5e708f881c7d" containerName="kserve-container" containerID="cri-o://16ee9ad51e4a3241618d295b13233e4b83faa19acc5551b38895d5ba442503f2" gracePeriod=30 Apr 16 18:38:25.444774 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:25.444736 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq"] Apr 16 18:38:25.445201 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:25.445184 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3bcdd96-015e-40eb-ac17-465d4d2befaa" containerName="kserve-container" Apr 16 18:38:25.445288 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:25.445205 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3bcdd96-015e-40eb-ac17-465d4d2befaa" containerName="kserve-container" Apr 16 18:38:25.445341 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:25.445289 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3bcdd96-015e-40eb-ac17-465d4d2befaa" containerName="kserve-container" Apr 16 18:38:25.448333 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:25.448314 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" Apr 16 18:38:25.456073 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:25.456034 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq"] Apr 16 18:38:25.458428 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:25.458412 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" Apr 16 18:38:25.590599 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:25.590563 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq"] Apr 16 18:38:25.592174 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:38:25.592147 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37b339f3_0104_4739_b6fc_8cca0f80fcbd.slice/crio-bd28ab760a1e9c519f9489d9055dd95288bc026d1a59128bec2585c2d28ae439 WatchSource:0}: Error finding container bd28ab760a1e9c519f9489d9055dd95288bc026d1a59128bec2585c2d28ae439: Status 404 returned error can't find the container with id bd28ab760a1e9c519f9489d9055dd95288bc026d1a59128bec2585c2d28ae439 Apr 16 18:38:26.159144 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:26.159049 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" event={"ID":"37b339f3-0104-4739-b6fc-8cca0f80fcbd","Type":"ContainerStarted","Data":"57b0eb8aa1b2dbb2906dfbed770d37a669ca7ba739d4efc4d29f135311d919d6"} Apr 16 18:38:26.159144 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:26.159094 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" event={"ID":"37b339f3-0104-4739-b6fc-8cca0f80fcbd","Type":"ContainerStarted","Data":"bd28ab760a1e9c519f9489d9055dd95288bc026d1a59128bec2585c2d28ae439"} Apr 16 18:38:26.159341 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:26.159210 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" Apr 16 18:38:26.160274 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:26.160251 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" podUID="37b339f3-0104-4739-b6fc-8cca0f80fcbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:38:26.175705 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:26.175661 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" podStartSLOduration=1.175650556 podStartE2EDuration="1.175650556s" podCreationTimestamp="2026-04-16 18:38:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:38:26.175088209 +0000 UTC m=+1239.954552538" watchObservedRunningTime="2026-04-16 18:38:26.175650556 +0000 UTC m=+1239.955114885" Apr 16 18:38:26.880938 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:26.880897 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" podUID="0526f50e-2db5-427f-854b-5e708f881c7d" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 18:38:27.056029 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:27.055990 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" podUID="86f26ab8-57ab-4d69-a4b6-452207b4d41f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:38:27.162548 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:27.162465 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" podUID="37b339f3-0104-4739-b6fc-8cca0f80fcbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:38:31.455647 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:31.455620 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" Apr 16 18:38:32.180985 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:32.180951 2570 generic.go:358] "Generic (PLEG): container finished" podID="0526f50e-2db5-427f-854b-5e708f881c7d" containerID="16ee9ad51e4a3241618d295b13233e4b83faa19acc5551b38895d5ba442503f2" exitCode=0 Apr 16 18:38:32.181205 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:32.181020 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" Apr 16 18:38:32.181205 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:32.181022 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" event={"ID":"0526f50e-2db5-427f-854b-5e708f881c7d","Type":"ContainerDied","Data":"16ee9ad51e4a3241618d295b13233e4b83faa19acc5551b38895d5ba442503f2"} Apr 16 18:38:32.181205 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:32.181092 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2" event={"ID":"0526f50e-2db5-427f-854b-5e708f881c7d","Type":"ContainerDied","Data":"72763554ba3d0625adb31fe6e3ff36baf7bb311626ee4042a52516be362db15a"} Apr 16 18:38:32.181205 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:32.181107 2570 scope.go:117] "RemoveContainer" containerID="16ee9ad51e4a3241618d295b13233e4b83faa19acc5551b38895d5ba442503f2" Apr 16 18:38:32.188922 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:32.188905 2570 scope.go:117] "RemoveContainer" containerID="16ee9ad51e4a3241618d295b13233e4b83faa19acc5551b38895d5ba442503f2" Apr 16 18:38:32.189193 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:38:32.189174 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ee9ad51e4a3241618d295b13233e4b83faa19acc5551b38895d5ba442503f2\": container with ID starting with 16ee9ad51e4a3241618d295b13233e4b83faa19acc5551b38895d5ba442503f2 not found: ID does not exist" containerID="16ee9ad51e4a3241618d295b13233e4b83faa19acc5551b38895d5ba442503f2" Apr 16 18:38:32.189247 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:32.189217 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ee9ad51e4a3241618d295b13233e4b83faa19acc5551b38895d5ba442503f2"} err="failed to get container status \"16ee9ad51e4a3241618d295b13233e4b83faa19acc5551b38895d5ba442503f2\": rpc error: code = NotFound 
desc = could not find container \"16ee9ad51e4a3241618d295b13233e4b83faa19acc5551b38895d5ba442503f2\": container with ID starting with 16ee9ad51e4a3241618d295b13233e4b83faa19acc5551b38895d5ba442503f2 not found: ID does not exist" Apr 16 18:38:32.202947 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:32.202926 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2"] Apr 16 18:38:32.207694 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:32.207676 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-091ad-predictor-8bbcb8548-l26p2"] Apr 16 18:38:32.838707 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:32.838667 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0526f50e-2db5-427f-854b-5e708f881c7d" path="/var/lib/kubelet/pods/0526f50e-2db5-427f-854b-5e708f881c7d/volumes" Apr 16 18:38:37.055551 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:37.055510 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" podUID="86f26ab8-57ab-4d69-a4b6-452207b4d41f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:38:37.163138 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:37.163098 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" podUID="37b339f3-0104-4739-b6fc-8cca0f80fcbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:38:47.056362 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:47.056331 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" Apr 16 18:38:47.162949 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:47.162857 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" podUID="37b339f3-0104-4739-b6fc-8cca0f80fcbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:38:57.163097 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:38:57.163028 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" podUID="37b339f3-0104-4739-b6fc-8cca0f80fcbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:39:07.163322 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:07.163278 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" podUID="37b339f3-0104-4739-b6fc-8cca0f80fcbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:39:14.889613 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:14.889572 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8"] Apr 16 18:39:14.890037 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:14.889865 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" podUID="86f26ab8-57ab-4d69-a4b6-452207b4d41f" containerName="kserve-container" 
containerID="cri-o://198d11e1a69a9cad7a2e4a3904bb1a56639faeb90a96f2ce1b2c3001c718f6e1" gracePeriod=30 Apr 16 18:39:15.040260 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:15.040221 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb"] Apr 16 18:39:15.040676 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:15.040658 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0526f50e-2db5-427f-854b-5e708f881c7d" containerName="kserve-container" Apr 16 18:39:15.040765 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:15.040677 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0526f50e-2db5-427f-854b-5e708f881c7d" containerName="kserve-container" Apr 16 18:39:15.040816 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:15.040774 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0526f50e-2db5-427f-854b-5e708f881c7d" containerName="kserve-container" Apr 16 18:39:15.043942 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:15.043921 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" Apr 16 18:39:15.053293 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:15.053272 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" Apr 16 18:39:15.056924 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:15.056902 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb"] Apr 16 18:39:15.183757 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:15.183725 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb"] Apr 16 18:39:15.186801 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:39:15.186773 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82e0022e_89ce_4a30_9ce0_6794d78d213d.slice/crio-67663c151357ccea7d15c331a1a6b0f65e7de46e669aa5805275208aca80325a WatchSource:0}: Error finding container 67663c151357ccea7d15c331a1a6b0f65e7de46e669aa5805275208aca80325a: Status 404 returned error can't find the container with id 67663c151357ccea7d15c331a1a6b0f65e7de46e669aa5805275208aca80325a Apr 16 18:39:15.344047 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:15.344010 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" event={"ID":"82e0022e-89ce-4a30-9ce0-6794d78d213d","Type":"ContainerStarted","Data":"e44874533a52813a4ac587cd55e8fc67fbd07e2f5e9665161e22da1ded9c939c"} Apr 16 18:39:15.344047 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:15.344046 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" event={"ID":"82e0022e-89ce-4a30-9ce0-6794d78d213d","Type":"ContainerStarted","Data":"67663c151357ccea7d15c331a1a6b0f65e7de46e669aa5805275208aca80325a"} Apr 16 18:39:15.344299 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:15.344270 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" Apr 16 18:39:15.345617 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:15.345591 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" podUID="82e0022e-89ce-4a30-9ce0-6794d78d213d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:39:15.362355 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:15.362283 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" podStartSLOduration=0.362271672 podStartE2EDuration="362.271672ms" podCreationTimestamp="2026-04-16 18:39:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:39:15.359999151 +0000 UTC m=+1289.139463482" watchObservedRunningTime="2026-04-16 18:39:15.362271672 +0000 UTC m=+1289.141736041" Apr 16 18:39:16.347838 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:16.347796 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" podUID="82e0022e-89ce-4a30-9ce0-6794d78d213d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:39:17.056366 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:17.056320 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" podUID="86f26ab8-57ab-4d69-a4b6-452207b4d41f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:39:17.163762 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:17.163728 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" Apr 16 18:39:18.356401 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:18.356368 2570 generic.go:358] "Generic (PLEG): container finished" podID="86f26ab8-57ab-4d69-a4b6-452207b4d41f" containerID="198d11e1a69a9cad7a2e4a3904bb1a56639faeb90a96f2ce1b2c3001c718f6e1" exitCode=0 Apr 16 18:39:18.358610 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:18.356445 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" event={"ID":"86f26ab8-57ab-4d69-a4b6-452207b4d41f","Type":"ContainerDied","Data":"198d11e1a69a9cad7a2e4a3904bb1a56639faeb90a96f2ce1b2c3001c718f6e1"} Apr 16 18:39:18.536539 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:18.536516 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" Apr 16 18:39:19.360776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:19.360742 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" Apr 16 18:39:19.360776 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:19.360754 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8" event={"ID":"86f26ab8-57ab-4d69-a4b6-452207b4d41f","Type":"ContainerDied","Data":"00c26f30a5ef3b4cfd5a2df2a2a6c9c6b9c0ddf9ed5d455a081038056a8c642c"} Apr 16 18:39:19.361279 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:19.360800 2570 scope.go:117] "RemoveContainer" containerID="198d11e1a69a9cad7a2e4a3904bb1a56639faeb90a96f2ce1b2c3001c718f6e1" Apr 16 18:39:19.380623 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:19.380596 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8"] Apr 16 18:39:19.383211 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:19.383185 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0f5d9-predictor-67c84f489d-xh7g8"] Apr 16 18:39:20.839415 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:20.839381 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f26ab8-57ab-4d69-a4b6-452207b4d41f" path="/var/lib/kubelet/pods/86f26ab8-57ab-4d69-a4b6-452207b4d41f/volumes" Apr 16 18:39:26.348879 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:26.348828 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" podUID="82e0022e-89ce-4a30-9ce0-6794d78d213d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:39:36.348655 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:36.348613 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" podUID="82e0022e-89ce-4a30-9ce0-6794d78d213d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:39:46.348415 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:46.348375 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" podUID="82e0022e-89ce-4a30-9ce0-6794d78d213d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:39:56.348104 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:39:56.347997 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" podUID="82e0022e-89ce-4a30-9ce0-6794d78d213d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:40:06.349291 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:40:06.349257 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" Apr 16 18:42:46.832448 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:42:46.832424 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:42:46.837334 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:42:46.837309 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:47:46.858104 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:46.858078 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:47:46.862524 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:46.862504 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:47:50.298019 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:50.297983 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq"] Apr 16 18:47:50.298576 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:50.298284 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" podUID="37b339f3-0104-4739-b6fc-8cca0f80fcbd" containerName="kserve-container" containerID="cri-o://57b0eb8aa1b2dbb2906dfbed770d37a669ca7ba739d4efc4d29f135311d919d6" gracePeriod=30 Apr 16 18:47:50.322046 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:50.322013 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w"] Apr 16 18:47:50.322403 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:50.322391 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86f26ab8-57ab-4d69-a4b6-452207b4d41f" containerName="kserve-container" Apr 16 18:47:50.322451 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:50.322405 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f26ab8-57ab-4d69-a4b6-452207b4d41f" containerName="kserve-container" Apr 16 18:47:50.322493 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:50.322470 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="86f26ab8-57ab-4d69-a4b6-452207b4d41f" containerName="kserve-container" Apr 16 18:47:50.325968 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:50.325952 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" Apr 16 18:47:50.334927 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:50.334894 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w"] Apr 16 18:47:50.337249 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:50.337234 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" Apr 16 18:47:50.470250 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:50.470223 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w"] Apr 16 18:47:50.471849 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:47:50.471820 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e2602a1_85ca_4b15_af41_4de097fe39e7.slice/crio-cd4e9bd8fdfc18020ff469b4628aaef0f3e15ad32ad469068b3587307b2930d5 WatchSource:0}: Error finding container cd4e9bd8fdfc18020ff469b4628aaef0f3e15ad32ad469068b3587307b2930d5: Status 404 returned error can't find the container with id cd4e9bd8fdfc18020ff469b4628aaef0f3e15ad32ad469068b3587307b2930d5 Apr 16 18:47:50.473604 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:50.473590 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:47:51.117952 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:51.117908 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" event={"ID":"4e2602a1-85ca-4b15-af41-4de097fe39e7","Type":"ContainerStarted","Data":"385870aa7d4e35a3dd12121c2327dbc773ea4d13d83d33b385a7843968993856"} Apr 16 18:47:51.117952 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:51.117951 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" event={"ID":"4e2602a1-85ca-4b15-af41-4de097fe39e7","Type":"ContainerStarted","Data":"cd4e9bd8fdfc18020ff469b4628aaef0f3e15ad32ad469068b3587307b2930d5"} Apr 16 18:47:51.118219 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:51.118177 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" Apr 16 18:47:51.119431 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:51.119408 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" podUID="4e2602a1-85ca-4b15-af41-4de097fe39e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 18:47:51.134755 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:51.134712 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" podStartSLOduration=1.1346994559999999 podStartE2EDuration="1.134699456s" podCreationTimestamp="2026-04-16 18:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:47:51.132731979 +0000 UTC m=+1804.912196311" watchObservedRunningTime="2026-04-16 18:47:51.134699456 +0000 UTC m=+1804.914163786" Apr 16 18:47:52.121837 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:52.121804 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" podUID="4e2602a1-85ca-4b15-af41-4de097fe39e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 18:47:53.449823 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:53.449801 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" Apr 16 18:47:54.129750 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:54.129719 2570 generic.go:358] "Generic (PLEG): container finished" podID="37b339f3-0104-4739-b6fc-8cca0f80fcbd" containerID="57b0eb8aa1b2dbb2906dfbed770d37a669ca7ba739d4efc4d29f135311d919d6" exitCode=0 Apr 16 18:47:54.129935 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:54.129779 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" Apr 16 18:47:54.129935 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:54.129798 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" event={"ID":"37b339f3-0104-4739-b6fc-8cca0f80fcbd","Type":"ContainerDied","Data":"57b0eb8aa1b2dbb2906dfbed770d37a669ca7ba739d4efc4d29f135311d919d6"} Apr 16 18:47:54.129935 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:54.129835 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq" event={"ID":"37b339f3-0104-4739-b6fc-8cca0f80fcbd","Type":"ContainerDied","Data":"bd28ab760a1e9c519f9489d9055dd95288bc026d1a59128bec2585c2d28ae439"} Apr 16 18:47:54.129935 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:54.129847 2570 scope.go:117] "RemoveContainer" containerID="57b0eb8aa1b2dbb2906dfbed770d37a669ca7ba739d4efc4d29f135311d919d6" Apr 16 18:47:54.139115 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:54.139088 2570 scope.go:117] "RemoveContainer" containerID="57b0eb8aa1b2dbb2906dfbed770d37a669ca7ba739d4efc4d29f135311d919d6" Apr 16 18:47:54.139415 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:47:54.139390 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b0eb8aa1b2dbb2906dfbed770d37a669ca7ba739d4efc4d29f135311d919d6\": container with ID starting with 57b0eb8aa1b2dbb2906dfbed770d37a669ca7ba739d4efc4d29f135311d919d6 not found: ID does not exist" containerID="57b0eb8aa1b2dbb2906dfbed770d37a669ca7ba739d4efc4d29f135311d919d6" Apr 16 18:47:54.139511 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:54.139422 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b0eb8aa1b2dbb2906dfbed770d37a669ca7ba739d4efc4d29f135311d919d6"} err="failed to get container status \"57b0eb8aa1b2dbb2906dfbed770d37a669ca7ba739d4efc4d29f135311d919d6\": rpc error: code = NotFound desc = could not find container \"57b0eb8aa1b2dbb2906dfbed770d37a669ca7ba739d4efc4d29f135311d919d6\": container with ID starting with 57b0eb8aa1b2dbb2906dfbed770d37a669ca7ba739d4efc4d29f135311d919d6 not found: ID does not exist" Apr 16 18:47:54.154601 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:54.154559 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq"] Apr 16 18:47:54.159476 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:54.159450 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-976d9-predictor-6bd5967d8d-g7jhq"] Apr 16 18:47:54.838310 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:47:54.838273 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b339f3-0104-4739-b6fc-8cca0f80fcbd" path="/var/lib/kubelet/pods/37b339f3-0104-4739-b6fc-8cca0f80fcbd/volumes" Apr 16 18:48:02.122681 ip-10-0-128-74 
kubenswrapper[2570]: I0416 18:48:02.122632 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" podUID="4e2602a1-85ca-4b15-af41-4de097fe39e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 18:48:12.122838 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:12.122790 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" podUID="4e2602a1-85ca-4b15-af41-4de097fe39e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 18:48:22.122587 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:22.122534 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" podUID="4e2602a1-85ca-4b15-af41-4de097fe39e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 18:48:32.122762 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:32.122719 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" podUID="4e2602a1-85ca-4b15-af41-4de097fe39e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 18:48:39.728414 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:39.728375 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb"] Apr 16 18:48:39.728919 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:39.728642 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" podUID="82e0022e-89ce-4a30-9ce0-6794d78d213d" containerName="kserve-container" containerID="cri-o://e44874533a52813a4ac587cd55e8fc67fbd07e2f5e9665161e22da1ded9c939c" gracePeriod=30 Apr 16 18:48:39.848589 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:39.848553 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff"] Apr 16 18:48:39.848970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:39.848956 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37b339f3-0104-4739-b6fc-8cca0f80fcbd" containerName="kserve-container" Apr 16 18:48:39.848970 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:39.848971 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b339f3-0104-4739-b6fc-8cca0f80fcbd" containerName="kserve-container" Apr 16 18:48:39.849087 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:39.849046 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="37b339f3-0104-4739-b6fc-8cca0f80fcbd" containerName="kserve-container" Apr 16 18:48:39.851951 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:39.851934 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" Apr 16 18:48:39.861289 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:39.861269 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" Apr 16 18:48:39.866133 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:39.866109 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff"] Apr 16 18:48:39.994041 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:39.993963 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff"] Apr 16 18:48:39.997713 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:48:39.997675 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7af7ef2_6558_4525_be3d_8a0e4ae14439.slice/crio-64c9c5b10156da014269e1fb97c0d307d4972ca3d548db0328a6d18672bd84ed WatchSource:0}: Error finding container 64c9c5b10156da014269e1fb97c0d307d4972ca3d548db0328a6d18672bd84ed: Status 404 returned error can't find the container with id 64c9c5b10156da014269e1fb97c0d307d4972ca3d548db0328a6d18672bd84ed Apr 16 18:48:40.299642 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:40.299610 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" event={"ID":"e7af7ef2-6558-4525-be3d-8a0e4ae14439","Type":"ContainerStarted","Data":"abaac0189ec3798d4c2afa2e69a1aacf3cbc9a470716aab3f9ca01d366ec4248"} Apr 16 18:48:40.299642 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:40.299646 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" event={"ID":"e7af7ef2-6558-4525-be3d-8a0e4ae14439","Type":"ContainerStarted","Data":"64c9c5b10156da014269e1fb97c0d307d4972ca3d548db0328a6d18672bd84ed"} Apr 16 18:48:40.299846 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:40.299760 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" Apr 16 18:48:40.301142 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:40.301120 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" podUID="e7af7ef2-6558-4525-be3d-8a0e4ae14439" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 18:48:40.316690 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:40.316650 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" podStartSLOduration=1.316637699 podStartE2EDuration="1.316637699s" podCreationTimestamp="2026-04-16 18:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:48:40.315097309 +0000 UTC m=+1854.094561638" watchObservedRunningTime="2026-04-16 18:48:40.316637699 +0000 UTC m=+1854.096102029" Apr 16 18:48:41.303634 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:41.303593 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" podUID="e7af7ef2-6558-4525-be3d-8a0e4ae14439" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 18:48:42.123452 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:42.123416 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" Apr 16 18:48:42.877377 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:42.877352 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" Apr 16 18:48:43.312078 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:43.312036 2570 generic.go:358] "Generic (PLEG): container finished" podID="82e0022e-89ce-4a30-9ce0-6794d78d213d" containerID="e44874533a52813a4ac587cd55e8fc67fbd07e2f5e9665161e22da1ded9c939c" exitCode=0 Apr 16 18:48:43.312249 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:43.312083 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" event={"ID":"82e0022e-89ce-4a30-9ce0-6794d78d213d","Type":"ContainerDied","Data":"e44874533a52813a4ac587cd55e8fc67fbd07e2f5e9665161e22da1ded9c939c"} Apr 16 18:48:43.312249 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:43.312121 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" Apr 16 18:48:43.312249 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:43.312136 2570 scope.go:117] "RemoveContainer" containerID="e44874533a52813a4ac587cd55e8fc67fbd07e2f5e9665161e22da1ded9c939c" Apr 16 18:48:43.312249 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:43.312124 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb" event={"ID":"82e0022e-89ce-4a30-9ce0-6794d78d213d","Type":"ContainerDied","Data":"67663c151357ccea7d15c331a1a6b0f65e7de46e669aa5805275208aca80325a"} Apr 16 18:48:43.320444 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:43.320428 2570 scope.go:117] "RemoveContainer" containerID="e44874533a52813a4ac587cd55e8fc67fbd07e2f5e9665161e22da1ded9c939c" Apr 16 18:48:43.320687 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:48:43.320668 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e44874533a52813a4ac587cd55e8fc67fbd07e2f5e9665161e22da1ded9c939c\": container with ID starting with e44874533a52813a4ac587cd55e8fc67fbd07e2f5e9665161e22da1ded9c939c not found: ID does not exist" containerID="e44874533a52813a4ac587cd55e8fc67fbd07e2f5e9665161e22da1ded9c939c" Apr 16 18:48:43.320749 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:43.320700 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e44874533a52813a4ac587cd55e8fc67fbd07e2f5e9665161e22da1ded9c939c"} err="failed to get container status \"e44874533a52813a4ac587cd55e8fc67fbd07e2f5e9665161e22da1ded9c939c\": rpc error: code = NotFound desc = could not find container \"e44874533a52813a4ac587cd55e8fc67fbd07e2f5e9665161e22da1ded9c939c\": container with ID starting with e44874533a52813a4ac587cd55e8fc67fbd07e2f5e9665161e22da1ded9c939c not found: ID does not exist" Apr 16 18:48:43.334663 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:43.334635 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb"] Apr 16 18:48:43.339777 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:43.339750 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5aeb1-predictor-6455bd6978-5jrrb"] Apr 16 18:48:44.838664 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:44.838634 2570 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="82e0022e-89ce-4a30-9ce0-6794d78d213d" path="/var/lib/kubelet/pods/82e0022e-89ce-4a30-9ce0-6794d78d213d/volumes" Apr 16 18:48:51.304039 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:48:51.303991 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" podUID="e7af7ef2-6558-4525-be3d-8a0e4ae14439" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 18:49:01.304527 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:01.304479 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" podUID="e7af7ef2-6558-4525-be3d-8a0e4ae14439" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 18:49:10.625984 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:10.625940 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w"] Apr 16 18:49:10.626890 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:10.626238 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" podUID="4e2602a1-85ca-4b15-af41-4de097fe39e7" containerName="kserve-container" containerID="cri-o://385870aa7d4e35a3dd12121c2327dbc773ea4d13d83d33b385a7843968993856" gracePeriod=30 Apr 16 18:49:10.644402 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:10.644364 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7"] Apr 16 18:49:10.644843 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:10.644826 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82e0022e-89ce-4a30-9ce0-6794d78d213d" containerName="kserve-container" Apr 16 18:49:10.644925 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:10.644845 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="82e0022e-89ce-4a30-9ce0-6794d78d213d" containerName="kserve-container" Apr 16 18:49:10.644977 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:10.644969 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="82e0022e-89ce-4a30-9ce0-6794d78d213d" containerName="kserve-container" Apr 16 18:49:10.647103 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:10.647084 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" Apr 16 18:49:10.655947 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:10.655920 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7"] Apr 16 18:49:10.657747 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:10.657725 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" Apr 16 18:49:10.791847 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:10.791822 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7"] Apr 16 18:49:10.794629 ip-10-0-128-74 kubenswrapper[2570]: W0416 18:49:10.794602 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod392b7c5d_9a36_4cd4_a37f_e03215fff699.slice/crio-b2044013daa413f47257b37d42579165f3bea9d35db181461c626af66cbd5816 WatchSource:0}: Error finding container b2044013daa413f47257b37d42579165f3bea9d35db181461c626af66cbd5816: Status 404 returned error can't find the container with id b2044013daa413f47257b37d42579165f3bea9d35db181461c626af66cbd5816 Apr 16 18:49:11.303958 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:11.303910 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" podUID="e7af7ef2-6558-4525-be3d-8a0e4ae14439" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 18:49:11.419549 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:11.419514 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" event={"ID":"392b7c5d-9a36-4cd4-a37f-e03215fff699","Type":"ContainerStarted","Data":"6d8e55ede5d5613b5dd3ae3a8d0db0b12810f028997dd17bbe4bfa2f75fd87c7"} Apr 16 18:49:11.419549 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:11.419556 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" Apr 16 18:49:11.419767 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:11.419567 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" event={"ID":"392b7c5d-9a36-4cd4-a37f-e03215fff699","Type":"ContainerStarted","Data":"b2044013daa413f47257b37d42579165f3bea9d35db181461c626af66cbd5816"} Apr 16 18:49:11.420924 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:11.420901 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" podUID="392b7c5d-9a36-4cd4-a37f-e03215fff699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:49:11.437040 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:11.436974 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" podStartSLOduration=1.436953898 podStartE2EDuration="1.436953898s" podCreationTimestamp="2026-04-16 18:49:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:49:11.435644131 +0000 UTC m=+1885.215108461" watchObservedRunningTime="2026-04-16 18:49:11.436953898 +0000 UTC m=+1885.216418229" Apr 16 18:49:12.122563 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:12.122519 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" podUID="4e2602a1-85ca-4b15-af41-4de097fe39e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" 
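Note: the recurring prober.go "Probe failed" entries above look like TCP-style readiness checks; the bare dial error in the output ("dial tcp 10.134.0.4x:8080: connect: connection refused") means the kubelet is opening a socket to the pod IP on port 8080 and nothing is listening yet, which is expected while the kserve-container is still starting. Once a dial succeeds, the matching SyncLoop (probe) status="ready" entry appears. Below is a minimal Go sketch of an equivalent check, assuming a TCP socket probe; the address is taken from the log, and the retry cadence is an illustrative stand-in for the probe's periodSeconds, not the kubelet's actual prober implementation.

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// tcpReady dials addr and reports nil once something accepts the
// connection; until then the error mirrors the probe output above,
// e.g. "dial tcp 10.134.0.42:8080: connect: connection refused".
func tcpReady(addr string, timeout time.Duration) error {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return err
	}
	return conn.Close()
}

func main() {
	addr := "10.134.0.42:8080" // pod IP:port from the probe entries above
	for attempt := 1; attempt <= 5; attempt++ {
		if err := tcpReady(addr, time.Second); err != nil {
			fmt.Printf("attempt %d: probe failed: %v\n", attempt, err)
			time.Sleep(10 * time.Second) // ~10s cadence, matching the log timestamps
			continue
		}
		fmt.Printf("attempt %d: probe succeeded; container would be marked Ready\n", attempt)
		return
	}
}
```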
Apr 16 18:49:12.423673 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:12.423579 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" podUID="392b7c5d-9a36-4cd4-a37f-e03215fff699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:49:13.975040 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:13.975014 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" Apr 16 18:49:14.431708 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:14.431676 2570 generic.go:358] "Generic (PLEG): container finished" podID="4e2602a1-85ca-4b15-af41-4de097fe39e7" containerID="385870aa7d4e35a3dd12121c2327dbc773ea4d13d83d33b385a7843968993856" exitCode=0 Apr 16 18:49:14.431881 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:14.431737 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" event={"ID":"4e2602a1-85ca-4b15-af41-4de097fe39e7","Type":"ContainerDied","Data":"385870aa7d4e35a3dd12121c2327dbc773ea4d13d83d33b385a7843968993856"} Apr 16 18:49:14.431881 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:14.431763 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" event={"ID":"4e2602a1-85ca-4b15-af41-4de097fe39e7","Type":"ContainerDied","Data":"cd4e9bd8fdfc18020ff469b4628aaef0f3e15ad32ad469068b3587307b2930d5"} Apr 16 18:49:14.431881 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:14.431779 2570 scope.go:117] "RemoveContainer" containerID="385870aa7d4e35a3dd12121c2327dbc773ea4d13d83d33b385a7843968993856" Apr 16 18:49:14.431881 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:14.431738 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w" Apr 16 18:49:14.440760 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:14.440739 2570 scope.go:117] "RemoveContainer" containerID="385870aa7d4e35a3dd12121c2327dbc773ea4d13d83d33b385a7843968993856" Apr 16 18:49:14.441102 ip-10-0-128-74 kubenswrapper[2570]: E0416 18:49:14.441082 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"385870aa7d4e35a3dd12121c2327dbc773ea4d13d83d33b385a7843968993856\": container with ID starting with 385870aa7d4e35a3dd12121c2327dbc773ea4d13d83d33b385a7843968993856 not found: ID does not exist" containerID="385870aa7d4e35a3dd12121c2327dbc773ea4d13d83d33b385a7843968993856" Apr 16 18:49:14.441175 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:14.441110 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385870aa7d4e35a3dd12121c2327dbc773ea4d13d83d33b385a7843968993856"} err="failed to get container status \"385870aa7d4e35a3dd12121c2327dbc773ea4d13d83d33b385a7843968993856\": rpc error: code = NotFound desc = could not find container \"385870aa7d4e35a3dd12121c2327dbc773ea4d13d83d33b385a7843968993856\": container with ID starting with 385870aa7d4e35a3dd12121c2327dbc773ea4d13d83d33b385a7843968993856 not found: ID does not exist" Apr 16 18:49:14.458243 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:14.458208 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w"] Apr 16 18:49:14.462749 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:14.462724 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ebde8-predictor-97d45866-x4f8w"] Apr 16 18:49:14.840341 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:14.840281 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e2602a1-85ca-4b15-af41-4de097fe39e7" path="/var/lib/kubelet/pods/4e2602a1-85ca-4b15-af41-4de097fe39e7/volumes" Apr 16 18:49:21.304038 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:21.303980 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" podUID="e7af7ef2-6558-4525-be3d-8a0e4ae14439" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 18:49:22.424195 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:22.424147 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" podUID="392b7c5d-9a36-4cd4-a37f-e03215fff699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:49:31.305171 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:31.305139 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" Apr 16 18:49:32.424197 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:32.424144 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" podUID="392b7c5d-9a36-4cd4-a37f-e03215fff699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:49:42.423810 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:42.423760 2570 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" podUID="392b7c5d-9a36-4cd4-a37f-e03215fff699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:49:52.423682 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:49:52.423631 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" podUID="392b7c5d-9a36-4cd4-a37f-e03215fff699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:50:02.424898 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:50:02.424869 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" Apr 16 18:52:46.883639 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:52:46.883615 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:52:46.889096 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:52:46.889078 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:57:46.909338 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:57:46.909311 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:57:46.914269 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:57:46.914254 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 18:58:35.457798 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:58:35.457761 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7"] Apr 16 18:58:35.458275 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:58:35.458044 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" podUID="392b7c5d-9a36-4cd4-a37f-e03215fff699" containerName="kserve-container" containerID="cri-o://6d8e55ede5d5613b5dd3ae3a8d0db0b12810f028997dd17bbe4bfa2f75fd87c7" gracePeriod=30 Apr 16 18:58:38.382946 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:58:38.382905 2570 generic.go:358] "Generic (PLEG): container finished" podID="392b7c5d-9a36-4cd4-a37f-e03215fff699" containerID="6d8e55ede5d5613b5dd3ae3a8d0db0b12810f028997dd17bbe4bfa2f75fd87c7" exitCode=0 Apr 16 18:58:38.383348 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:58:38.382969 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" event={"ID":"392b7c5d-9a36-4cd4-a37f-e03215fff699","Type":"ContainerDied","Data":"6d8e55ede5d5613b5dd3ae3a8d0db0b12810f028997dd17bbe4bfa2f75fd87c7"} Apr 16 18:58:38.501087 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:58:38.501045 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" Apr 16 18:58:39.387197 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:58:39.387171 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" Apr 16 18:58:39.387574 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:58:39.387170 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7" event={"ID":"392b7c5d-9a36-4cd4-a37f-e03215fff699","Type":"ContainerDied","Data":"b2044013daa413f47257b37d42579165f3bea9d35db181461c626af66cbd5816"} Apr 16 18:58:39.387574 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:58:39.387286 2570 scope.go:117] "RemoveContainer" containerID="6d8e55ede5d5613b5dd3ae3a8d0db0b12810f028997dd17bbe4bfa2f75fd87c7" Apr 16 18:58:39.406870 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:58:39.406845 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7"] Apr 16 18:58:39.411125 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:58:39.411107 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-56ec1-predictor-f476465cd-bqvv7"] Apr 16 18:58:40.838493 ip-10-0-128-74 kubenswrapper[2570]: I0416 18:58:40.838452 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392b7c5d-9a36-4cd4-a37f-e03215fff699" path="/var/lib/kubelet/pods/392b7c5d-9a36-4cd4-a37f-e03215fff699/volumes" Apr 16 19:02:46.939713 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:02:46.939600 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 19:02:46.947484 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:02:46.947465 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 19:06:09.435503 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:09.435466 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff"] Apr 16 19:06:09.436000 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:09.435695 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" podUID="e7af7ef2-6558-4525-be3d-8a0e4ae14439" containerName="kserve-container" containerID="cri-o://abaac0189ec3798d4c2afa2e69a1aacf3cbc9a470716aab3f9ca01d366ec4248" gracePeriod=30 Apr 16 19:06:11.303919 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:11.303875 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" podUID="e7af7ef2-6558-4525-be3d-8a0e4ae14439" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 19:06:12.483320 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:12.483298 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" Apr 16 19:06:12.926659 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:12.926624 2570 generic.go:358] "Generic (PLEG): container finished" podID="e7af7ef2-6558-4525-be3d-8a0e4ae14439" containerID="abaac0189ec3798d4c2afa2e69a1aacf3cbc9a470716aab3f9ca01d366ec4248" exitCode=0 Apr 16 19:06:12.926842 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:12.926684 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" Apr 16 19:06:12.926842 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:12.926704 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" event={"ID":"e7af7ef2-6558-4525-be3d-8a0e4ae14439","Type":"ContainerDied","Data":"abaac0189ec3798d4c2afa2e69a1aacf3cbc9a470716aab3f9ca01d366ec4248"} Apr 16 19:06:12.926842 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:12.926742 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff" event={"ID":"e7af7ef2-6558-4525-be3d-8a0e4ae14439","Type":"ContainerDied","Data":"64c9c5b10156da014269e1fb97c0d307d4972ca3d548db0328a6d18672bd84ed"} Apr 16 19:06:12.926842 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:12.926761 2570 scope.go:117] "RemoveContainer" containerID="abaac0189ec3798d4c2afa2e69a1aacf3cbc9a470716aab3f9ca01d366ec4248" Apr 16 19:06:12.934834 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:12.934821 2570 scope.go:117] "RemoveContainer" containerID="abaac0189ec3798d4c2afa2e69a1aacf3cbc9a470716aab3f9ca01d366ec4248" Apr 16 19:06:12.935078 ip-10-0-128-74 kubenswrapper[2570]: E0416 19:06:12.935035 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abaac0189ec3798d4c2afa2e69a1aacf3cbc9a470716aab3f9ca01d366ec4248\": container with ID starting with abaac0189ec3798d4c2afa2e69a1aacf3cbc9a470716aab3f9ca01d366ec4248 not found: ID does not exist" containerID="abaac0189ec3798d4c2afa2e69a1aacf3cbc9a470716aab3f9ca01d366ec4248" Apr 16 19:06:12.935165 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:12.935075 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abaac0189ec3798d4c2afa2e69a1aacf3cbc9a470716aab3f9ca01d366ec4248"} err="failed to get container status \"abaac0189ec3798d4c2afa2e69a1aacf3cbc9a470716aab3f9ca01d366ec4248\": rpc error: code = NotFound desc = could not find container \"abaac0189ec3798d4c2afa2e69a1aacf3cbc9a470716aab3f9ca01d366ec4248\": container with ID starting with abaac0189ec3798d4c2afa2e69a1aacf3cbc9a470716aab3f9ca01d366ec4248 not found: ID does not exist" Apr 16 19:06:12.944471 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:12.944447 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff"] Apr 16 19:06:12.948243 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:12.948223 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-979d7-predictor-79bb6798c8-ssnff"] Apr 16 19:06:14.838233 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:14.838200 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7af7ef2-6558-4525-be3d-8a0e4ae14439" path="/var/lib/kubelet/pods/e7af7ef2-6558-4525-be3d-8a0e4ae14439/volumes" Apr 16 19:06:35.282321 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.282289 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6xqvt/must-gather-zkzdq"] Apr 16 19:06:35.282780 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.282648 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7af7ef2-6558-4525-be3d-8a0e4ae14439" containerName="kserve-container" Apr 16 19:06:35.282780 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.282660 2570 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e7af7ef2-6558-4525-be3d-8a0e4ae14439" containerName="kserve-container" Apr 16 19:06:35.282780 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.282675 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="392b7c5d-9a36-4cd4-a37f-e03215fff699" containerName="kserve-container" Apr 16 19:06:35.282780 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.282680 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="392b7c5d-9a36-4cd4-a37f-e03215fff699" containerName="kserve-container" Apr 16 19:06:35.282780 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.282693 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e2602a1-85ca-4b15-af41-4de097fe39e7" containerName="kserve-container" Apr 16 19:06:35.282780 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.282699 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2602a1-85ca-4b15-af41-4de097fe39e7" containerName="kserve-container" Apr 16 19:06:35.282780 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.282758 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="392b7c5d-9a36-4cd4-a37f-e03215fff699" containerName="kserve-container" Apr 16 19:06:35.282780 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.282766 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e2602a1-85ca-4b15-af41-4de097fe39e7" containerName="kserve-container" Apr 16 19:06:35.282780 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.282773 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7af7ef2-6558-4525-be3d-8a0e4ae14439" containerName="kserve-container" Apr 16 19:06:35.285851 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.285833 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6xqvt/must-gather-zkzdq" Apr 16 19:06:35.290455 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.290433 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6xqvt\"/\"default-dockercfg-4xt97\"" Apr 16 19:06:35.290584 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.290433 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6xqvt\"/\"openshift-service-ca.crt\"" Apr 16 19:06:35.291422 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.291402 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6xqvt\"/\"kube-root-ca.crt\"" Apr 16 19:06:35.297368 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.297340 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6xqvt/must-gather-zkzdq"] Apr 16 19:06:35.415314 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.415280 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c98ac26d-a8de-4ee5-9232-b4f0ef958da9-must-gather-output\") pod \"must-gather-zkzdq\" (UID: \"c98ac26d-a8de-4ee5-9232-b4f0ef958da9\") " pod="openshift-must-gather-6xqvt/must-gather-zkzdq" Apr 16 19:06:35.415512 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.415338 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4mzg\" (UniqueName: \"kubernetes.io/projected/c98ac26d-a8de-4ee5-9232-b4f0ef958da9-kube-api-access-r4mzg\") pod \"must-gather-zkzdq\" (UID: \"c98ac26d-a8de-4ee5-9232-b4f0ef958da9\") " 
pod="openshift-must-gather-6xqvt/must-gather-zkzdq" Apr 16 19:06:35.516095 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.516037 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c98ac26d-a8de-4ee5-9232-b4f0ef958da9-must-gather-output\") pod \"must-gather-zkzdq\" (UID: \"c98ac26d-a8de-4ee5-9232-b4f0ef958da9\") " pod="openshift-must-gather-6xqvt/must-gather-zkzdq" Apr 16 19:06:35.516253 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.516121 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4mzg\" (UniqueName: \"kubernetes.io/projected/c98ac26d-a8de-4ee5-9232-b4f0ef958da9-kube-api-access-r4mzg\") pod \"must-gather-zkzdq\" (UID: \"c98ac26d-a8de-4ee5-9232-b4f0ef958da9\") " pod="openshift-must-gather-6xqvt/must-gather-zkzdq" Apr 16 19:06:35.516394 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.516374 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c98ac26d-a8de-4ee5-9232-b4f0ef958da9-must-gather-output\") pod \"must-gather-zkzdq\" (UID: \"c98ac26d-a8de-4ee5-9232-b4f0ef958da9\") " pod="openshift-must-gather-6xqvt/must-gather-zkzdq" Apr 16 19:06:35.527357 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.527330 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4mzg\" (UniqueName: \"kubernetes.io/projected/c98ac26d-a8de-4ee5-9232-b4f0ef958da9-kube-api-access-r4mzg\") pod \"must-gather-zkzdq\" (UID: \"c98ac26d-a8de-4ee5-9232-b4f0ef958da9\") " pod="openshift-must-gather-6xqvt/must-gather-zkzdq" Apr 16 19:06:35.603951 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.603868 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6xqvt/must-gather-zkzdq" Apr 16 19:06:35.725890 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.725861 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6xqvt/must-gather-zkzdq"] Apr 16 19:06:35.727765 ip-10-0-128-74 kubenswrapper[2570]: W0416 19:06:35.727725 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc98ac26d_a8de_4ee5_9232_b4f0ef958da9.slice/crio-53a735ca52fed19c26b1adf1b50ad2541e922178ed4b1cb274eabfaec82e694d WatchSource:0}: Error finding container 53a735ca52fed19c26b1adf1b50ad2541e922178ed4b1cb274eabfaec82e694d: Status 404 returned error can't find the container with id 53a735ca52fed19c26b1adf1b50ad2541e922178ed4b1cb274eabfaec82e694d Apr 16 19:06:35.729618 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:35.729603 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:06:36.006354 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:36.006266 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqvt/must-gather-zkzdq" event={"ID":"c98ac26d-a8de-4ee5-9232-b4f0ef958da9","Type":"ContainerStarted","Data":"53a735ca52fed19c26b1adf1b50ad2541e922178ed4b1cb274eabfaec82e694d"} Apr 16 19:06:37.011911 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:37.011858 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqvt/must-gather-zkzdq" event={"ID":"c98ac26d-a8de-4ee5-9232-b4f0ef958da9","Type":"ContainerStarted","Data":"1e290d4f4670478cb314959a7aaafc97b47facd9e2efb173dee87255f689ea40"} Apr 16 19:06:37.011911 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:37.011905 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqvt/must-gather-zkzdq" event={"ID":"c98ac26d-a8de-4ee5-9232-b4f0ef958da9","Type":"ContainerStarted","Data":"b105304eeb71feb3eb6b39f3efab99df124b9714b6ead30eadc5c6c2964e1137"} Apr 16 19:06:37.033458 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:37.033399 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6xqvt/must-gather-zkzdq" podStartSLOduration=1.139885562 podStartE2EDuration="2.033379554s" podCreationTimestamp="2026-04-16 19:06:35 +0000 UTC" firstStartedPulling="2026-04-16 19:06:35.729720256 +0000 UTC m=+2929.509184565" lastFinishedPulling="2026-04-16 19:06:36.623214245 +0000 UTC m=+2930.402678557" observedRunningTime="2026-04-16 19:06:37.030351149 +0000 UTC m=+2930.809815493" watchObservedRunningTime="2026-04-16 19:06:37.033379554 +0000 UTC m=+2930.812843886" Apr 16 19:06:38.138742 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:38.138712 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-l5lq6_90529082-bc95-49c6-a8ff-05a611805241/global-pull-secret-syncer/0.log" Apr 16 19:06:38.294793 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:38.294763 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rlk99_0aa834c9-7b5e-44dc-a706-cf8d7ff11391/konnectivity-agent/0.log" Apr 16 19:06:38.364357 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:38.364328 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-74.ec2.internal_4d5b4fccf018b349362d7b27ad7bd6e5/haproxy/0.log" Apr 16 19:06:41.624666 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:41.624567 2570 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_50d4252d-1296-4c50-8912-925ab6d41f3c/alertmanager/0.log" Apr 16 19:06:41.678815 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:41.678788 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_50d4252d-1296-4c50-8912-925ab6d41f3c/config-reloader/0.log" Apr 16 19:06:41.708339 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:41.708311 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_50d4252d-1296-4c50-8912-925ab6d41f3c/kube-rbac-proxy-web/0.log" Apr 16 19:06:41.736451 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:41.736424 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_50d4252d-1296-4c50-8912-925ab6d41f3c/kube-rbac-proxy/0.log" Apr 16 19:06:41.767089 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:41.767044 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_50d4252d-1296-4c50-8912-925ab6d41f3c/kube-rbac-proxy-metric/0.log" Apr 16 19:06:41.794880 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:41.794855 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_50d4252d-1296-4c50-8912-925ab6d41f3c/prom-label-proxy/0.log" Apr 16 19:06:41.823835 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:41.823806 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_50d4252d-1296-4c50-8912-925ab6d41f3c/init-config-reloader/0.log" Apr 16 19:06:41.882582 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:41.882507 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-s6xhl_ca4f330e-8728-4c07-ab6d-127e7f77538c/cluster-monitoring-operator/0.log" Apr 16 19:06:42.022799 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.022769 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-ckt6k_9db2e2f6-0f7a-432a-b3c2-72b5f4e3be36/monitoring-plugin/0.log" Apr 16 19:06:42.139912 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.139829 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7cfsz_027aad74-c11c-4a49-8925-52c728463d0f/node-exporter/0.log" Apr 16 19:06:42.165416 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.165392 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7cfsz_027aad74-c11c-4a49-8925-52c728463d0f/kube-rbac-proxy/0.log" Apr 16 19:06:42.191538 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.191513 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7cfsz_027aad74-c11c-4a49-8925-52c728463d0f/init-textfile/0.log" Apr 16 19:06:42.307172 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.307135 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-9f44m_0432dce1-45c7-4680-9444-34c004ae03cb/kube-rbac-proxy-main/0.log" Apr 16 19:06:42.336998 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.336968 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-9f44m_0432dce1-45c7-4680-9444-34c004ae03cb/kube-rbac-proxy-self/0.log" Apr 16 19:06:42.364558 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.364526 2570 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-9f44m_0432dce1-45c7-4680-9444-34c004ae03cb/openshift-state-metrics/0.log" Apr 16 19:06:42.654761 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.654733 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-mh224_a8800c73-8457-4de9-8473-aa5b62d40811/prometheus-operator-admission-webhook/0.log" Apr 16 19:06:42.708288 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.708264 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-fdc9b6c58-ll8c7_ae7b62ca-8697-4148-bd62-b26981e1514f/telemeter-client/0.log" Apr 16 19:06:42.749864 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.749840 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-fdc9b6c58-ll8c7_ae7b62ca-8697-4148-bd62-b26981e1514f/reload/0.log" Apr 16 19:06:42.787444 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.787413 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-fdc9b6c58-ll8c7_ae7b62ca-8697-4148-bd62-b26981e1514f/kube-rbac-proxy/0.log" Apr 16 19:06:42.834049 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.834026 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5979894575-mh9cx_620fdae5-a00a-4876-8ec2-3b53d29cdfcb/thanos-query/0.log" Apr 16 19:06:42.876638 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.876599 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5979894575-mh9cx_620fdae5-a00a-4876-8ec2-3b53d29cdfcb/kube-rbac-proxy-web/0.log" Apr 16 19:06:42.909932 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.909906 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5979894575-mh9cx_620fdae5-a00a-4876-8ec2-3b53d29cdfcb/kube-rbac-proxy/0.log" Apr 16 19:06:42.943064 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.943019 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5979894575-mh9cx_620fdae5-a00a-4876-8ec2-3b53d29cdfcb/prom-label-proxy/0.log" Apr 16 19:06:42.971738 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.971704 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5979894575-mh9cx_620fdae5-a00a-4876-8ec2-3b53d29cdfcb/kube-rbac-proxy-rules/0.log" Apr 16 19:06:42.999868 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:42.999838 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5979894575-mh9cx_620fdae5-a00a-4876-8ec2-3b53d29cdfcb/kube-rbac-proxy-metrics/0.log" Apr 16 19:06:44.661737 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.661706 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp"] Apr 16 19:06:44.665443 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.665422 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.673186 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.672753 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp"] Apr 16 19:06:44.807363 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.807331 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/70fa2f9d-231f-4c3e-b613-98004d6005bd-proc\") pod \"perf-node-gather-daemonset-dhmsp\" (UID: \"70fa2f9d-231f-4c3e-b613-98004d6005bd\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.807590 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.807375 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/70fa2f9d-231f-4c3e-b613-98004d6005bd-sys\") pod \"perf-node-gather-daemonset-dhmsp\" (UID: \"70fa2f9d-231f-4c3e-b613-98004d6005bd\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.807590 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.807392 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/70fa2f9d-231f-4c3e-b613-98004d6005bd-podres\") pod \"perf-node-gather-daemonset-dhmsp\" (UID: \"70fa2f9d-231f-4c3e-b613-98004d6005bd\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.807590 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.807517 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhpv7\" (UniqueName: \"kubernetes.io/projected/70fa2f9d-231f-4c3e-b613-98004d6005bd-kube-api-access-xhpv7\") pod \"perf-node-gather-daemonset-dhmsp\" (UID: \"70fa2f9d-231f-4c3e-b613-98004d6005bd\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.807717 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.807602 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/70fa2f9d-231f-4c3e-b613-98004d6005bd-lib-modules\") pod \"perf-node-gather-daemonset-dhmsp\" (UID: \"70fa2f9d-231f-4c3e-b613-98004d6005bd\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.877752 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.877719 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-654d547795-qzsm4_00fb3776-7d26-471c-aeee-7e3b6923b9a9/console/0.log" Apr 16 19:06:44.908087 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.908044 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhpv7\" (UniqueName: \"kubernetes.io/projected/70fa2f9d-231f-4c3e-b613-98004d6005bd-kube-api-access-xhpv7\") pod \"perf-node-gather-daemonset-dhmsp\" (UID: \"70fa2f9d-231f-4c3e-b613-98004d6005bd\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.908273 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.908118 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/70fa2f9d-231f-4c3e-b613-98004d6005bd-lib-modules\") pod \"perf-node-gather-daemonset-dhmsp\" (UID: \"70fa2f9d-231f-4c3e-b613-98004d6005bd\") " 
pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.908273 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.908198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/70fa2f9d-231f-4c3e-b613-98004d6005bd-proc\") pod \"perf-node-gather-daemonset-dhmsp\" (UID: \"70fa2f9d-231f-4c3e-b613-98004d6005bd\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.908273 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.908240 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/70fa2f9d-231f-4c3e-b613-98004d6005bd-sys\") pod \"perf-node-gather-daemonset-dhmsp\" (UID: \"70fa2f9d-231f-4c3e-b613-98004d6005bd\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.908273 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.908268 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/70fa2f9d-231f-4c3e-b613-98004d6005bd-podres\") pod \"perf-node-gather-daemonset-dhmsp\" (UID: \"70fa2f9d-231f-4c3e-b613-98004d6005bd\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.908552 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.908345 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/70fa2f9d-231f-4c3e-b613-98004d6005bd-lib-modules\") pod \"perf-node-gather-daemonset-dhmsp\" (UID: \"70fa2f9d-231f-4c3e-b613-98004d6005bd\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.908552 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.908411 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/70fa2f9d-231f-4c3e-b613-98004d6005bd-podres\") pod \"perf-node-gather-daemonset-dhmsp\" (UID: \"70fa2f9d-231f-4c3e-b613-98004d6005bd\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.908552 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.908436 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/70fa2f9d-231f-4c3e-b613-98004d6005bd-proc\") pod \"perf-node-gather-daemonset-dhmsp\" (UID: \"70fa2f9d-231f-4c3e-b613-98004d6005bd\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.908552 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.908462 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/70fa2f9d-231f-4c3e-b613-98004d6005bd-sys\") pod \"perf-node-gather-daemonset-dhmsp\" (UID: \"70fa2f9d-231f-4c3e-b613-98004d6005bd\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.919597 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.919524 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhpv7\" (UniqueName: \"kubernetes.io/projected/70fa2f9d-231f-4c3e-b613-98004d6005bd-kube-api-access-xhpv7\") pod \"perf-node-gather-daemonset-dhmsp\" (UID: \"70fa2f9d-231f-4c3e-b613-98004d6005bd\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:44.922040 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.922016 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_downloads-586b57c7b4-p2sjs_b1579b0c-9f23-4da7-b7d8-a42454fa0e06/download-server/0.log" Apr 16 19:06:44.977134 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:44.977100 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:45.110424 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:45.110399 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp"] Apr 16 19:06:45.112172 ip-10-0-128-74 kubenswrapper[2570]: W0416 19:06:45.112143 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod70fa2f9d_231f_4c3e_b613_98004d6005bd.slice/crio-a40ae38001a05dd2f1fb71ea0940c3d58d3ef30e638cfdf0af8f3706227fc11d WatchSource:0}: Error finding container a40ae38001a05dd2f1fb71ea0940c3d58d3ef30e638cfdf0af8f3706227fc11d: Status 404 returned error can't find the container with id a40ae38001a05dd2f1fb71ea0940c3d58d3ef30e638cfdf0af8f3706227fc11d Apr 16 19:06:46.026659 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:46.026614 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-brklz_67ddeef6-939c-4d8e-83ee-0673f748cf12/dns/0.log" Apr 16 19:06:46.050726 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:46.050695 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" event={"ID":"70fa2f9d-231f-4c3e-b613-98004d6005bd","Type":"ContainerStarted","Data":"2bbc87a98912cf2b7eef9bc0728dc7565fec21e5b0cf64df2c67a2012a280e31"} Apr 16 19:06:46.050870 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:46.050735 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" event={"ID":"70fa2f9d-231f-4c3e-b613-98004d6005bd","Type":"ContainerStarted","Data":"a40ae38001a05dd2f1fb71ea0940c3d58d3ef30e638cfdf0af8f3706227fc11d"} Apr 16 19:06:46.050870 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:46.050833 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:46.053228 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:46.053206 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-brklz_67ddeef6-939c-4d8e-83ee-0673f748cf12/kube-rbac-proxy/0.log" Apr 16 19:06:46.072382 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:46.072342 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" podStartSLOduration=2.072329895 podStartE2EDuration="2.072329895s" podCreationTimestamp="2026-04-16 19:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:06:46.071227749 +0000 UTC m=+2939.850692102" watchObservedRunningTime="2026-04-16 19:06:46.072329895 +0000 UTC m=+2939.851794224" Apr 16 19:06:46.202599 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:46.202559 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2h4fb_4735317d-b557-4ca9-84cd-02f72096e33a/dns-node-resolver/0.log" Apr 16 19:06:46.767413 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:46.767386 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-s2j9l_a9622aca-ffc8-4b50-82e0-a1c82e6222df/node-ca/0.log" Apr 16 19:06:47.625363 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:47.625336 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-54596cf866-vm76c_f9efc8b2-a298-4a79-a57a-811175327ee2/router/0.log" Apr 16 19:06:48.022200 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:48.022167 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-v2f4d_935e77e2-8cb8-4a46-ac22-24ad0a5b649a/serve-healthcheck-canary/0.log" Apr 16 19:06:48.470721 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:48.470649 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bzcpl_0737d249-a705-41ad-b1ee-04446e7bdfce/kube-rbac-proxy/0.log" Apr 16 19:06:48.493781 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:48.493751 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bzcpl_0737d249-a705-41ad-b1ee-04446e7bdfce/exporter/0.log" Apr 16 19:06:48.520171 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:48.520145 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bzcpl_0737d249-a705-41ad-b1ee-04446e7bdfce/extractor/0.log" Apr 16 19:06:50.733990 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:50.733960 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-vdng7_119dc07f-5f1f-4102-87c6-4a6a342dea03/manager/0.log" Apr 16 19:06:50.785203 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:50.785175 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-4vnfs_2d0e858e-6c95-497e-b9ec-1101bf4152d5/server/0.log" Apr 16 19:06:51.290099 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:51.290069 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-dhh97_173854b1-651f-42a8-9473-ea555cff0ced/seaweedfs/0.log" Apr 16 19:06:52.063644 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:52.063619 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-dhmsp" Apr 16 19:06:55.500266 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:55.500234 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-5nbn9_b117bdfd-44c4-486e-ae7f-b512781456f8/migrator/0.log" Apr 16 19:06:55.527149 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:55.527120 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-5nbn9_b117bdfd-44c4-486e-ae7f-b512781456f8/graceful-termination/0.log" Apr 16 19:06:55.900560 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:55.900529 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-wk6td_5fa9d23f-acec-46e2-b6bc-3203fdd2764d/kube-storage-version-migrator-operator/1.log" Apr 16 19:06:55.902332 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:55.902298 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-wk6td_5fa9d23f-acec-46e2-b6bc-3203fdd2764d/kube-storage-version-migrator-operator/0.log" Apr 16 19:06:56.824123 
ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:56.824092 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6cmxs_1b45981e-9576-4b1b-b941-35f68d109c84/kube-multus/0.log" Apr 16 19:06:57.216701 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:57.216632 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qzpw4_a3821f1e-3cf4-4526-9175-97c1251899f2/kube-multus-additional-cni-plugins/0.log" Apr 16 19:06:57.241589 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:57.241564 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qzpw4_a3821f1e-3cf4-4526-9175-97c1251899f2/egress-router-binary-copy/0.log" Apr 16 19:06:57.267765 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:57.267742 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qzpw4_a3821f1e-3cf4-4526-9175-97c1251899f2/cni-plugins/0.log" Apr 16 19:06:57.292083 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:57.292039 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qzpw4_a3821f1e-3cf4-4526-9175-97c1251899f2/bond-cni-plugin/0.log" Apr 16 19:06:57.318638 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:57.318609 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qzpw4_a3821f1e-3cf4-4526-9175-97c1251899f2/routeoverride-cni/0.log" Apr 16 19:06:57.344797 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:57.344766 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qzpw4_a3821f1e-3cf4-4526-9175-97c1251899f2/whereabouts-cni-bincopy/0.log" Apr 16 19:06:57.370764 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:57.370742 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qzpw4_a3821f1e-3cf4-4526-9175-97c1251899f2/whereabouts-cni/0.log" Apr 16 19:06:57.433045 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:57.433012 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dvxrp_edeb92c2-9fa4-40ae-bb1a-a24372d25c5e/network-metrics-daemon/0.log" Apr 16 19:06:57.453523 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:57.453498 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dvxrp_edeb92c2-9fa4-40ae-bb1a-a24372d25c5e/kube-rbac-proxy/0.log" Apr 16 19:06:59.115410 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:59.115380 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-controller/0.log" Apr 16 19:06:59.135506 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:59.135470 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/0.log" Apr 16 19:06:59.162035 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:59.162011 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovn-acl-logging/1.log" Apr 16 19:06:59.188275 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:59.188253 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/kube-rbac-proxy-node/0.log" Apr 16 19:06:59.213457 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:59.213433 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 19:06:59.232132 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:59.232105 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/northd/0.log" Apr 16 19:06:59.255760 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:59.255741 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/nbdb/0.log" Apr 16 19:06:59.280120 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:59.280095 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/sbdb/0.log" Apr 16 19:06:59.455195 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:06:59.455125 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zps8z_8cc82835-e3e6-46d3-8f2f-ead7027b1b91/ovnkube-controller/0.log" Apr 16 19:07:00.480932 ip-10-0-128-74 kubenswrapper[2570]: I0416 19:07:00.480903 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-m54zx_ce22102c-2dd2-4a4f-8317-5733e81186d1/network-check-target-container/0.log"