Apr 16 19:53:55.467926 ip-10-0-139-205 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:53:55.898735 ip-10-0-139-205 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:55.898735 ip-10-0-139-205 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:53:55.898735 ip-10-0-139-205 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:55.898735 ip-10-0-139-205 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:53:55.898735 ip-10-0-139-205 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:55.900311 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.900214    2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:53:55.902618 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902599    2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:55.902618 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902617    2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:55.902618 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902620    2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902626    2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902631    2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902635    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902639    2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902642    2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902645    2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902648    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902651    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902654    2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902657    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902660    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902662    2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902666    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902669    2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902671    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902674    2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902677    2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902680    2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:55.902716 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902682    2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902686    2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902690    2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902692    2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902695    2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902698    2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902701    2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902703    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902706    2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902708    2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902711    2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902714    2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902716    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902719    2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902722    2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902724    2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902728    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902730    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902733    2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902735    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:55.903188 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902738    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902740    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902743    2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902745    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902748    2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902751    2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902754    2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902757    2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902759    2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902761    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902764    2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902766    2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902769    2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902772    2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902776    2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902779    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902783    2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902787    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902790    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:55.903698 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902792    2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902795    2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902798    2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902801    2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902803    2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902806    2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902809    2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902811    2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902814    2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902816    2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902819    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902821    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902824    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902827    2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902830    2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902832    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902835    2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902837    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902840    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902842    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:55.904183 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902845    2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902847    2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902851    2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902854    2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902856    2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.902859    2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904283    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904292    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904295    2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904298    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904301    2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904304    2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904307    2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904310    2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904313    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904315    2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904318    2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904320    2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904323    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904326    2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:55.904704 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904329    2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904331    2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904333    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904336    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904339    2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904342    2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904345    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904347    2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904350    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904352    2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904355    2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904358    2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904360    2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904363    2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904367    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904370    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904372    2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904375    2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904378    2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:55.905234 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904380    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904383    2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904386    2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904389    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904391    2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904395    2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904399    2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904402    2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904405    2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904408    2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904411    2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904413    2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904416    2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904418    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904421    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904423    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904426    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904428    2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904431    2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:55.905736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904434    2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904439    2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904442    2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904445    2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904448    2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904451    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904453    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904456    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904459    2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904462    2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904464    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904467    2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904470    2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904473    2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904475    2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904478    2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904481    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904483    2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904486    2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904488    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:55.906215 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904491    2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904495    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904497    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904500    2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904502    2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904505    2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904507    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904510    2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904512    2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904515    2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904518    2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904520    2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904523    2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.904526    2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905868    2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905880    2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905886    2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905891    2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905896    2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905900    2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905906    2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:53:55.906721 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905911    2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905915    2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905918    2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905921    2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905925    2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905928    2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905931    2569 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905934    2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905937    2569 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905940    2569 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905943    2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905946    2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905951    2569 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905954    2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905957    2569 flags.go:64] FLAG: --config-dir=""
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905960    2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905963    2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905968    2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905971    2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905974    2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905978    2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905981    2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905985    2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905988    2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905991    2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:53:55.907246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.905995    2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906000    2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906003    2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906006    2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906008    2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906012    2569 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906017    2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906021    2569 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906025    2569 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906028    2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906031    2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906034    2569 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906038    2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906041    2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906045    2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906048    2569 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906051    2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906054    2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 19:53:55.907894 ip-10-0-139-205
kubenswrapper[2569]: I0416 19:53:55.906057 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906060 2569 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906064 2569 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906067 2569 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906070 2569 flags.go:64] FLAG: --feature-gates="" Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906074 2569 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906077 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 19:53:55.907894 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906081 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906085 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906088 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906091 2569 flags.go:64] FLAG: --help="false" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906095 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-139-205.ec2.internal" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906098 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906102 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906105 2569 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906109 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906112 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906115 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906118 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906121 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906125 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906129 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906132 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906135 2569 flags.go:64] FLAG: --kube-reserved="" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906138 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906141 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906144 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906147 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:53:55.908650 ip-10-0-139-205 
kubenswrapper[2569]: I0416 19:53:55.906151 2569 flags.go:64] FLAG: --lock-file="" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906154 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906157 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:53:55.908650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906161 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906166 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906169 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906172 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906175 2569 flags.go:64] FLAG: --logging-format="text" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906178 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906182 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906185 2569 flags.go:64] FLAG: --manifest-url="" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906188 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906193 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906196 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906200 2569 flags.go:64] FLAG: --max-pods="110" Apr 16 
19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906203 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906206 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906209 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906212 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906216 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906220 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906223 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906232 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906236 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906239 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906242 2569 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:53:55.909232 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906246 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906253 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 
19:53:55.906256 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906259 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906262 2569 flags.go:64] FLAG: --port="10250" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906265 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906268 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e9b89f8434ac1c48" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906272 2569 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906275 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906278 2569 flags.go:64] FLAG: --register-node="true" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906281 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906284 2569 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906288 2569 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906291 2569 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906294 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906296 2569 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906300 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906304 2569 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906307 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906310 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906313 2569 flags.go:64] FLAG: --runonce="false" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906316 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906319 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906322 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906325 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906328 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:53:55.909803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906331 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906335 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906338 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906341 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906344 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906347 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 
19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906350 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906353 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906356 2569 flags.go:64] FLAG: --system-cgroups="" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906359 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906365 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906368 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906371 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906376 2569 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906379 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906381 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906384 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906387 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906390 2569 flags.go:64] FLAG: --v="2" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906399 2569 flags.go:64] FLAG: --version="false" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906403 2569 flags.go:64] FLAG: --vmodule="" 
Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906408 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.906411 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906521 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906525 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:53:55.910490 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906528 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906531 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906534 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906537 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906541 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906543 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906546 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906549 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906552 2569 feature_gate.go:328] 
unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906554 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906557 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906560 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906563 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906565 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906568 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906571 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906585 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906588 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906590 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906593 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:53:55.911112 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906595 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906599 2569 
feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906602 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906605 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906607 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906610 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906612 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906615 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906618 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906620 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906623 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906626 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906628 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906631 2569 feature_gate.go:328] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906634 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906637 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906639 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906643 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906646 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:53:55.911641 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906650 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906653 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906656 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906659 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906663 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906667 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906670 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906673 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906676 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906678 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906681 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906684 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906687 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906690 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906693 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906695 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906698 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: 
W0416 19:53:55.906701 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906704 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:53:55.912126 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906706 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906709 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906711 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906714 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906717 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906719 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906722 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906725 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906731 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906734 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906737 2569 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906740 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906743 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906745 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906751 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906754 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906757 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906760 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906763 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:53:55.912630 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906765 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:53:55.913405 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906768 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:53:55.913405 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906770 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:53:55.913405 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906773 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:53:55.913405 ip-10-0-139-205 kubenswrapper[2569]: W0416 
19:53:55.906776 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:55.913405 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906778 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:55.913405 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.906781 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:55.913405 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.907330 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:55.915002 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.914977 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 19:53:55.915002 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.915004 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915080 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915089 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915093 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915098 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915103 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915108 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915112 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915118 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915123 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915128 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915132 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915137 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915142 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915146 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915153 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:55.915153 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915161 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915165 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915170 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915174 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915178 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915183 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915187 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915191 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915195 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915199 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915204 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915208 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915212 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915216 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915220 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915224 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915230 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915234 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915238 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915242 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:55.915925 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915247 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915251 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915255 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915259 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915263 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915267 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915271 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915275 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915279 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915283 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915288 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915292 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915296 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915301 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915306 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915311 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915316 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915320 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915324 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915329 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:55.916707 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915333 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915337 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915341 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915345 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915349 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915354 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915360 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915365 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915369 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915374 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915378 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915382 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915386 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915391 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915395 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915400 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915404 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915409 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915413 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915417 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:55.917325 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915421 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:55.918075 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915425 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:55.918075 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915431 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:55.918075 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915435 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:55.918075 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915440 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:55.918075 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915444 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:55.918075 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915449 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:55.918075 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915454 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:55.918075 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915458 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:55.918075 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915462 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:55.918075 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915466 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:55.918075 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.915475 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:55.918075 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915669 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:55.918075 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915679 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:55.918075 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915684 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:55.918075 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915689 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915693 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915698 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915702 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915706 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915711 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915715 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915721 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915725 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915729 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915733 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915738 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915742 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915746 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915750 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915755 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915759 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915763 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915768 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915772 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:55.918729 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915777 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915781 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915785 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915790 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915795 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915799 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915803 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915808 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915812 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915817 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915821 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915826 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915831 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915837 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915842 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915847 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915852 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915857 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915861 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915866 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:55.919395 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915871 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915876 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915880 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915884 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915888 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915892 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915897 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915902 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915906 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915910 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915914 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915919 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915923 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915927 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915931 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915935 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915939 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915943 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915948 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915952 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:55.920090 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915957 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915961 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915965 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915969 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915976 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915982 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915987 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915992 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.915997 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.916001 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.916006 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.916010 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.916015 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.916020 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.916024 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.916028 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.916032 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.916036 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.916040 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:55.920648 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.916044 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:55.921184 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.916049 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:55.921184 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.916053 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:55.921184 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:55.916058 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:55.921184 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.916066 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:55.921184 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.916868 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 19:53:55.921184 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.919755 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 19:53:55.921184 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.920730 2569 server.go:1019] "Starting client certificate rotation"
Apr 16 19:53:55.921184 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.920837 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:53:55.921184 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.920883 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:53:55.946800 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.946769 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:53:55.949237 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.949203 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:53:55.964404 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.964377 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 16 19:53:55.970170 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.970145 2569 log.go:25] "Validated CRI v1 image API"
Apr 16 19:53:55.971462 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.971436 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 19:53:55.973137 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.973119 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:53:55.975716 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.975693 2569 fs.go:135] Filesystem UUIDs: map[4b05100e-e0dc-4cdf-9890-717a153bdd57:/dev/nvme0n1p3 6090ebb0-ef95-496c-ab79-6b76957ee099:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 19:53:55.975782 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.975715 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 19:53:55.981536 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.981406 2569 manager.go:217] Machine: {Timestamp:2026-04-16 19:53:55.979548267 +0000 UTC m=+0.394123117 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099246 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29776dd270cd626b96e84003f3d09e SystemUUID:ec29776d-d270-cd62-6b96-e84003f3d09e BootID:cf1ada18-ffc0-4ed6-ac74-05b7526927c0 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:42:fb:26:33:b9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:42:fb:26:33:b9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2a:3f:33:78:6d:e4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 19:53:55.981536 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.981529 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 19:53:55.981709 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.981694 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 19:53:55.984139 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.984104 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 19:53:55.984299 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.984141 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-205.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 19:53:55.984352 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.984310 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 19:53:55.984352 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.984320 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 19:53:55.984352 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.984334 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:53:55.985279 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.985266 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:53:55.986021 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.986009 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 19:53:55.986152 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.986143 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 19:53:55.988253 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.988238 2569 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 19:53:55.988291 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.988260 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 19:53:55.988291 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.988277 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 19:53:55.988291 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.988290 2569 kubelet.go:397] "Adding apiserver pod source"
Apr 16 19:53:55.988382 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.988310 2569 apiserver.go:42] "Waiting for node sync
before watching apiserver pods" Apr 16 19:53:55.989479 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.989466 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:53:55.989524 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.989486 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:53:55.993082 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.993064 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 19:53:55.994864 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.994848 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 19:53:55.996422 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.996407 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 19:53:55.996470 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.996427 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 19:53:55.996470 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.996447 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 19:53:55.996470 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.996457 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 19:53:55.996470 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.996464 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 19:53:55.996470 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.996471 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 19:53:55.996627 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.996477 2569 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 16 19:53:55.996627 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.996483 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 19:53:55.996627 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.996490 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 19:53:55.996627 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.996497 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 19:53:55.996627 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.996510 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 19:53:55.996627 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.996520 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 19:53:55.998152 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.998134 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 19:53:55.998280 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:55.998262 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 19:53:56.001727 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.001700 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-205.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 19:53:56.001847 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.001789 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 19:53:56.001903 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.001841 2569 reflector.go:200] "Failed 
to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-205.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 19:53:56.003086 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.003069 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 19:53:56.003162 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.003122 2569 server.go:1295] "Started kubelet" Apr 16 19:53:56.003233 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.003199 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 19:53:56.003266 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.003209 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 19:53:56.003295 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.003275 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 19:53:56.004004 ip-10-0-139-205 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 19:53:56.004841 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.004827 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 19:53:56.006756 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.006739 2569 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 19:53:56.007755 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.006954 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-205.ec2.internal.18a6ee6895e55804 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-205.ec2.internal,UID:ip-10-0-139-205.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-205.ec2.internal,},FirstTimestamp:2026-04-16 19:53:56.003084292 +0000 UTC m=+0.417659130,LastTimestamp:2026-04-16 19:53:56.003084292 +0000 UTC m=+0.417659130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-205.ec2.internal,}"
Apr 16 19:53:56.009951 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.009919 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 19:53:56.011318 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.010449 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 19:53:56.012076 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.012055 2569 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 19:53:56.012076 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.012066 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 19:53:56.012076 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.012079 2569 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 19:53:56.012267 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.012181 2569 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 19:53:56.012267 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.012195 2569 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 19:53:56.012355 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.012290 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:56.012918 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.012896 2569 factory.go:55] Registering systemd factory
Apr 16 19:53:56.012918 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.012920 2569 factory.go:223] Registration of the systemd container factory successfully
Apr 16 19:53:56.013308 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.013281 2569 factory.go:153] Registering CRI-O factory
Apr 16 19:53:56.013308 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.013300 2569 factory.go:223] Registration of the crio container factory successfully
Apr 16 19:53:56.013450 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.013427 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 19:53:56.013507 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.013459 2569 factory.go:103] Registering Raw factory
Apr 16 19:53:56.013507 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.013477 2569 manager.go:1196] Started watching for new ooms in manager
Apr 16 19:53:56.013945 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.013928 2569 manager.go:319] Starting recovery of all containers
Apr 16 19:53:56.014224 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.014162 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 19:53:56.018072 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.018046 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pkwn8"
Apr 16 19:53:56.024166 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.024130 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 19:53:56.024380 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.024363 2569 manager.go:324] Recovery completed
Apr 16 19:53:56.026022 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.025994 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pkwn8"
Apr 16 19:53:56.026142 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.026021 2569 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-139-205.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 19:53:56.029911 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.029897 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:56.032486 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.032468 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:56.032543 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.032500 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:56.032543 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.032511 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:56.033129 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.033110 2569 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 19:53:56.033129 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.033129 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 19:53:56.033242 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.033176 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 19:53:56.035115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.035102 2569 policy_none.go:49] "None policy: Start"
Apr 16 19:53:56.035166 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.035119 2569 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 19:53:56.035166 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.035130 2569 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 19:53:56.067314 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.067291 2569 manager.go:341] "Starting Device Plugin manager"
Apr 16 19:53:56.067462 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.067329 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 19:53:56.067462 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.067340 2569 server.go:85] "Starting device plugin registration server"
Apr 16 19:53:56.067656 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.067640 2569 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 19:53:56.067721 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.067655 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 19:53:56.067788 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.067772 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 19:53:56.067956 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.067867 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 19:53:56.067956 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.067877 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 19:53:56.068428 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.068405 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 19:53:56.068512 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.068451 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:56.168230 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.168137 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:56.169285 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.169178 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:56.169285 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.169221 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:56.169285 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.169239 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:56.169285 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.169278 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.177075 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.177050 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 19:53:56.178372 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.178334 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.178372 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.178367 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-205.ec2.internal\": node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:56.178530 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.178395 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 19:53:56.178530 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.178420 2569 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 19:53:56.178530 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.178440 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 19:53:56.178530 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.178450 2569 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 19:53:56.178530 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.178487 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 19:53:56.180822 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.180800 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:56.218564 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.218527 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:56.279200 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.279153 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-205.ec2.internal"]
Apr 16 19:53:56.279270 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.279255 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:56.280287 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.280270 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:56.280390 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.280303 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:56.280390 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.280315 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:56.281481 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.281463 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:56.281614 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.281599 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.281670 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.281627 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:56.282293 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.282274 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:56.282293 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.282277 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:56.282426 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.282312 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:56.282426 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.282315 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:56.282426 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.282323 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:56.282426 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.282329 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:56.283401 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.283385 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.283484 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.283410 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:56.284079 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.284060 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:56.284167 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.284092 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:56.284167 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.284107 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:56.308373 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.308338 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-205.ec2.internal\" not found" node="ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.312757 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.312737 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-205.ec2.internal\" not found" node="ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.313845 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.313828 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3df082ef8b5680569a751fc193a13767-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal\" (UID: \"3df082ef8b5680569a751fc193a13767\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.313905 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.313853 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3df082ef8b5680569a751fc193a13767-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal\" (UID: \"3df082ef8b5680569a751fc193a13767\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.313905 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.313871 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/05778b47a5ef098ec7584b65b41dff7a-config\") pod \"kube-apiserver-proxy-ip-10-0-139-205.ec2.internal\" (UID: \"05778b47a5ef098ec7584b65b41dff7a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.319355 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.319334 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:56.415006 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.414975 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3df082ef8b5680569a751fc193a13767-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal\" (UID: \"3df082ef8b5680569a751fc193a13767\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.415095 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.415006 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3df082ef8b5680569a751fc193a13767-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal\" (UID: \"3df082ef8b5680569a751fc193a13767\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.415095 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.415062 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3df082ef8b5680569a751fc193a13767-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal\" (UID: \"3df082ef8b5680569a751fc193a13767\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.415095 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.415086 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/05778b47a5ef098ec7584b65b41dff7a-config\") pod \"kube-apiserver-proxy-ip-10-0-139-205.ec2.internal\" (UID: \"05778b47a5ef098ec7584b65b41dff7a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.415195 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.415110 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/05778b47a5ef098ec7584b65b41dff7a-config\") pod \"kube-apiserver-proxy-ip-10-0-139-205.ec2.internal\" (UID: \"05778b47a5ef098ec7584b65b41dff7a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.415195 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.415137 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3df082ef8b5680569a751fc193a13767-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal\" (UID: \"3df082ef8b5680569a751fc193a13767\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.420083 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.420029 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:56.520838 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.520810 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:56.610064 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.610029 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.615030 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.615011 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-205.ec2.internal"
Apr 16 19:53:56.621822 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.621798 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:56.722516 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.722427 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:56.822991 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.822945 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:56.920500 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.920460 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 19:53:56.921178 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:56.920671 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 19:53:56.923613 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:56.923596 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:57.010207 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.010127 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 19:53:57.021530 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.021496 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:53:57.023908 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:57.023883 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:57.028072 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.028033 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:48:56 +0000 UTC" deadline="2027-11-28 15:05:59.502860932 +0000 UTC"
Apr 16 19:53:57.028072 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.028063 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14179h12m2.47480047s"
Apr 16 19:53:57.043229 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.043207 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9cw5p"
Apr 16 19:53:57.051424 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.051391 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9cw5p"
Apr 16 19:53:57.124660 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:57.124621 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:57.136922 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:57.136888 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3df082ef8b5680569a751fc193a13767.slice/crio-a64dab8d52d26b23d981f60a69812ef5c8d19b523177d4150483a55c02aea88e WatchSource:0}: Error finding container a64dab8d52d26b23d981f60a69812ef5c8d19b523177d4150483a55c02aea88e: Status 404 returned error can't find the container with id a64dab8d52d26b23d981f60a69812ef5c8d19b523177d4150483a55c02aea88e
Apr 16 19:53:57.137150 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:57.137138 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05778b47a5ef098ec7584b65b41dff7a.slice/crio-afe7f673e029062e1572c26ea5c30b81758bedc279ec77ee8623f6b055a3a162 WatchSource:0}: Error finding container afe7f673e029062e1572c26ea5c30b81758bedc279ec77ee8623f6b055a3a162: Status 404 returned error can't find the container with id afe7f673e029062e1572c26ea5c30b81758bedc279ec77ee8623f6b055a3a162
Apr 16 19:53:57.141863 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.141845 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:53:57.179490 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.179455 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:57.181498 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.181455 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-205.ec2.internal" event={"ID":"05778b47a5ef098ec7584b65b41dff7a","Type":"ContainerStarted","Data":"afe7f673e029062e1572c26ea5c30b81758bedc279ec77ee8623f6b055a3a162"}
Apr 16 19:53:57.182400 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.182375 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal" event={"ID":"3df082ef8b5680569a751fc193a13767","Type":"ContainerStarted","Data":"a64dab8d52d26b23d981f60a69812ef5c8d19b523177d4150483a55c02aea88e"}
Apr 16 19:53:57.203536 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.203518 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:57.225612 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:57.225561 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:57.326199 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:57.326101 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:57.426653 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:57.426619 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-205.ec2.internal\" not found"
Apr 16 19:53:57.450237 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.450199 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:57.511688 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.511644 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-205.ec2.internal"
Apr 16 19:53:57.522829 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.522795 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:53:57.523821 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.523797 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal"
Apr 16 19:53:57.535678 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.535623 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:53:57.989220 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.989186 2569 apiserver.go:52] "Watching apiserver"
Apr 16 19:53:57.996330 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.996304 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 19:53:57.997868 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.997837 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-pxv56","openshift-image-registry/node-ca-fphk8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal","openshift-network-diagnostics/network-check-target-d724r","openshift-network-operator/iptables-alerter-b4wgn","kube-system/global-pull-secret-syncer-zkxnw","openshift-multus/multus-additional-cni-plugins-2987n","openshift-multus/multus-km4f6","openshift-multus/network-metrics-daemon-8jmq5","openshift-ovn-kubernetes/ovnkube-node-pp78x","kube-system/konnectivity-agent-qcjrs","kube-system/kube-apiserver-proxy-ip-10-0-139-205.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd"]
Apr 16 19:53:57.999723 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:57.999702 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2987n"
Apr 16 19:53:58.000718 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.000696 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fphk8"
Apr 16 19:53:58.002417 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.002391 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r"
Apr 16 19:53:58.002533 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.002469 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888"
Apr 16 19:53:58.003736 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.003711 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-b4wgn"
Apr 16 19:53:58.005113 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.005089 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.006035 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.006013 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 19:53:58.006350 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.006269 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.006350 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.006311 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 19:53:58.006497 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.006425 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dc4zp\""
Apr 16 19:53:58.006561 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.006543 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 19:53:58.007822 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.007610 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:53:58.007822 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.007678 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86"
Apr 16 19:53:58.008851 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.008833 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.009236 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.009196 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 19:53:58.009401 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.009386 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:53:58.009629 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.009559 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 19:53:58.009937 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.009902 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 19:53:58.009937 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.009913 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 19:53:58.010421 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.010399 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qcjrs"
Apr 16 19:53:58.013571 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.013262 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:53:58.013571 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.013365 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 19:53:58.014329 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.013849 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 19:53:58.014329 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.013951 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 19:53:58.014329 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.014093 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 19:53:58.014329 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.014147 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 19:53:58.014329 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.014200 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 19:53:58.014329 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.014291 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 19:53:58.015525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.015500 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 19:53:58.015654 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.015550 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xsjdh\""
Apr 16 19:53:58.015718 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.015669 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gklx6\""
Apr 16 19:53:58.015718 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.015688 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd"
Apr 16 19:53:58.015877 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.015859 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-r72cj\""
Apr 16 19:53:58.015935 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.015691 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:53:58.015935 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.015916 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 19:53:58.016031 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.015987 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28"
Apr 16 19:53:58.016086 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.016059 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 19:53:58.016493 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.016458 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 19:53:58.016602 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.016493 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 19:53:58.016602 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.016498 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9tprm\""
Apr 16 19:53:58.016901 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.016884 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zhggq\""
Apr 16 19:53:58.016979 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.016969 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-d5svt\""
Apr 16 19:53:58.018072 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.018049 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 19:53:58.019364 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.019342 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 19:53:58.021669 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.021650 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 19:53:58.021892 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.021877 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-89t4j\""
Apr 16 19:53:58.022103 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.022085 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 19:53:58.023870 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.023831 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkl76\" (UniqueName: \"kubernetes.io/projected/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-kube-api-access-vkl76\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n"
Apr 16 19:53:58.023978 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.023877 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-var-lib-cni-multus\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.023978 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.023914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-hostroot\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.023978 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.023953 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-socket-dir\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd"
Apr 16 19:53:58.023978 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.023972 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b7a86d58-955a-4af8-9ae0-c6e786f43b28-kubelet-config\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:53:58.024330 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.023987 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b7a86d58-955a-4af8-9ae0-c6e786f43b28-dbus\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:53:58.024330 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024008 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gbtv\" (UniqueName: \"kubernetes.io/projected/60894bbb-9d97-4deb-b1de-d69609701101-kube-api-access-5gbtv\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.024330 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024031 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c61ab98e-2fe3-48e1-b144-0d44e1856354-ovn-node-metrics-cert\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.024330 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024055 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1b5d2585-0759-49e0-8726-9b1f8902ebcf-serviceca\") pod \"node-ca-fphk8\" (UID: \"1b5d2585-0759-49e0-8726-9b1f8902ebcf\") " pod="openshift-image-registry/node-ca-fphk8"
Apr 16 19:53:58.024330 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024094 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-run-k8s-cni-cncf-io\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.024330 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024110 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-registration-dir\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd"
Apr 16 19:53:58.024330 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024133 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-sys-fs\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd"
Apr 16 19:53:58.024330 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024181 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2rkp\" (UniqueName: \"kubernetes.io/projected/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-kube-api-access-w2rkp\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd"
Apr 16 19:53:58.024330 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024241 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-sysctl-d\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.024330 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024289 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-sysctl-conf\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.024330 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024311 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-lib-modules\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.024330 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024334 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-system-cni-dir\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024366 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-var-lib-cni-bin\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024389 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-kubernetes\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024413 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-etc-openvswitch\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024454 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-cni-binary-copy\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024523 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-cnibin\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024549 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-var-lib-kubelet\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024593 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/89ed11af-8c8e-432e-9b8a-3696d6697184-konnectivity-ca\") pod \"konnectivity-agent-qcjrs\" (UID: \"89ed11af-8c8e-432e-9b8a-3696d6697184\") " pod="kube-system/konnectivity-agent-qcjrs"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024621 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-run-ovn-kubernetes\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024643 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b5d2585-0759-49e0-8726-9b1f8902ebcf-host\") pod \"node-ca-fphk8\" (UID: \"1b5d2585-0759-49e0-8726-9b1f8902ebcf\") " pod="openshift-image-registry/node-ca-fphk8"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024664 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-multus-cni-dir\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024706 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1c49a39d-084e-4d78-9c37-a08591619477-cni-binary-copy\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024729 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-sys\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024770 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c61ab98e-2fe3-48e1-b144-0d44e1856354-env-overrides\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024798 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024824 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024864 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-var-lib-openvswitch\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.024885 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024890 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-run-netns\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024907 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024952 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-os-release\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024974 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-slash\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.024997 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-cni-bin\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025023 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-os-release\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025044 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-etc-selinux\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025063 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5bed9951-49d1-4612-b89d-05332f7e56e2-iptables-alerter-script\") pod \"iptables-alerter-b4wgn\" (UID: \"5bed9951-49d1-4612-b89d-05332f7e56e2\") " pod="openshift-network-operator/iptables-alerter-b4wgn"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025081 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-modprobe-d\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025102 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs\") pod \"network-metrics-daemon-8jmq5\" (UID: \"10356841-c032-4d12-8328-dc3aeb909c86\") " pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025125 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-kubelet\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025142 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-cni-netd\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025161 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c61ab98e-2fe3-48e1-b144-0d44e1856354-ovnkube-script-lib\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025178 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-669jh\" (UniqueName: \"kubernetes.io/projected/1b5d2585-0759-49e0-8726-9b1f8902ebcf-kube-api-access-669jh\") pod \"node-ca-fphk8\" (UID: \"1b5d2585-0759-49e0-8726-9b1f8902ebcf\") " pod="openshift-image-registry/node-ca-fphk8"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025215 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fzqz\" (UniqueName: \"kubernetes.io/projected/1c49a39d-084e-4d78-9c37-a08591619477-kube-api-access-5fzqz\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025242 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-sysconfig\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.025525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025264 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-systemd-units\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025288 2569
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-run-netns\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025311 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-log-socket\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025333 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025349 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zjm8\" (UniqueName: \"kubernetes.io/projected/c61ab98e-2fe3-48e1-b144-0d44e1856354-kube-api-access-7zjm8\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025365 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-cnibin\") pod \"multus-additional-cni-plugins-2987n\" (UID: 
\"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n" Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025409 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-multus-socket-dir-parent\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025439 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-run\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56" Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025465 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-var-lib-kubelet\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56" Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025497 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-node-log\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025522 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n" Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025545 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-host\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56" Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025571 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c61ab98e-2fe3-48e1-b144-0d44e1856354-ovnkube-config\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025610 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-multus-conf-dir\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025634 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-run-multus-certs\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025660 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5bed9951-49d1-4612-b89d-05332f7e56e2-host-slash\") pod \"iptables-alerter-b4wgn\" (UID: \"5bed9951-49d1-4612-b89d-05332f7e56e2\") " pod="openshift-network-operator/iptables-alerter-b4wgn" Apr 16 19:53:58.026115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025686 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n" Apr 16 19:53:58.026729 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025712 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sxx9\" (UniqueName: \"kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9\") pod \"network-check-target-d724r\" (UID: \"a0b5f3c5-7848-4283-b7e8-31a5e5f79888\") " pod="openshift-network-diagnostics/network-check-target-d724r" Apr 16 19:53:58.026729 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025748 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1c49a39d-084e-4d78-9c37-a08591619477-multus-daemon-config\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.026729 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025771 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/60894bbb-9d97-4deb-b1de-d69609701101-etc-tuned\") pod \"tuned-pxv56\" (UID: 
\"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56" Apr 16 19:53:58.026729 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025791 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/60894bbb-9d97-4deb-b1de-d69609701101-tmp\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56" Apr 16 19:53:58.026729 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025816 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcjw6\" (UniqueName: \"kubernetes.io/projected/10356841-c032-4d12-8328-dc3aeb909c86-kube-api-access-pcjw6\") pod \"network-metrics-daemon-8jmq5\" (UID: \"10356841-c032-4d12-8328-dc3aeb909c86\") " pod="openshift-multus/network-metrics-daemon-8jmq5" Apr 16 19:53:58.026729 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025840 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-run-openvswitch\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.026729 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025854 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-system-cni-dir\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n" Apr 16 19:53:58.026729 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025871 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-etc-kubernetes\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.026729 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025893 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/89ed11af-8c8e-432e-9b8a-3696d6697184-agent-certs\") pod \"konnectivity-agent-qcjrs\" (UID: \"89ed11af-8c8e-432e-9b8a-3696d6697184\") " pod="kube-system/konnectivity-agent-qcjrs" Apr 16 19:53:58.026729 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025915 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-device-dir\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" Apr 16 19:53:58.026729 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025931 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmjvx\" (UniqueName: \"kubernetes.io/projected/5bed9951-49d1-4612-b89d-05332f7e56e2-kube-api-access-dmjvx\") pod \"iptables-alerter-b4wgn\" (UID: \"5bed9951-49d1-4612-b89d-05332f7e56e2\") " pod="openshift-network-operator/iptables-alerter-b4wgn" Apr 16 19:53:58.026729 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.025945 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-systemd\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56" Apr 16 19:53:58.026729 ip-10-0-139-205 
kubenswrapper[2569]: I0416 19:53:58.025991 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-run-systemd\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.026729 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.026017 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-run-ovn\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.052918 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.052881 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:57 +0000 UTC" deadline="2028-01-20 07:40:08.518651494 +0000 UTC" Apr 16 19:53:58.052918 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.052914 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15443h46m10.46574054s" Apr 16 19:53:58.112802 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.112749 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 19:53:58.126292 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126255 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/60894bbb-9d97-4deb-b1de-d69609701101-etc-tuned\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56" Apr 16 19:53:58.126292 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126296 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/60894bbb-9d97-4deb-b1de-d69609701101-tmp\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56" Apr 16 19:53:58.126543 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126322 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcjw6\" (UniqueName: \"kubernetes.io/projected/10356841-c032-4d12-8328-dc3aeb909c86-kube-api-access-pcjw6\") pod \"network-metrics-daemon-8jmq5\" (UID: \"10356841-c032-4d12-8328-dc3aeb909c86\") " pod="openshift-multus/network-metrics-daemon-8jmq5" Apr 16 19:53:58.126543 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-run-openvswitch\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.126543 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126365 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-system-cni-dir\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n" Apr 16 19:53:58.126543 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126388 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-etc-kubernetes\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.126543 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126434 
2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/89ed11af-8c8e-432e-9b8a-3696d6697184-agent-certs\") pod \"konnectivity-agent-qcjrs\" (UID: \"89ed11af-8c8e-432e-9b8a-3696d6697184\") " pod="kube-system/konnectivity-agent-qcjrs" Apr 16 19:53:58.126543 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126456 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-device-dir\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" Apr 16 19:53:58.126543 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126499 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmjvx\" (UniqueName: \"kubernetes.io/projected/5bed9951-49d1-4612-b89d-05332f7e56e2-kube-api-access-dmjvx\") pod \"iptables-alerter-b4wgn\" (UID: \"5bed9951-49d1-4612-b89d-05332f7e56e2\") " pod="openshift-network-operator/iptables-alerter-b4wgn" Apr 16 19:53:58.126899 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126600 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-run-openvswitch\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.126899 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126672 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-device-dir\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" Apr 16 19:53:58.126899 
ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126713 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-etc-kubernetes\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.126899 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126704 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 19:53:58.126899 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126757 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-system-cni-dir\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n" Apr 16 19:53:58.126899 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126526 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-systemd\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56" Apr 16 19:53:58.126899 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126841 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-run-systemd\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.126899 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126870 2569 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-run-ovn\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.126899 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126883 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-run-systemd\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.126899 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126896 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkl76\" (UniqueName: \"kubernetes.io/projected/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-kube-api-access-vkl76\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126840 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-systemd\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126945 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-var-lib-cni-multus\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126980 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-run-ovn\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.126995 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-hostroot\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127017 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-var-lib-cni-multus\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127020 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-socket-dir\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127052 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-hostroot\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127057 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" 
(UniqueName: \"kubernetes.io/host-path/b7a86d58-955a-4af8-9ae0-c6e786f43b28-kubelet-config\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127081 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b7a86d58-955a-4af8-9ae0-c6e786f43b28-dbus\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127105 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gbtv\" (UniqueName: \"kubernetes.io/projected/60894bbb-9d97-4deb-b1de-d69609701101-kube-api-access-5gbtv\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127143 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c61ab98e-2fe3-48e1-b144-0d44e1856354-ovn-node-metrics-cert\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127145 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-socket-dir\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127182 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b7a86d58-955a-4af8-9ae0-c6e786f43b28-kubelet-config\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127219 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1b5d2585-0759-49e0-8726-9b1f8902ebcf-serviceca\") pod \"node-ca-fphk8\" (UID: \"1b5d2585-0759-49e0-8726-9b1f8902ebcf\") " pod="openshift-image-registry/node-ca-fphk8" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127248 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-run-k8s-cni-cncf-io\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127277 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b7a86d58-955a-4af8-9ae0-c6e786f43b28-dbus\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127277 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-registration-dir\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" Apr 16 19:53:58.127343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127322 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-run-k8s-cni-cncf-io\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127326 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-sys-fs\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127329 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-registration-dir\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127387 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-sys-fs\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127357 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2rkp\" (UniqueName: \"kubernetes.io/projected/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-kube-api-access-w2rkp\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" 
Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127435 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-sysctl-d\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127459 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-sysctl-conf\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127483 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-lib-modules\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127520 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-system-cni-dir\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-var-lib-cni-bin\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127616 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-kubernetes\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127647 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-etc-openvswitch\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127673 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-sysctl-conf\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127679 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-cni-binary-copy\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n"
Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127721 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-cnibin\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127747 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-var-lib-kubelet\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127772 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/89ed11af-8c8e-432e-9b8a-3696d6697184-konnectivity-ca\") pod \"konnectivity-agent-qcjrs\" (UID: \"89ed11af-8c8e-432e-9b8a-3696d6697184\") " pod="kube-system/konnectivity-agent-qcjrs"
Apr 16 19:53:58.128157 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127797 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-run-ovn-kubernetes\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127798 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1b5d2585-0759-49e0-8726-9b1f8902ebcf-serviceca\") pod \"node-ca-fphk8\" (UID: \"1b5d2585-0759-49e0-8726-9b1f8902ebcf\") " pod="openshift-image-registry/node-ca-fphk8"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127819 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b5d2585-0759-49e0-8726-9b1f8902ebcf-host\") pod \"node-ca-fphk8\" (UID: \"1b5d2585-0759-49e0-8726-9b1f8902ebcf\") " pod="openshift-image-registry/node-ca-fphk8"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127845 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-multus-cni-dir\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127870 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1c49a39d-084e-4d78-9c37-a08591619477-cni-binary-copy\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127901 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-sys\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127925 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c61ab98e-2fe3-48e1-b144-0d44e1856354-env-overrides\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127953 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.127980 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128004 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-var-lib-openvswitch\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128032 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-run-netns\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128040 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-sysctl-d\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128091 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128096 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-cnibin\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128118 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-os-release\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128139 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-var-lib-kubelet\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128165 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-slash\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128190 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-cni-bin\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.128879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128207 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-cni-binary-copy\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128218 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-os-release\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128247 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-etc-selinux\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128273 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5bed9951-49d1-4612-b89d-05332f7e56e2-iptables-alerter-script\") pod \"iptables-alerter-b4wgn\" (UID: \"5bed9951-49d1-4612-b89d-05332f7e56e2\") " pod="openshift-network-operator/iptables-alerter-b4wgn"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128297 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-modprobe-d\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128315 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-lib-modules\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128322 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs\") pod \"network-metrics-daemon-8jmq5\" (UID: \"10356841-c032-4d12-8328-dc3aeb909c86\") " pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-kubelet\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128371 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-system-cni-dir\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128377 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-cni-netd\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128402 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c61ab98e-2fe3-48e1-b144-0d44e1856354-ovnkube-script-lib\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128414 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-var-lib-cni-bin\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128435 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-669jh\" (UniqueName: \"kubernetes.io/projected/1b5d2585-0759-49e0-8726-9b1f8902ebcf-kube-api-access-669jh\") pod \"node-ca-fphk8\" (UID: \"1b5d2585-0759-49e0-8726-9b1f8902ebcf\") " pod="openshift-image-registry/node-ca-fphk8"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128455 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-kubernetes\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128462 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fzqz\" (UniqueName: \"kubernetes.io/projected/1c49a39d-084e-4d78-9c37-a08591619477-kube-api-access-5fzqz\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128486 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-sysconfig\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.129612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.128493 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-etc-openvswitch\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.129671 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-systemd-units\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.129694 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-systemd-units\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.129727 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-run-netns\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.129759 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-log-socket\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.129768 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b5d2585-0759-49e0-8726-9b1f8902ebcf-host\") pod \"node-ca-fphk8\" (UID: \"1b5d2585-0759-49e0-8726-9b1f8902ebcf\") " pod="openshift-image-registry/node-ca-fphk8"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.129793 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.129829 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zjm8\" (UniqueName: \"kubernetes.io/projected/c61ab98e-2fe3-48e1-b144-0d44e1856354-kube-api-access-7zjm8\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.129844 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-multus-cni-dir\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.129870 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-cnibin\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.129901 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-multus-socket-dir-parent\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.129927 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-run\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.129941 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/89ed11af-8c8e-432e-9b8a-3696d6697184-konnectivity-ca\") pod \"konnectivity-agent-qcjrs\" (UID: \"89ed11af-8c8e-432e-9b8a-3696d6697184\") " pod="kube-system/konnectivity-agent-qcjrs"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.129956 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-var-lib-kubelet\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.129987 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-node-log\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.130016 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.130093 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-host\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.130151 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-run-netns\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.130698 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.130192 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-log-socket\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.131546 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.130235 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.131546 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.130388 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1c49a39d-084e-4d78-9c37-a08591619477-cni-binary-copy\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.131546 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.130471 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-sys\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.131546 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.130571 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-multus-socket-dir-parent\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.131546 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.130599 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd"
Apr 16 19:53:58.131546 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.130765 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-var-lib-openvswitch\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.131546 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.130823 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-run-netns\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.131546 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.131070 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c61ab98e-2fe3-48e1-b144-0d44e1856354-env-overrides\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.131546 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.131200 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:53:58.131546 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.131310 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/60894bbb-9d97-4deb-b1de-d69609701101-tmp\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.131546 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.131337 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret podName:b7a86d58-955a-4af8-9ae0-c6e786f43b28 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:58.631279336 +0000 UTC m=+3.045854176 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret") pod "global-pull-secret-syncer-zkxnw" (UID: "b7a86d58-955a-4af8-9ae0-c6e786f43b28") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:53:58.131546 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.131352 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n"
Apr 16 19:53:58.131546 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.131543 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:58.132155 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.131611 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-os-release\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6"
Apr 16 19:53:58.132155 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.131673 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-kubelet\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.132155 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.131803 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-run\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.132155 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.131882 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-var-lib-kubelet\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.132155 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.131961 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-slash\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.132155 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132030 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-run-ovn-kubernetes\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.132155 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132075 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-cni-bin\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.132469 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132217 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-etc-selinux\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd"
Apr 16 19:53:58.132469 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132224 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-node-log\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:53:58.132469 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132304 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-os-release\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n"
Apr 16 19:53:58.132469 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132338 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-modprobe-d\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.132469 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132405 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-etc-sysconfig\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56"
Apr 16 19:53:58.132773 ip-10-0-139-205
kubenswrapper[2569]: I0416 19:53:58.132509 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n" Apr 16 19:53:58.132773 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132687 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/60894bbb-9d97-4deb-b1de-d69609701101-etc-tuned\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56" Apr 16 19:53:58.132773 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132706 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60894bbb-9d97-4deb-b1de-d69609701101-host\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56" Apr 16 19:53:58.132773 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132714 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-cnibin\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n" Apr 16 19:53:58.132773 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132748 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c61ab98e-2fe3-48e1-b144-0d44e1856354-ovnkube-config\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.133073 ip-10-0-139-205 
kubenswrapper[2569]: I0416 19:53:58.132780 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-multus-conf-dir\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.133073 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132795 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c61ab98e-2fe3-48e1-b144-0d44e1856354-host-cni-netd\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.133073 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132810 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5bed9951-49d1-4612-b89d-05332f7e56e2-iptables-alerter-script\") pod \"iptables-alerter-b4wgn\" (UID: \"5bed9951-49d1-4612-b89d-05332f7e56e2\") " pod="openshift-network-operator/iptables-alerter-b4wgn" Apr 16 19:53:58.133073 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132848 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-multus-conf-dir\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.133073 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.132807 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs podName:10356841-c032-4d12-8328-dc3aeb909c86 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:58.632785516 +0000 UTC m=+3.047360368 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs") pod "network-metrics-daemon-8jmq5" (UID: "10356841-c032-4d12-8328-dc3aeb909c86") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:58.133073 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132914 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-run-multus-certs\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.133073 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132949 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5bed9951-49d1-4612-b89d-05332f7e56e2-host-slash\") pod \"iptables-alerter-b4wgn\" (UID: \"5bed9951-49d1-4612-b89d-05332f7e56e2\") " pod="openshift-network-operator/iptables-alerter-b4wgn" Apr 16 19:53:58.133073 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132977 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1c49a39d-084e-4d78-9c37-a08591619477-host-run-multus-certs\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.133073 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.132984 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n" Apr 16 19:53:58.133073 ip-10-0-139-205 
kubenswrapper[2569]: I0416 19:53:58.133015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sxx9\" (UniqueName: \"kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9\") pod \"network-check-target-d724r\" (UID: \"a0b5f3c5-7848-4283-b7e8-31a5e5f79888\") " pod="openshift-network-diagnostics/network-check-target-d724r" Apr 16 19:53:58.133073 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.133037 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5bed9951-49d1-4612-b89d-05332f7e56e2-host-slash\") pod \"iptables-alerter-b4wgn\" (UID: \"5bed9951-49d1-4612-b89d-05332f7e56e2\") " pod="openshift-network-operator/iptables-alerter-b4wgn" Apr 16 19:53:58.133073 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.133048 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1c49a39d-084e-4d78-9c37-a08591619477-multus-daemon-config\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.133744 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.133508 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c61ab98e-2fe3-48e1-b144-0d44e1856354-ovnkube-config\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.133744 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.133532 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1c49a39d-084e-4d78-9c37-a08591619477-multus-daemon-config\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.133851 
ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.133833 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n" Apr 16 19:53:58.134806 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.134681 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c61ab98e-2fe3-48e1-b144-0d44e1856354-ovnkube-script-lib\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.136039 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.136011 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c61ab98e-2fe3-48e1-b144-0d44e1856354-ovn-node-metrics-cert\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.136222 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.136199 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/89ed11af-8c8e-432e-9b8a-3696d6697184-agent-certs\") pod \"konnectivity-agent-qcjrs\" (UID: \"89ed11af-8c8e-432e-9b8a-3696d6697184\") " pod="kube-system/konnectivity-agent-qcjrs" Apr 16 19:53:58.143784 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.142215 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:58.143784 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.142887 2569 projected.go:289] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:58.143784 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.142905 2569 projected.go:194] Error preparing data for projected volume kube-api-access-4sxx9 for pod openshift-network-diagnostics/network-check-target-d724r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:58.143784 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.143020 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9 podName:a0b5f3c5-7848-4283-b7e8-31a5e5f79888 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:58.642984756 +0000 UTC m=+3.057559605 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4sxx9" (UniqueName: "kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9") pod "network-check-target-d724r" (UID: "a0b5f3c5-7848-4283-b7e8-31a5e5f79888") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:58.143784 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.143412 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmjvx\" (UniqueName: \"kubernetes.io/projected/5bed9951-49d1-4612-b89d-05332f7e56e2-kube-api-access-dmjvx\") pod \"iptables-alerter-b4wgn\" (UID: \"5bed9951-49d1-4612-b89d-05332f7e56e2\") " pod="openshift-network-operator/iptables-alerter-b4wgn" Apr 16 19:53:58.143784 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.143446 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gbtv\" (UniqueName: 
\"kubernetes.io/projected/60894bbb-9d97-4deb-b1de-d69609701101-kube-api-access-5gbtv\") pod \"tuned-pxv56\" (UID: \"60894bbb-9d97-4deb-b1de-d69609701101\") " pod="openshift-cluster-node-tuning-operator/tuned-pxv56" Apr 16 19:53:58.143784 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.143510 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-669jh\" (UniqueName: \"kubernetes.io/projected/1b5d2585-0759-49e0-8726-9b1f8902ebcf-kube-api-access-669jh\") pod \"node-ca-fphk8\" (UID: \"1b5d2585-0759-49e0-8726-9b1f8902ebcf\") " pod="openshift-image-registry/node-ca-fphk8" Apr 16 19:53:58.143784 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.143674 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcjw6\" (UniqueName: \"kubernetes.io/projected/10356841-c032-4d12-8328-dc3aeb909c86-kube-api-access-pcjw6\") pod \"network-metrics-daemon-8jmq5\" (UID: \"10356841-c032-4d12-8328-dc3aeb909c86\") " pod="openshift-multus/network-metrics-daemon-8jmq5" Apr 16 19:53:58.144448 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.144055 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2rkp\" (UniqueName: \"kubernetes.io/projected/e76bbd13-39a5-4247-a1ea-1de5bd9d98d8-kube-api-access-w2rkp\") pod \"aws-ebs-csi-driver-node-zdcwd\" (UID: \"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" Apr 16 19:53:58.144993 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.144954 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkl76\" (UniqueName: \"kubernetes.io/projected/f9d926bb-1dbb-44e0-981e-4bc43df8b1e0-kube-api-access-vkl76\") pod \"multus-additional-cni-plugins-2987n\" (UID: \"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0\") " pod="openshift-multus/multus-additional-cni-plugins-2987n" Apr 16 19:53:58.145604 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.145566 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zjm8\" (UniqueName: \"kubernetes.io/projected/c61ab98e-2fe3-48e1-b144-0d44e1856354-kube-api-access-7zjm8\") pod \"ovnkube-node-pp78x\" (UID: \"c61ab98e-2fe3-48e1-b144-0d44e1856354\") " pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.146173 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.146154 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fzqz\" (UniqueName: \"kubernetes.io/projected/1c49a39d-084e-4d78-9c37-a08591619477-kube-api-access-5fzqz\") pod \"multus-km4f6\" (UID: \"1c49a39d-084e-4d78-9c37-a08591619477\") " pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.314046 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.313923 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2987n" Apr 16 19:53:58.320846 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.320811 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fphk8" Apr 16 19:53:58.333650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.333612 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-b4wgn" Apr 16 19:53:58.341535 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.341504 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pxv56" Apr 16 19:53:58.350079 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.350049 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-km4f6" Apr 16 19:53:58.356817 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.356782 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:53:58.363511 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.363482 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qcjrs" Apr 16 19:53:58.369177 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.369151 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" Apr 16 19:53:58.485544 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.485513 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:58.636331 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.636232 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:53:58.636331 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.636312 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs\") pod \"network-metrics-daemon-8jmq5\" (UID: \"10356841-c032-4d12-8328-dc3aeb909c86\") " pod="openshift-multus/network-metrics-daemon-8jmq5" Apr 16 19:53:58.636557 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.636393 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:58.636557 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.636418 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 
19:53:58.636557 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.636489 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs podName:10356841-c032-4d12-8328-dc3aeb909c86 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:59.636469991 +0000 UTC m=+4.051044835 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs") pod "network-metrics-daemon-8jmq5" (UID: "10356841-c032-4d12-8328-dc3aeb909c86") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:58.636557 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.636507 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret podName:b7a86d58-955a-4af8-9ae0-c6e786f43b28 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:59.636497587 +0000 UTC m=+4.051072417 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret") pod "global-pull-secret-syncer-zkxnw" (UID: "b7a86d58-955a-4af8-9ae0-c6e786f43b28") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:58.723199 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:58.723165 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b5d2585_0759_49e0_8726_9b1f8902ebcf.slice/crio-d31981f7ee75167c03bac31d7ade1cd78d2a6488fe69d43c66cd37fde95ab161 WatchSource:0}: Error finding container d31981f7ee75167c03bac31d7ade1cd78d2a6488fe69d43c66cd37fde95ab161: Status 404 returned error can't find the container with id d31981f7ee75167c03bac31d7ade1cd78d2a6488fe69d43c66cd37fde95ab161 Apr 16 19:53:58.724862 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:58.724826 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bed9951_49d1_4612_b89d_05332f7e56e2.slice/crio-d13ae779c8e8e6ea77900a568aa06782031f698995c84d3a02421a1c4b3628cf WatchSource:0}: Error finding container d13ae779c8e8e6ea77900a568aa06782031f698995c84d3a02421a1c4b3628cf: Status 404 returned error can't find the container with id d13ae779c8e8e6ea77900a568aa06782031f698995c84d3a02421a1c4b3628cf Apr 16 19:53:58.725425 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:58.725328 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9d926bb_1dbb_44e0_981e_4bc43df8b1e0.slice/crio-3c8b79b305eb1bd09df79a534637ba8db057ba81eab52dbc774e1bb85883fa5c WatchSource:0}: Error finding container 3c8b79b305eb1bd09df79a534637ba8db057ba81eab52dbc774e1bb85883fa5c: Status 404 returned error can't find the container with id 3c8b79b305eb1bd09df79a534637ba8db057ba81eab52dbc774e1bb85883fa5c Apr 16 
19:53:58.726505 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:58.726419 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode76bbd13_39a5_4247_a1ea_1de5bd9d98d8.slice/crio-c4df2f3211d260bedb9b5a1db0fa1379b9b15a0836de74e1aa4db290327ce550 WatchSource:0}: Error finding container c4df2f3211d260bedb9b5a1db0fa1379b9b15a0836de74e1aa4db290327ce550: Status 404 returned error can't find the container with id c4df2f3211d260bedb9b5a1db0fa1379b9b15a0836de74e1aa4db290327ce550 Apr 16 19:53:58.728919 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:58.728892 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89ed11af_8c8e_432e_9b8a_3696d6697184.slice/crio-1990f514bba591af25ca3d15a8994cc492680aa9de9a1c930eb5a20284e90e01 WatchSource:0}: Error finding container 1990f514bba591af25ca3d15a8994cc492680aa9de9a1c930eb5a20284e90e01: Status 404 returned error can't find the container with id 1990f514bba591af25ca3d15a8994cc492680aa9de9a1c930eb5a20284e90e01 Apr 16 19:53:58.729836 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:58.729812 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c49a39d_084e_4d78_9c37_a08591619477.slice/crio-6989df64e6acb97e61cba1626676390bb3245e1395d43dbe0721b581a3748976 WatchSource:0}: Error finding container 6989df64e6acb97e61cba1626676390bb3245e1395d43dbe0721b581a3748976: Status 404 returned error can't find the container with id 6989df64e6acb97e61cba1626676390bb3245e1395d43dbe0721b581a3748976 Apr 16 19:53:58.730799 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:58.730780 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc61ab98e_2fe3_48e1_b144_0d44e1856354.slice/crio-ad808eb0a197ac73b3e87d0dc2a67d3f4672e84df67a3d9d3c9fe8a65a1e55c5 
WatchSource:0}: Error finding container ad808eb0a197ac73b3e87d0dc2a67d3f4672e84df67a3d9d3c9fe8a65a1e55c5: Status 404 returned error can't find the container with id ad808eb0a197ac73b3e87d0dc2a67d3f4672e84df67a3d9d3c9fe8a65a1e55c5 Apr 16 19:53:58.732512 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:53:58.732362 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60894bbb_9d97_4deb_b1de_d69609701101.slice/crio-ad0bfa572925f2ecb75706f5f45ec06cce70840c89f509c84bfd0307e8d04b69 WatchSource:0}: Error finding container ad0bfa572925f2ecb75706f5f45ec06cce70840c89f509c84bfd0307e8d04b69: Status 404 returned error can't find the container with id ad0bfa572925f2ecb75706f5f45ec06cce70840c89f509c84bfd0307e8d04b69 Apr 16 19:53:58.736865 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:58.736840 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sxx9\" (UniqueName: \"kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9\") pod \"network-check-target-d724r\" (UID: \"a0b5f3c5-7848-4283-b7e8-31a5e5f79888\") " pod="openshift-network-diagnostics/network-check-target-d724r" Apr 16 19:53:58.736996 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.736981 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:58.737056 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.736999 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:58.737056 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.737010 2569 projected.go:194] Error preparing data for projected volume kube-api-access-4sxx9 for pod openshift-network-diagnostics/network-check-target-d724r: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:58.737056 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:58.737051 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9 podName:a0b5f3c5-7848-4283-b7e8-31a5e5f79888 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:59.737038592 +0000 UTC m=+4.151613418 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-4sxx9" (UniqueName: "kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9") pod "network-check-target-d724r" (UID: "a0b5f3c5-7848-4283-b7e8-31a5e5f79888") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:59.053663 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:59.053480 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:57 +0000 UTC" deadline="2028-01-01 04:08:36.85466997 +0000 UTC" Apr 16 19:53:59.053663 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:59.053522 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14984h14m37.801151472s" Apr 16 19:53:59.191853 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:59.191781 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pxv56" event={"ID":"60894bbb-9d97-4deb-b1de-d69609701101","Type":"ContainerStarted","Data":"ad0bfa572925f2ecb75706f5f45ec06cce70840c89f509c84bfd0307e8d04b69"} Apr 16 19:53:59.198803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:59.198718 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" event={"ID":"c61ab98e-2fe3-48e1-b144-0d44e1856354","Type":"ContainerStarted","Data":"ad808eb0a197ac73b3e87d0dc2a67d3f4672e84df67a3d9d3c9fe8a65a1e55c5"} Apr 16 19:53:59.202749 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:59.202668 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2987n" event={"ID":"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0","Type":"ContainerStarted","Data":"3c8b79b305eb1bd09df79a534637ba8db057ba81eab52dbc774e1bb85883fa5c"} Apr 16 19:53:59.208184 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:59.208118 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qcjrs" event={"ID":"89ed11af-8c8e-432e-9b8a-3696d6697184","Type":"ContainerStarted","Data":"1990f514bba591af25ca3d15a8994cc492680aa9de9a1c930eb5a20284e90e01"} Apr 16 19:53:59.226499 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:59.225561 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-205.ec2.internal" event={"ID":"05778b47a5ef098ec7584b65b41dff7a","Type":"ContainerStarted","Data":"e39909b75b39020cbe16a9a7a74bedfc6ebd7b7231dc0978180bf6854f0d592b"} Apr 16 19:53:59.231837 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:59.231764 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-km4f6" event={"ID":"1c49a39d-084e-4d78-9c37-a08591619477","Type":"ContainerStarted","Data":"6989df64e6acb97e61cba1626676390bb3245e1395d43dbe0721b581a3748976"} Apr 16 19:53:59.243371 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:59.243316 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" event={"ID":"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8","Type":"ContainerStarted","Data":"c4df2f3211d260bedb9b5a1db0fa1379b9b15a0836de74e1aa4db290327ce550"} Apr 16 19:53:59.250048 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:59.249362 
2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-205.ec2.internal" podStartSLOduration=2.249340764 podStartE2EDuration="2.249340764s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:53:59.24900203 +0000 UTC m=+3.663576879" watchObservedRunningTime="2026-04-16 19:53:59.249340764 +0000 UTC m=+3.663915627" Apr 16 19:53:59.252999 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:59.252949 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-b4wgn" event={"ID":"5bed9951-49d1-4612-b89d-05332f7e56e2","Type":"ContainerStarted","Data":"d13ae779c8e8e6ea77900a568aa06782031f698995c84d3a02421a1c4b3628cf"} Apr 16 19:53:59.257209 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:59.257165 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fphk8" event={"ID":"1b5d2585-0759-49e0-8726-9b1f8902ebcf","Type":"ContainerStarted","Data":"d31981f7ee75167c03bac31d7ade1cd78d2a6488fe69d43c66cd37fde95ab161"} Apr 16 19:53:59.643335 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:59.643268 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:53:59.643516 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:59.643358 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs\") pod \"network-metrics-daemon-8jmq5\" (UID: \"10356841-c032-4d12-8328-dc3aeb909c86\") " 
pod="openshift-multus/network-metrics-daemon-8jmq5" Apr 16 19:53:59.643596 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:59.643524 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:59.643660 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:59.643602 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs podName:10356841-c032-4d12-8328-dc3aeb909c86 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:01.643567002 +0000 UTC m=+6.058141833 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs") pod "network-metrics-daemon-8jmq5" (UID: "10356841-c032-4d12-8328-dc3aeb909c86") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:59.644079 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:59.644060 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:59.644161 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:59.644118 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret podName:b7a86d58-955a-4af8-9ae0-c6e786f43b28 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:01.644101861 +0000 UTC m=+6.058676693 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret") pod "global-pull-secret-syncer-zkxnw" (UID: "b7a86d58-955a-4af8-9ae0-c6e786f43b28") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:59.743967 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:53:59.743773 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sxx9\" (UniqueName: \"kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9\") pod \"network-check-target-d724r\" (UID: \"a0b5f3c5-7848-4283-b7e8-31a5e5f79888\") " pod="openshift-network-diagnostics/network-check-target-d724r" Apr 16 19:53:59.743967 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:59.743929 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:59.743967 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:59.743947 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:59.743967 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:59.743960 2569 projected.go:194] Error preparing data for projected volume kube-api-access-4sxx9 for pod openshift-network-diagnostics/network-check-target-d724r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:59.744407 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:53:59.744024 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9 podName:a0b5f3c5-7848-4283-b7e8-31a5e5f79888 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:01.744001972 +0000 UTC m=+6.158576804 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-4sxx9" (UniqueName: "kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9") pod "network-check-target-d724r" (UID: "a0b5f3c5-7848-4283-b7e8-31a5e5f79888") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:00.179774 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:00.179738 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r" Apr 16 19:54:00.180330 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:00.179876 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888" Apr 16 19:54:00.180330 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:00.180289 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:54:00.180466 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:00.180377 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28" Apr 16 19:54:00.180618 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:00.180587 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5" Apr 16 19:54:00.180737 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:00.180690 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86" Apr 16 19:54:00.283808 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:00.283712 2569 generic.go:358] "Generic (PLEG): container finished" podID="3df082ef8b5680569a751fc193a13767" containerID="3191dfb90e730944f01b2871f95ab2c1899bbcdac061e2873e4f70e7c7f1ed1c" exitCode=0 Apr 16 19:54:00.283808 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:00.283791 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal" event={"ID":"3df082ef8b5680569a751fc193a13767","Type":"ContainerDied","Data":"3191dfb90e730944f01b2871f95ab2c1899bbcdac061e2873e4f70e7c7f1ed1c"} Apr 16 19:54:01.292995 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:01.292958 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal" event={"ID":"3df082ef8b5680569a751fc193a13767","Type":"ContainerStarted","Data":"007b78f1080ad4ea68a8c670f5bab137d68f31067e8811de23b7ca860c4d96c2"} Apr 16 19:54:01.307710 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:01.306920 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-205.ec2.internal" podStartSLOduration=4.306898308 podStartE2EDuration="4.306898308s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:01.306559053 +0000 UTC m=+5.721133912" watchObservedRunningTime="2026-04-16 19:54:01.306898308 +0000 UTC m=+5.721473158" Apr 16 19:54:01.659227 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:01.659135 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:54:01.659227 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:01.659187 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs\") pod \"network-metrics-daemon-8jmq5\" (UID: \"10356841-c032-4d12-8328-dc3aeb909c86\") " pod="openshift-multus/network-metrics-daemon-8jmq5" Apr 16 19:54:01.659452 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:01.659333 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:01.659452 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:01.659412 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret podName:b7a86d58-955a-4af8-9ae0-c6e786f43b28 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:05.6593906 +0000 UTC m=+10.073965432 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret") pod "global-pull-secret-syncer-zkxnw" (UID: "b7a86d58-955a-4af8-9ae0-c6e786f43b28") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:01.659452 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:01.659336 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:01.659689 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:01.659471 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs podName:10356841-c032-4d12-8328-dc3aeb909c86 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:05.659458552 +0000 UTC m=+10.074033392 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs") pod "network-metrics-daemon-8jmq5" (UID: "10356841-c032-4d12-8328-dc3aeb909c86") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:01.759899 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:01.759856 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sxx9\" (UniqueName: \"kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9\") pod \"network-check-target-d724r\" (UID: \"a0b5f3c5-7848-4283-b7e8-31a5e5f79888\") " pod="openshift-network-diagnostics/network-check-target-d724r" Apr 16 19:54:01.760106 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:01.760089 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:01.760181 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:01.760113 2569 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:01.760181 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:01.760126 2569 projected.go:194] Error preparing data for projected volume kube-api-access-4sxx9 for pod openshift-network-diagnostics/network-check-target-d724r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:01.760287 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:01.760186 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9 podName:a0b5f3c5-7848-4283-b7e8-31a5e5f79888 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:05.760168043 +0000 UTC m=+10.174742877 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-4sxx9" (UniqueName: "kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9") pod "network-check-target-d724r" (UID: "a0b5f3c5-7848-4283-b7e8-31a5e5f79888") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:02.179753 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:02.179717 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:54:02.179940 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:02.179770 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5" Apr 16 19:54:02.179940 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:02.179717 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r" Apr 16 19:54:02.179940 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:02.179856 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28" Apr 16 19:54:02.180097 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:02.179967 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86" Apr 16 19:54:02.180097 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:02.180035 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888" Apr 16 19:54:04.179448 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:04.179391 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r" Apr 16 19:54:04.179969 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:04.179521 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:54:04.179969 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:04.179531 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888" Apr 16 19:54:04.179969 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:04.179549 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5" Apr 16 19:54:04.179969 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:04.179653 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28" Apr 16 19:54:04.179969 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:04.179715 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86" Apr 16 19:54:05.696457 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:05.696235 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:54:05.696457 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:05.696297 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs\") pod \"network-metrics-daemon-8jmq5\" (UID: \"10356841-c032-4d12-8328-dc3aeb909c86\") " pod="openshift-multus/network-metrics-daemon-8jmq5" Apr 16 19:54:05.696457 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:05.696379 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:05.696457 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:05.696441 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs podName:10356841-c032-4d12-8328-dc3aeb909c86 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:13.696423288 +0000 UTC m=+18.110998118 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs") pod "network-metrics-daemon-8jmq5" (UID: "10356841-c032-4d12-8328-dc3aeb909c86") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:05.696457 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:05.696379 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:05.697163 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:05.696508 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret podName:b7a86d58-955a-4af8-9ae0-c6e786f43b28 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:13.696492972 +0000 UTC m=+18.111067803 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret") pod "global-pull-secret-syncer-zkxnw" (UID: "b7a86d58-955a-4af8-9ae0-c6e786f43b28") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:05.797208 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:05.797165 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sxx9\" (UniqueName: \"kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9\") pod \"network-check-target-d724r\" (UID: \"a0b5f3c5-7848-4283-b7e8-31a5e5f79888\") " pod="openshift-network-diagnostics/network-check-target-d724r" Apr 16 19:54:05.797399 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:05.797355 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:05.797399 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:05.797378 2569 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:05.797399 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:05.797392 2569 projected.go:194] Error preparing data for projected volume kube-api-access-4sxx9 for pod openshift-network-diagnostics/network-check-target-d724r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:05.797569 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:05.797464 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9 podName:a0b5f3c5-7848-4283-b7e8-31a5e5f79888 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:13.797444269 +0000 UTC m=+18.212019099 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-4sxx9" (UniqueName: "kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9") pod "network-check-target-d724r" (UID: "a0b5f3c5-7848-4283-b7e8-31a5e5f79888") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:06.180259 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:06.179672 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5" Apr 16 19:54:06.180259 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:06.179802 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86" Apr 16 19:54:06.180259 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:06.180119 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r" Apr 16 19:54:06.180259 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:06.180228 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888" Apr 16 19:54:06.180594 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:06.180277 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:54:06.180594 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:06.180347 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28" Apr 16 19:54:08.178819 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:08.178782 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5" Apr 16 19:54:08.179281 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:08.178779 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r" Apr 16 19:54:08.179281 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:08.178924 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86" Apr 16 19:54:08.179281 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:08.178988 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888" Apr 16 19:54:08.179281 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:08.178782 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:54:08.179281 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:08.179060 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28" Apr 16 19:54:10.179020 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:10.178982 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:54:10.179020 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:10.179022 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5" Apr 16 19:54:10.179564 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:10.179120 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28" Apr 16 19:54:10.179564 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:10.179201 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86" Apr 16 19:54:10.179564 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:10.179237 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r" Apr 16 19:54:10.179564 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:10.179337 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888" Apr 16 19:54:11.629913 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:11.629865 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-tg5c9"] Apr 16 19:54:11.663205 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:11.663169 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tg5c9" Apr 16 19:54:11.666271 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:11.666243 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 19:54:11.666426 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:11.666243 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 19:54:11.666485 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:11.666443 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-x284t\"" Apr 16 19:54:11.740348 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:11.740315 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/448cba83-b62d-4e46-b69b-e948817d0ec5-hosts-file\") pod \"node-resolver-tg5c9\" (UID: \"448cba83-b62d-4e46-b69b-e948817d0ec5\") " pod="openshift-dns/node-resolver-tg5c9" Apr 16 19:54:11.740528 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:11.740360 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27d8s\" (UniqueName: \"kubernetes.io/projected/448cba83-b62d-4e46-b69b-e948817d0ec5-kube-api-access-27d8s\") pod \"node-resolver-tg5c9\" (UID: \"448cba83-b62d-4e46-b69b-e948817d0ec5\") " pod="openshift-dns/node-resolver-tg5c9" Apr 16 19:54:11.740528 ip-10-0-139-205 
kubenswrapper[2569]: I0416 19:54:11.740393 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/448cba83-b62d-4e46-b69b-e948817d0ec5-tmp-dir\") pod \"node-resolver-tg5c9\" (UID: \"448cba83-b62d-4e46-b69b-e948817d0ec5\") " pod="openshift-dns/node-resolver-tg5c9"
Apr 16 19:54:11.841844 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:11.841805 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/448cba83-b62d-4e46-b69b-e948817d0ec5-hosts-file\") pod \"node-resolver-tg5c9\" (UID: \"448cba83-b62d-4e46-b69b-e948817d0ec5\") " pod="openshift-dns/node-resolver-tg5c9"
Apr 16 19:54:11.841844 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:11.841856 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27d8s\" (UniqueName: \"kubernetes.io/projected/448cba83-b62d-4e46-b69b-e948817d0ec5-kube-api-access-27d8s\") pod \"node-resolver-tg5c9\" (UID: \"448cba83-b62d-4e46-b69b-e948817d0ec5\") " pod="openshift-dns/node-resolver-tg5c9"
Apr 16 19:54:11.842070 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:11.841949 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/448cba83-b62d-4e46-b69b-e948817d0ec5-hosts-file\") pod \"node-resolver-tg5c9\" (UID: \"448cba83-b62d-4e46-b69b-e948817d0ec5\") " pod="openshift-dns/node-resolver-tg5c9"
Apr 16 19:54:11.842070 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:11.842007 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/448cba83-b62d-4e46-b69b-e948817d0ec5-tmp-dir\") pod \"node-resolver-tg5c9\" (UID: \"448cba83-b62d-4e46-b69b-e948817d0ec5\") " pod="openshift-dns/node-resolver-tg5c9"
Apr 16 19:54:11.842382 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:11.842358 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/448cba83-b62d-4e46-b69b-e948817d0ec5-tmp-dir\") pod \"node-resolver-tg5c9\" (UID: \"448cba83-b62d-4e46-b69b-e948817d0ec5\") " pod="openshift-dns/node-resolver-tg5c9"
Apr 16 19:54:11.851650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:11.851608 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27d8s\" (UniqueName: \"kubernetes.io/projected/448cba83-b62d-4e46-b69b-e948817d0ec5-kube-api-access-27d8s\") pod \"node-resolver-tg5c9\" (UID: \"448cba83-b62d-4e46-b69b-e948817d0ec5\") " pod="openshift-dns/node-resolver-tg5c9"
Apr 16 19:54:11.972405 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:11.972320 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tg5c9"
Apr 16 19:54:12.179680 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:12.179625 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r"
Apr 16 19:54:12.179853 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:12.179625 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:54:12.179853 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:12.179766 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888"
Apr 16 19:54:12.179985 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:12.179856 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28"
Apr 16 19:54:12.179985 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:12.179643 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:54:12.179985 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:12.179951 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86"
Apr 16 19:54:13.758002 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:13.757963 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:54:13.758481 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:13.758025 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs\") pod \"network-metrics-daemon-8jmq5\" (UID: \"10356841-c032-4d12-8328-dc3aeb909c86\") " pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:54:13.758481 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:13.758102 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:13.758481 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:13.758169 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:13.758481 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:13.758180 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret podName:b7a86d58-955a-4af8-9ae0-c6e786f43b28 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.758161598 +0000 UTC m=+34.172736554 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret") pod "global-pull-secret-syncer-zkxnw" (UID: "b7a86d58-955a-4af8-9ae0-c6e786f43b28") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:13.758481 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:13.758243 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs podName:10356841-c032-4d12-8328-dc3aeb909c86 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.758224206 +0000 UTC m=+34.172799037 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs") pod "network-metrics-daemon-8jmq5" (UID: "10356841-c032-4d12-8328-dc3aeb909c86") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:13.859019 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:13.858981 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sxx9\" (UniqueName: \"kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9\") pod \"network-check-target-d724r\" (UID: \"a0b5f3c5-7848-4283-b7e8-31a5e5f79888\") " pod="openshift-network-diagnostics/network-check-target-d724r"
Apr 16 19:54:13.859204 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:13.859134 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:54:13.859204 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:13.859152 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:54:13.859204 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:13.859162 2569 projected.go:194] Error preparing data for projected volume kube-api-access-4sxx9 for pod openshift-network-diagnostics/network-check-target-d724r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:13.859323 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:13.859218 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9 podName:a0b5f3c5-7848-4283-b7e8-31a5e5f79888 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.859201545 +0000 UTC m=+34.273776406 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-4sxx9" (UniqueName: "kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9") pod "network-check-target-d724r" (UID: "a0b5f3c5-7848-4283-b7e8-31a5e5f79888") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:14.178919 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:14.178831 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:54:14.179086 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:14.178831 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:54:14.179086 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:14.178963 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86"
Apr 16 19:54:14.179086 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:14.179049 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28"
Apr 16 19:54:14.179086 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:14.178831 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r"
Apr 16 19:54:14.179297 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:14.179135 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888"
Apr 16 19:54:15.629193 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:54:15.629044 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod448cba83_b62d_4e46_b69b_e948817d0ec5.slice/crio-d59e2ed689a6646ee3e661e126aaa8a23b846e46e0a941eb3533383b5c0929a2 WatchSource:0}: Error finding container d59e2ed689a6646ee3e661e126aaa8a23b846e46e0a941eb3533383b5c0929a2: Status 404 returned error can't find the container with id d59e2ed689a6646ee3e661e126aaa8a23b846e46e0a941eb3533383b5c0929a2
Apr 16 19:54:16.179734 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.179500 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:54:16.179877 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.179591 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r"
Apr 16 19:54:16.179877 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.179617 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:54:16.179877 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:16.179840 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86"
Apr 16 19:54:16.180045 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:16.179909 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888"
Apr 16 19:54:16.180045 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:16.180023 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28"
Apr 16 19:54:16.323941 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.323902 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-km4f6" event={"ID":"1c49a39d-084e-4d78-9c37-a08591619477","Type":"ContainerStarted","Data":"1b06f1d396b4fe456e4aadd02515e20bfda002d2e4ca43cc9ecc7b64182f7fbf"}
Apr 16 19:54:16.325649 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.325590 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" event={"ID":"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8","Type":"ContainerStarted","Data":"19f59db22313e172d8dc76c715c59704e6df05f8a93b3e5f7cbc113434f0ef98"}
Apr 16 19:54:16.330368 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.328656 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fphk8" event={"ID":"1b5d2585-0759-49e0-8726-9b1f8902ebcf","Type":"ContainerStarted","Data":"5990287f175c150811e555a76138f3f0de2725b32e6b4b4bec0dc30b9532950c"}
Apr 16 19:54:16.332060 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.332028 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pxv56" event={"ID":"60894bbb-9d97-4deb-b1de-d69609701101","Type":"ContainerStarted","Data":"d8555fc8203609ec47dc14c314d2880ab96980e7c88dba312031ae97e82ca488"}
Apr 16 19:54:16.335706 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.335652 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" event={"ID":"c61ab98e-2fe3-48e1-b144-0d44e1856354","Type":"ContainerStarted","Data":"a0821d58a092cb4afc9cb195df8c002c1ec2bf7be0625ab5b7a173f7bc32f519"}
Apr 16 19:54:16.335838 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.335715 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" event={"ID":"c61ab98e-2fe3-48e1-b144-0d44e1856354","Type":"ContainerStarted","Data":"18c93d529ec33540b9b3fafbb396c184978fad2277644dc2c99d888ec017d2b7"}
Apr 16 19:54:16.335838 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.335730 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" event={"ID":"c61ab98e-2fe3-48e1-b144-0d44e1856354","Type":"ContainerStarted","Data":"ef67577f11c7d6b969fb65bfdc6b47fd710d735d02c8b43ef6073f1915cce981"}
Apr 16 19:54:16.335838 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.335753 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" event={"ID":"c61ab98e-2fe3-48e1-b144-0d44e1856354","Type":"ContainerStarted","Data":"d45dd38e25f48618c3f3f61e3c517a52b31383d6f5ba859c7cf62472657011d2"}
Apr 16 19:54:16.335838 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.335768 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" event={"ID":"c61ab98e-2fe3-48e1-b144-0d44e1856354","Type":"ContainerStarted","Data":"38393866cb82135dc620383ae99e885b491147a91b696731d460f728b10f7b67"}
Apr 16 19:54:16.335838 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.335812 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" event={"ID":"c61ab98e-2fe3-48e1-b144-0d44e1856354","Type":"ContainerStarted","Data":"abfc966872b22b4a727e51cd7bb7d4247a5888eba427b7057057ab0f309b9335"}
Apr 16 19:54:16.337504 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.337475 2569 generic.go:358] "Generic (PLEG): container finished" podID="f9d926bb-1dbb-44e0-981e-4bc43df8b1e0" containerID="f0f34961a902edbf8256f4f597f61c96124bb1dc27bae49331976c6237b2f321" exitCode=0
Apr 16 19:54:16.337662 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.337570 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2987n" event={"ID":"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0","Type":"ContainerDied","Data":"f0f34961a902edbf8256f4f597f61c96124bb1dc27bae49331976c6237b2f321"}
Apr 16 19:54:16.339148 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.339115 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qcjrs" event={"ID":"89ed11af-8c8e-432e-9b8a-3696d6697184","Type":"ContainerStarted","Data":"ada80479774b8b3ea13c3438baf89fe20f85419cb10a9d4040b9f19165f0a21f"}
Apr 16 19:54:16.341246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.340817 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-km4f6" podStartSLOduration=3.442502586 podStartE2EDuration="20.340799528s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.731711941 +0000 UTC m=+3.146286772" lastFinishedPulling="2026-04-16 19:54:15.630008885 +0000 UTC m=+20.044583714" observedRunningTime="2026-04-16 19:54:16.340187265 +0000 UTC m=+20.754762104" watchObservedRunningTime="2026-04-16 19:54:16.340799528 +0000 UTC m=+20.755374381"
Apr 16 19:54:16.341246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.341221 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tg5c9" event={"ID":"448cba83-b62d-4e46-b69b-e948817d0ec5","Type":"ContainerStarted","Data":"2095c58ab5793d4bba8b65927144841dea4cfd2d2b87e19698bfd3626582ede7"}
Apr 16 19:54:16.341489 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.341254 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tg5c9" event={"ID":"448cba83-b62d-4e46-b69b-e948817d0ec5","Type":"ContainerStarted","Data":"d59e2ed689a6646ee3e661e126aaa8a23b846e46e0a941eb3533383b5c0929a2"}
Apr 16 19:54:16.393131 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.393065 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-pxv56" podStartSLOduration=3.502101014 podStartE2EDuration="20.393045865s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.73450502 +0000 UTC m=+3.149079849" lastFinishedPulling="2026-04-16 19:54:15.625449861 +0000 UTC m=+20.040024700" observedRunningTime="2026-04-16 19:54:16.381282427 +0000 UTC m=+20.795857275" watchObservedRunningTime="2026-04-16 19:54:16.393045865 +0000 UTC m=+20.807620714"
Apr 16 19:54:16.393326 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.393201 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fphk8" podStartSLOduration=8.094557847 podStartE2EDuration="20.39319347s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.724994167 +0000 UTC m=+3.139568993" lastFinishedPulling="2026-04-16 19:54:11.023629778 +0000 UTC m=+15.438204616" observedRunningTime="2026-04-16 19:54:16.392946567 +0000 UTC m=+20.807521439" watchObservedRunningTime="2026-04-16 19:54:16.39319347 +0000 UTC m=+20.807768318"
Apr 16 19:54:16.407403 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.407348 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-qcjrs" podStartSLOduration=3.51486415 podStartE2EDuration="20.407330187s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.730726541 +0000 UTC m=+3.145301368" lastFinishedPulling="2026-04-16 19:54:15.62319258 +0000 UTC m=+20.037767405" observedRunningTime="2026-04-16 19:54:16.406705732 +0000 UTC m=+20.821280591" watchObservedRunningTime="2026-04-16 19:54:16.407330187 +0000 UTC m=+20.821905036"
Apr 16 19:54:16.776013 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:16.775804 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 19:54:17.075879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:17.075706 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:54:16.776007277Z","UUID":"e4301708-8b4b-4fc6-aa8a-f405122a4e33","Handler":null,"Name":"","Endpoint":""}
Apr 16 19:54:17.079255 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:17.079227 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 19:54:17.079428 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:17.079265 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 19:54:17.344676 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:17.344591 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-b4wgn" event={"ID":"5bed9951-49d1-4612-b89d-05332f7e56e2","Type":"ContainerStarted","Data":"4f9a835fbd7ab15d96ec962d4f0b2417e8c85bace666e3a992c8b584d2fecccd"}
Apr 16 19:54:17.346460 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:17.346389 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" event={"ID":"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8","Type":"ContainerStarted","Data":"b17580737136f1e6be82be5f84610a2e810a736de64a36da2fe4071cea5a553f"}
Apr 16 19:54:17.356068 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:17.356016 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tg5c9" podStartSLOduration=6.355994332 podStartE2EDuration="6.355994332s" podCreationTimestamp="2026-04-16 19:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:16.422022033 +0000 UTC m=+20.836596881" watchObservedRunningTime="2026-04-16 19:54:17.355994332 +0000 UTC m=+21.770569182"
Apr 16 19:54:17.356616 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:17.356562 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-b4wgn" podStartSLOduration=4.460518533 podStartE2EDuration="21.356548052s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.727196132 +0000 UTC m=+3.141770958" lastFinishedPulling="2026-04-16 19:54:15.623225645 +0000 UTC m=+20.037800477" observedRunningTime="2026-04-16 19:54:17.356260302 +0000 UTC m=+21.770835151" watchObservedRunningTime="2026-04-16 19:54:17.356548052 +0000 UTC m=+21.771122898"
Apr 16 19:54:18.178926 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:18.178827 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r"
Apr 16 19:54:18.178926 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:18.178850 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:54:18.179713 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:18.178964 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888"
Apr 16 19:54:18.179713 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:18.178993 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:54:18.179713 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:18.179098 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86"
Apr 16 19:54:18.179713 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:18.179172 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28"
Apr 16 19:54:18.351030 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:18.350990 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" event={"ID":"e76bbd13-39a5-4247-a1ea-1de5bd9d98d8","Type":"ContainerStarted","Data":"914f950d036faa14f552a2b599ad2f3de5c911a591be541e04cc260585d66323"}
Apr 16 19:54:18.365693 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:18.365628 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zdcwd" podStartSLOduration=3.397857438 podStartE2EDuration="22.365609903s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.728184133 +0000 UTC m=+3.142758963" lastFinishedPulling="2026-04-16 19:54:17.695936598 +0000 UTC m=+22.110511428" observedRunningTime="2026-04-16 19:54:18.365111438 +0000 UTC m=+22.779686308" watchObservedRunningTime="2026-04-16 19:54:18.365609903 +0000 UTC m=+22.780184749"
Apr 16 19:54:18.986690 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:18.986651 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-qcjrs"
Apr 16 19:54:18.987408 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:18.987385 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-qcjrs"
Apr 16 19:54:19.356245 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:19.356160 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" event={"ID":"c61ab98e-2fe3-48e1-b144-0d44e1856354","Type":"ContainerStarted","Data":"13a4f9fbebffd6bae33c4e672c2c2866814ca3ed3de224248593c3ba41987c5a"}
Apr 16 19:54:20.179287 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:20.179250 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r"
Apr 16 19:54:20.179468 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:20.179250 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:54:20.179468 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:20.179362 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888"
Apr 16 19:54:20.179468 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:20.179251 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:54:20.179468 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:20.179451 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28"
Apr 16 19:54:20.179690 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:20.179571 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86"
Apr 16 19:54:21.362922 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:21.362709 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" event={"ID":"c61ab98e-2fe3-48e1-b144-0d44e1856354","Type":"ContainerStarted","Data":"7564430a46a56b92c03ba92201a4e53ed78e429c78230a59bf13114046e2893e"}
Apr 16 19:54:21.363664 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:21.362936 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:54:21.363664 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:21.363035 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:54:21.364511 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:21.364484 2569 generic.go:358] "Generic (PLEG): container finished" podID="f9d926bb-1dbb-44e0-981e-4bc43df8b1e0" containerID="db3704c8d451e293e06b89dbca9c15cf3546bd87fca3beb828d316fcef445d36" exitCode=0
Apr 16 19:54:21.364628 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:21.364525 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2987n" event={"ID":"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0","Type":"ContainerDied","Data":"db3704c8d451e293e06b89dbca9c15cf3546bd87fca3beb828d316fcef445d36"}
Apr 16 19:54:21.379213 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:21.379184 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:54:21.379372 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:21.379255 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:54:21.389750 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:21.389707 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" podStartSLOduration=8.276622615 podStartE2EDuration="25.389694139s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:53:58.733268813 +0000 UTC m=+3.147843650" lastFinishedPulling="2026-04-16 19:54:15.846340334 +0000 UTC m=+20.260915174" observedRunningTime="2026-04-16 19:54:21.38951053 +0000 UTC m=+25.804085377" watchObservedRunningTime="2026-04-16 19:54:21.389694139 +0000 UTC m=+25.804268984"
Apr 16 19:54:22.178856 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:22.178829 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r"
Apr 16 19:54:22.178997 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:22.178859 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:54:22.178997 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:22.178826 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:54:22.178997 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:22.178983 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888"
Apr 16 19:54:22.179184 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:22.179073 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86"
Apr 16 19:54:22.179184 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:22.179158 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28"
Apr 16 19:54:22.368498 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:22.368420 2569 generic.go:358] "Generic (PLEG): container finished" podID="f9d926bb-1dbb-44e0-981e-4bc43df8b1e0" containerID="196b2c3becb34e2f5302948159421256dfdaac9b3bf7974a76c9bac22b14c6a5" exitCode=0
Apr 16 19:54:22.368498 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:22.368460 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2987n" event={"ID":"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0","Type":"ContainerDied","Data":"196b2c3becb34e2f5302948159421256dfdaac9b3bf7974a76c9bac22b14c6a5"}
Apr 16 19:54:22.368925 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:22.368723 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 19:54:22.576480 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:22.576299 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x"
Apr 16 19:54:22.582770 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:22.582738 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zkxnw"]
Apr 16 19:54:22.582908 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:22.582883 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:54:22.583018 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:22.582993 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28"
Apr 16 19:54:22.585373 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:22.585347 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-d724r"]
Apr 16 19:54:22.585502 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:22.585475 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r"
Apr 16 19:54:22.585610 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:22.585572 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888"
Apr 16 19:54:22.598746 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:22.598710 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8jmq5"]
Apr 16 19:54:22.598887 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:22.598868 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:54:22.598982 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:22.598961 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86"
Apr 16 19:54:23.372623 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:23.372557 2569 generic.go:358] "Generic (PLEG): container finished" podID="f9d926bb-1dbb-44e0-981e-4bc43df8b1e0" containerID="e42ce3d63cb152848874f1ed800dad7bcc80f8baa5912eef00c9e70cfbe4953d" exitCode=0
Apr 16 19:54:23.372623 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:23.372617 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2987n" event={"ID":"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0","Type":"ContainerDied","Data":"e42ce3d63cb152848874f1ed800dad7bcc80f8baa5912eef00c9e70cfbe4953d"}
Apr 16 19:54:24.179115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:24.179031 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r"
Apr 16 19:54:24.179115 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:24.179073 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:54:24.179331 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:24.179157 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888"
Apr 16 19:54:24.179331 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:24.179250 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86" Apr 16 19:54:24.179331 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:24.179314 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:54:24.179477 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:24.179390 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28" Apr 16 19:54:26.180207 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:26.180169 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5" Apr 16 19:54:26.181039 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:26.180283 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r" Apr 16 19:54:26.181039 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:26.180318 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:54:26.181039 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:26.180353 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86" Apr 16 19:54:26.181039 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:26.180395 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28" Apr 16 19:54:26.181039 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:26.180444 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888" Apr 16 19:54:26.330838 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:26.330794 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-qcjrs" Apr 16 19:54:26.331008 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:26.330957 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 19:54:26.331471 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:26.331452 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-qcjrs" Apr 16 19:54:28.179557 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.179518 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5" Apr 16 19:54:28.180059 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.179518 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw" Apr 16 19:54:28.180059 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.179518 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r" Apr 16 19:54:28.180163 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.180136 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zkxnw" podUID="b7a86d58-955a-4af8-9ae0-c6e786f43b28" Apr 16 19:54:28.181185 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.180716 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8jmq5" podUID="10356841-c032-4d12-8328-dc3aeb909c86" Apr 16 19:54:28.181392 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.181357 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d724r" podUID="a0b5f3c5-7848-4283-b7e8-31a5e5f79888" Apr 16 19:54:28.428090 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.428061 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-205.ec2.internal" event="NodeReady" Apr 16 19:54:28.428267 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.428220 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 19:54:28.465810 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.465708 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl"] Apr 16 19:54:28.498732 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.498701 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hj2q8"] Apr 16 19:54:28.498881 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.498857 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" Apr 16 19:54:28.501389 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.501360 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:28.501547 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.501526 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 19:54:28.501678 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.501657 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:28.501813 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.501659 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-mh28h\"" Apr 16 19:54:28.515249 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.515126 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb"] Apr 16 19:54:28.516192 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.515415 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hj2q8" Apr 16 19:54:28.518834 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.518769 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-ql7tm\"" Apr 16 19:54:28.519437 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.519158 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:28.519437 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.519370 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:28.530474 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.530436 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq"] Apr 16 19:54:28.530628 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.530610 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" Apr 16 19:54:28.533291 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.533262 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:28.533436 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.533342 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:28.533550 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.533458 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 19:54:28.533626 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.533588 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 19:54:28.534383 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.534332 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-6h8gm\"" Apr 16 19:54:28.548809 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.548768 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf"] Apr 16 19:54:28.549116 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.549092 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" Apr 16 19:54:28.551231 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.551207 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:28.551495 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.551476 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:28.551495 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.551474 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 19:54:28.551657 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.551634 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-szk5z\"" Apr 16 19:54:28.551755 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.551736 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 19:54:28.566660 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.566633 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-k4s7x"] Apr 16 19:54:28.566818 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.566783 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" Apr 16 19:54:28.568897 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.568875 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 19:54:28.569032 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.568887 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 19:54:28.569032 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.568915 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 19:54:28.569136 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.569040 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 19:54:28.569136 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.569060 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-nmmmf\"" Apr 16 19:54:28.571366 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.571339 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bx9n\" (UniqueName: \"kubernetes.io/projected/04afe8e9-f130-46a2-86e8-733e2a491106-kube-api-access-7bx9n\") pod \"cluster-samples-operator-6dc5bdb6b4-gqzdl\" (UID: \"04afe8e9-f130-46a2-86e8-733e2a491106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" Apr 16 19:54:28.571497 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.571448 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls\") pod 
\"cluster-samples-operator-6dc5bdb6b4-gqzdl\" (UID: \"04afe8e9-f130-46a2-86e8-733e2a491106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" Apr 16 19:54:28.585596 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.585543 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-67db9885f4-znz5z"] Apr 16 19:54:28.585766 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.585741 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.587617 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.587592 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 19:54:28.587617 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.587601 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 19:54:28.588011 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.587918 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 19:54:28.588011 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.587983 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 19:54:28.588170 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.587986 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-89bzp\"" Apr 16 19:54:28.593665 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.593638 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 19:54:28.603989 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.603916 2569 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-ingress/router-default-77c5ff64cd-kkjnk"] Apr 16 19:54:28.604156 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.604073 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.606244 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.606220 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 19:54:28.606601 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.606298 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 19:54:28.606719 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.606349 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 19:54:28.606840 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.606364 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lq4lt\"" Apr 16 19:54:28.612344 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.612316 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 19:54:28.619153 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.619125 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-cjrs6"] Apr 16 19:54:28.619292 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.619279 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:28.621235 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.621188 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 19:54:28.621235 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.621231 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-s6qtx\"" Apr 16 19:54:28.621429 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.621262 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 19:54:28.621753 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.621730 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 19:54:28.621753 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.621751 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 19:54:28.622497 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.622474 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 19:54:28.622612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.622514 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 19:54:28.638343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.638315 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb"] Apr 16 19:54:28.638343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.638348 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl"] Apr 16 
19:54:28.638550 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.638361 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hj2q8"] Apr 16 19:54:28.638550 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.638375 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-wjmcc"] Apr 16 19:54:28.638550 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.638494 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:28.642948 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.642802 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:28.642948 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.642853 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:28.643266 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.643125 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-p6vhl\"" Apr 16 19:54:28.643266 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.643186 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 19:54:28.643556 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.643539 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 19:54:28.648111 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.648084 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 19:54:28.658538 ip-10-0-139-205 
kubenswrapper[2569]: I0416 19:54:28.658500 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf"] Apr 16 19:54:28.658538 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.658535 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-k4s7x"] Apr 16 19:54:28.658770 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.658549 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq"] Apr 16 19:54:28.658770 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.658564 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-77c5ff64cd-kkjnk"] Apr 16 19:54:28.658770 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.658598 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq"] Apr 16 19:54:28.658770 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.658690 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjmcc" Apr 16 19:54:28.661201 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.661154 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-rfz4j\"" Apr 16 19:54:28.661326 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.661204 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 19:54:28.661326 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.661244 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:54:28.672123 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672084 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bx9n\" (UniqueName: \"kubernetes.io/projected/04afe8e9-f130-46a2-86e8-733e2a491106-kube-api-access-7bx9n\") pod \"cluster-samples-operator-6dc5bdb6b4-gqzdl\" (UID: \"04afe8e9-f130-46a2-86e8-733e2a491106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" Apr 16 19:54:28.672281 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672135 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx5ch\" (UniqueName: \"kubernetes.io/projected/95f11675-707c-4777-8e40-73a4b72aadc9-kube-api-access-qx5ch\") pod \"service-ca-operator-d6fc45fc5-vh7nb\" (UID: \"95f11675-707c-4777-8e40-73a4b72aadc9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" Apr 16 19:54:28.672281 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672167 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blr2p\" (UniqueName: 
\"kubernetes.io/projected/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-kube-api-access-blr2p\") pod \"cluster-monitoring-operator-75587bd455-49lhf\" (UID: \"e85eca51-eae1-4ef0-b008-1a1e1b796b4c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" Apr 16 19:54:28.672281 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672198 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95f11675-707c-4777-8e40-73a4b72aadc9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-vh7nb\" (UID: \"95f11675-707c-4777-8e40-73a4b72aadc9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" Apr 16 19:54:28.672281 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672222 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q5lq\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-kube-api-access-2q5lq\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.672281 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672247 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c7b46a8f-9a8f-42e0-971b-334f467cc56f-snapshots\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.672492 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672293 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0704cd-b28c-4d5b-9e72-79fcd84527b4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jxvqq\" (UID: 
\"9a0704cd-b28c-4d5b-9e72-79fcd84527b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" Apr 16 19:54:28.672492 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672333 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95f11675-707c-4777-8e40-73a4b72aadc9-config\") pod \"service-ca-operator-d6fc45fc5-vh7nb\" (UID: \"95f11675-707c-4777-8e40-73a4b72aadc9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" Apr 16 19:54:28.672492 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672359 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/015a1a89-29e1-449f-b569-19b3cce360b4-image-registry-private-configuration\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.672492 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672414 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/015a1a89-29e1-449f-b569-19b3cce360b4-ca-trust-extracted\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.672492 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672470 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/015a1a89-29e1-449f-b569-19b3cce360b4-installation-pull-secrets\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " 
pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.672755 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672498 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-49lhf\" (UID: \"e85eca51-eae1-4ef0-b008-1a1e1b796b4c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" Apr 16 19:54:28.672755 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672529 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7b46a8f-9a8f-42e0-971b-334f467cc56f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.672755 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672545 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7b46a8f-9a8f-42e0-971b-334f467cc56f-serving-cert\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.672755 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672597 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pkf9\" (UniqueName: \"kubernetes.io/projected/aafa3559-b7eb-410f-91aa-abf590bd5c4a-kube-api-access-2pkf9\") pod \"volume-data-source-validator-7c6cbb6c87-hj2q8\" (UID: \"aafa3559-b7eb-410f-91aa-abf590bd5c4a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hj2q8" Apr 16 19:54:28.672755 ip-10-0-139-205 
kubenswrapper[2569]: I0416 19:54:28.672621 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.672755 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672644 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/015a1a89-29e1-449f-b569-19b3cce360b4-registry-certificates\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.672755 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672681 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-bound-sa-token\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.672755 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672720 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqzdl\" (UID: \"04afe8e9-f130-46a2-86e8-733e2a491106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" Apr 16 19:54:28.673154 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672766 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w5s8\" 
(UniqueName: \"kubernetes.io/projected/c7b46a8f-9a8f-42e0-971b-334f467cc56f-kube-api-access-4w5s8\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.673154 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672798 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jvnz\" (UniqueName: \"kubernetes.io/projected/9a0704cd-b28c-4d5b-9e72-79fcd84527b4-kube-api-access-4jvnz\") pod \"kube-storage-version-migrator-operator-6769c5d45-jxvqq\" (UID: \"9a0704cd-b28c-4d5b-9e72-79fcd84527b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" Apr 16 19:54:28.673154 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.672804 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:28.673154 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672863 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-49lhf\" (UID: \"e85eca51-eae1-4ef0-b008-1a1e1b796b4c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" Apr 16 19:54:28.673154 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.672894 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls podName:04afe8e9-f130-46a2-86e8-733e2a491106 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.172878707 +0000 UTC m=+33.587453534 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gqzdl" (UID: "04afe8e9-f130-46a2-86e8-733e2a491106") : secret "samples-operator-tls" not found Apr 16 19:54:28.673154 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.672985 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c7b46a8f-9a8f-42e0-971b-334f467cc56f-tmp\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.673154 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.673012 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a0704cd-b28c-4d5b-9e72-79fcd84527b4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jxvqq\" (UID: \"9a0704cd-b28c-4d5b-9e72-79fcd84527b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" Apr 16 19:54:28.673154 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.673057 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/015a1a89-29e1-449f-b569-19b3cce360b4-trusted-ca\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.673154 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.673082 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7b46a8f-9a8f-42e0-971b-334f467cc56f-service-ca-bundle\") pod 
\"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.680609 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.680557 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-wjmcc"] Apr 16 19:54:28.680609 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.680614 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq"] Apr 16 19:54:28.680825 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.680629 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-cjrs6"] Apr 16 19:54:28.680825 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.680643 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67db9885f4-znz5z"] Apr 16 19:54:28.680825 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.680656 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-k262c"] Apr 16 19:54:28.680825 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.680731 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq" Apr 16 19:54:28.682779 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.682564 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 19:54:28.682939 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.682654 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 19:54:28.682939 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.682687 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-x7lzn\"" Apr 16 19:54:28.686103 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.686061 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bx9n\" (UniqueName: \"kubernetes.io/projected/04afe8e9-f130-46a2-86e8-733e2a491106-kube-api-access-7bx9n\") pod \"cluster-samples-operator-6dc5bdb6b4-gqzdl\" (UID: \"04afe8e9-f130-46a2-86e8-733e2a491106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" Apr 16 19:54:28.703265 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.703235 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-w77xr"] Apr 16 19:54:28.703437 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.703387 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k262c" Apr 16 19:54:28.706859 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.706834 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 19:54:28.707002 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.706936 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xfkbf\"" Apr 16 19:54:28.707277 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.707240 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 19:54:28.707421 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.707387 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 19:54:28.717672 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.717527 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k262c"] Apr 16 19:54:28.717672 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.717555 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w77xr"] Apr 16 19:54:28.717672 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.717666 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-w77xr" Apr 16 19:54:28.721026 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.720998 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-668h7\"" Apr 16 19:54:28.721178 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.721152 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 19:54:28.721294 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.721210 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 19:54:28.773724 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.773669 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-49lhf\" (UID: \"e85eca51-eae1-4ef0-b008-1a1e1b796b4c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" Apr 16 19:54:28.773724 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.773721 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7b46a8f-9a8f-42e0-971b-334f467cc56f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.773988 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.773752 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkl7l\" (UniqueName: \"kubernetes.io/projected/8bf6ef5e-33ca-46d3-84d9-c703a6a9dea4-kube-api-access-nkl7l\") pod \"network-check-source-8894fc9bd-wjmcc\" (UID: \"8bf6ef5e-33ca-46d3-84d9-c703a6a9dea4\") " 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjmcc" Apr 16 19:54:28.773988 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.773798 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.773988 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.773820 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:28.773988 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.773825 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-rk7vq\" (UID: \"8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq" Apr 16 19:54:28.773988 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.773923 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls podName:e85eca51-eae1-4ef0-b008-1a1e1b796b4c nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.273897182 +0000 UTC m=+33.688472020 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-49lhf" (UID: "e85eca51-eae1-4ef0-b008-1a1e1b796b4c") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:28.773988 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.773958 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c7b46a8f-9a8f-42e0-971b-334f467cc56f-tmp\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.774264 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.773994 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:28.774264 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.774006 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67db9885f4-znz5z: secret "image-registry-tls" not found Apr 16 19:54:28.774264 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.774047 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls podName:015a1a89-29e1-449f-b569-19b3cce360b4 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.274033456 +0000 UTC m=+33.688608285 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls") pod "image-registry-67db9885f4-znz5z" (UID: "015a1a89-29e1-449f-b569-19b3cce360b4") : secret "image-registry-tls" not found Apr 16 19:54:28.774264 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774085 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pkf9\" (UniqueName: \"kubernetes.io/projected/aafa3559-b7eb-410f-91aa-abf590bd5c4a-kube-api-access-2pkf9\") pod \"volume-data-source-validator-7c6cbb6c87-hj2q8\" (UID: \"aafa3559-b7eb-410f-91aa-abf590bd5c4a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hj2q8" Apr 16 19:54:28.774264 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774117 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rk7vq\" (UID: \"8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq" Apr 16 19:54:28.774264 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774149 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:28.774264 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774178 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0704cd-b28c-4d5b-9e72-79fcd84527b4-config\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-jxvqq\" (UID: \"9a0704cd-b28c-4d5b-9e72-79fcd84527b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" Apr 16 19:54:28.774264 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774222 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/015a1a89-29e1-449f-b569-19b3cce360b4-ca-trust-extracted\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.774264 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774262 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-default-certificate\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:28.774687 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774290 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert\") pod \"ingress-canary-k262c\" (UID: \"994f79b6-31dc-4b4f-8c42-e6e60bee90cf\") " pod="openshift-ingress-canary/ingress-canary-k262c" Apr 16 19:54:28.774687 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774320 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95f11675-707c-4777-8e40-73a4b72aadc9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-vh7nb\" (UID: \"95f11675-707c-4777-8e40-73a4b72aadc9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" Apr 16 19:54:28.774687 ip-10-0-139-205 
kubenswrapper[2569]: I0416 19:54:28.774346 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4pns\" (UniqueName: \"kubernetes.io/projected/ce138de6-668e-4e27-b7d0-579a176ea2f2-kube-api-access-z4pns\") pod \"console-operator-9d4b6777b-cjrs6\" (UID: \"ce138de6-668e-4e27-b7d0-579a176ea2f2\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:28.774687 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774381 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95f11675-707c-4777-8e40-73a4b72aadc9-config\") pod \"service-ca-operator-d6fc45fc5-vh7nb\" (UID: \"95f11675-707c-4777-8e40-73a4b72aadc9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" Apr 16 19:54:28.774687 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774418 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/015a1a89-29e1-449f-b569-19b3cce360b4-image-registry-private-configuration\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.774687 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774467 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/015a1a89-29e1-449f-b569-19b3cce360b4-installation-pull-secrets\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.774687 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774520 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxbkg\" (UniqueName: 
\"kubernetes.io/projected/a0b4741a-f02e-48e4-a491-d2ae897b44dd-kube-api-access-nxbkg\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:28.774687 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774546 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7b46a8f-9a8f-42e0-971b-334f467cc56f-serving-cert\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.774687 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774564 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/015a1a89-29e1-449f-b569-19b3cce360b4-ca-trust-extracted\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.774687 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774634 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-bound-sa-token\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.775151 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774708 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/015a1a89-29e1-449f-b569-19b3cce360b4-registry-certificates\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.775151 ip-10-0-139-205 
kubenswrapper[2569]: I0416 19:54:28.774739 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jvnz\" (UniqueName: \"kubernetes.io/projected/9a0704cd-b28c-4d5b-9e72-79fcd84527b4-kube-api-access-4jvnz\") pod \"kube-storage-version-migrator-operator-6769c5d45-jxvqq\" (UID: \"9a0704cd-b28c-4d5b-9e72-79fcd84527b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" Apr 16 19:54:28.775151 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774764 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce138de6-668e-4e27-b7d0-579a176ea2f2-config\") pod \"console-operator-9d4b6777b-cjrs6\" (UID: \"ce138de6-668e-4e27-b7d0-579a176ea2f2\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:28.775151 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774804 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4w5s8\" (UniqueName: \"kubernetes.io/projected/c7b46a8f-9a8f-42e0-971b-334f467cc56f-kube-api-access-4w5s8\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.775151 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774833 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-49lhf\" (UID: \"e85eca51-eae1-4ef0-b008-1a1e1b796b4c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" Apr 16 19:54:28.775151 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774880 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9a0704cd-b28c-4d5b-9e72-79fcd84527b4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jxvqq\" (UID: \"9a0704cd-b28c-4d5b-9e72-79fcd84527b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" Apr 16 19:54:28.775151 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774893 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0704cd-b28c-4d5b-9e72-79fcd84527b4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jxvqq\" (UID: \"9a0704cd-b28c-4d5b-9e72-79fcd84527b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" Apr 16 19:54:28.775151 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774909 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:28.775151 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774951 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-stats-auth\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:28.775151 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.774979 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/015a1a89-29e1-449f-b569-19b3cce360b4-trusted-ca\") pod \"image-registry-67db9885f4-znz5z\" (UID: 
\"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.775151 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.775018 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2q5lq\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-kube-api-access-2q5lq\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.775151 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.775037 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95f11675-707c-4777-8e40-73a4b72aadc9-config\") pod \"service-ca-operator-d6fc45fc5-vh7nb\" (UID: \"95f11675-707c-4777-8e40-73a4b72aadc9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" Apr 16 19:54:28.775151 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.775077 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blr2p\" (UniqueName: \"kubernetes.io/projected/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-kube-api-access-blr2p\") pod \"cluster-monitoring-operator-75587bd455-49lhf\" (UID: \"e85eca51-eae1-4ef0-b008-1a1e1b796b4c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" Apr 16 19:54:28.775837 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.775383 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/015a1a89-29e1-449f-b569-19b3cce360b4-registry-certificates\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.775837 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.775780 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qx5ch\" (UniqueName: \"kubernetes.io/projected/95f11675-707c-4777-8e40-73a4b72aadc9-kube-api-access-qx5ch\") pod \"service-ca-operator-d6fc45fc5-vh7nb\" (UID: \"95f11675-707c-4777-8e40-73a4b72aadc9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" Apr 16 19:54:28.775837 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.775828 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c7b46a8f-9a8f-42e0-971b-334f467cc56f-snapshots\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.775986 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.775860 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce138de6-668e-4e27-b7d0-579a176ea2f2-serving-cert\") pod \"console-operator-9d4b6777b-cjrs6\" (UID: \"ce138de6-668e-4e27-b7d0-579a176ea2f2\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:28.776356 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.776248 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2fgl\" (UniqueName: \"kubernetes.io/projected/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-kube-api-access-g2fgl\") pod \"ingress-canary-k262c\" (UID: \"994f79b6-31dc-4b4f-8c42-e6e60bee90cf\") " pod="openshift-ingress-canary/ingress-canary-k262c" Apr 16 19:54:28.776356 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.776294 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7b46a8f-9a8f-42e0-971b-334f467cc56f-service-ca-bundle\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: 
\"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.776530 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.776430 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/015a1a89-29e1-449f-b569-19b3cce360b4-trusted-ca\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.776530 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.776438 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce138de6-668e-4e27-b7d0-579a176ea2f2-trusted-ca\") pod \"console-operator-9d4b6777b-cjrs6\" (UID: \"ce138de6-668e-4e27-b7d0-579a176ea2f2\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:28.777023 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.776958 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-49lhf\" (UID: \"e85eca51-eae1-4ef0-b008-1a1e1b796b4c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" Apr 16 19:54:28.778171 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.778108 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95f11675-707c-4777-8e40-73a4b72aadc9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-vh7nb\" (UID: \"95f11675-707c-4777-8e40-73a4b72aadc9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" Apr 16 19:54:28.778171 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.778147 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/015a1a89-29e1-449f-b569-19b3cce360b4-installation-pull-secrets\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.778386 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.778366 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/015a1a89-29e1-449f-b569-19b3cce360b4-image-registry-private-configuration\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.778976 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.778950 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c7b46a8f-9a8f-42e0-971b-334f467cc56f-tmp\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.779296 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.779272 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c7b46a8f-9a8f-42e0-971b-334f467cc56f-snapshots\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.779452 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.779411 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7b46a8f-9a8f-42e0-971b-334f467cc56f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 
19:54:28.779880 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.779853 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7b46a8f-9a8f-42e0-971b-334f467cc56f-service-ca-bundle\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.780951 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.780932 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7b46a8f-9a8f-42e0-971b-334f467cc56f-serving-cert\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: \"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.786695 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.786649 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jvnz\" (UniqueName: \"kubernetes.io/projected/9a0704cd-b28c-4d5b-9e72-79fcd84527b4-kube-api-access-4jvnz\") pod \"kube-storage-version-migrator-operator-6769c5d45-jxvqq\" (UID: \"9a0704cd-b28c-4d5b-9e72-79fcd84527b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" Apr 16 19:54:28.786863 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.786828 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-bound-sa-token\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.787615 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.787508 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx5ch\" (UniqueName: 
\"kubernetes.io/projected/95f11675-707c-4777-8e40-73a4b72aadc9-kube-api-access-qx5ch\") pod \"service-ca-operator-d6fc45fc5-vh7nb\" (UID: \"95f11675-707c-4777-8e40-73a4b72aadc9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" Apr 16 19:54:28.787837 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.787784 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q5lq\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-kube-api-access-2q5lq\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:28.788687 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.787953 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a0704cd-b28c-4d5b-9e72-79fcd84527b4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jxvqq\" (UID: \"9a0704cd-b28c-4d5b-9e72-79fcd84527b4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" Apr 16 19:54:28.788992 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.788970 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pkf9\" (UniqueName: \"kubernetes.io/projected/aafa3559-b7eb-410f-91aa-abf590bd5c4a-kube-api-access-2pkf9\") pod \"volume-data-source-validator-7c6cbb6c87-hj2q8\" (UID: \"aafa3559-b7eb-410f-91aa-abf590bd5c4a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hj2q8" Apr 16 19:54:28.789136 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.789089 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w5s8\" (UniqueName: \"kubernetes.io/projected/c7b46a8f-9a8f-42e0-971b-334f467cc56f-kube-api-access-4w5s8\") pod \"insights-operator-585dfdc468-k4s7x\" (UID: 
\"c7b46a8f-9a8f-42e0-971b-334f467cc56f\") " pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.789231 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.789211 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blr2p\" (UniqueName: \"kubernetes.io/projected/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-kube-api-access-blr2p\") pod \"cluster-monitoring-operator-75587bd455-49lhf\" (UID: \"e85eca51-eae1-4ef0-b008-1a1e1b796b4c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" Apr 16 19:54:28.828343 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.828299 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hj2q8" Apr 16 19:54:28.844351 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.844310 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" Apr 16 19:54:28.860178 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.860143 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" Apr 16 19:54:28.877168 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.877128 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce138de6-668e-4e27-b7d0-579a176ea2f2-config\") pod \"console-operator-9d4b6777b-cjrs6\" (UID: \"ce138de6-668e-4e27-b7d0-579a176ea2f2\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:28.877358 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.877184 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:28.877358 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.877209 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-stats-auth\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:28.877358 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.877242 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce138de6-668e-4e27-b7d0-579a176ea2f2-serving-cert\") pod \"console-operator-9d4b6777b-cjrs6\" (UID: \"ce138de6-668e-4e27-b7d0-579a176ea2f2\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:28.877358 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.877302 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-g2fgl\" (UniqueName: \"kubernetes.io/projected/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-kube-api-access-g2fgl\") pod \"ingress-canary-k262c\" (UID: \"994f79b6-31dc-4b4f-8c42-e6e60bee90cf\") " pod="openshift-ingress-canary/ingress-canary-k262c" Apr 16 19:54:28.877358 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.877334 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce138de6-668e-4e27-b7d0-579a176ea2f2-trusted-ca\") pod \"console-operator-9d4b6777b-cjrs6\" (UID: \"ce138de6-668e-4e27-b7d0-579a176ea2f2\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:28.877529 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.877386 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle podName:a0b4741a-f02e-48e4-a491-d2ae897b44dd nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.37736194 +0000 UTC m=+33.791936785 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle") pod "router-default-77c5ff64cd-kkjnk" (UID: "a0b4741a-f02e-48e4-a491-d2ae897b44dd") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:28.878773 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.878066 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkl7l\" (UniqueName: \"kubernetes.io/projected/8bf6ef5e-33ca-46d3-84d9-c703a6a9dea4-kube-api-access-nkl7l\") pod \"network-check-source-8894fc9bd-wjmcc\" (UID: \"8bf6ef5e-33ca-46d3-84d9-c703a6a9dea4\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjmcc" Apr 16 19:54:28.878773 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.878166 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-rk7vq\" (UID: \"8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq" Apr 16 19:54:28.878773 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.878369 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce138de6-668e-4e27-b7d0-579a176ea2f2-config\") pod \"console-operator-9d4b6777b-cjrs6\" (UID: \"ce138de6-668e-4e27-b7d0-579a176ea2f2\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:28.878773 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.878206 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " 
pod="openshift-dns/dns-default-w77xr" Apr 16 19:54:28.878773 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.878547 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rk7vq\" (UID: \"8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq" Apr 16 19:54:28.878773 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.878713 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce138de6-668e-4e27-b7d0-579a176ea2f2-trusted-ca\") pod \"console-operator-9d4b6777b-cjrs6\" (UID: \"ce138de6-668e-4e27-b7d0-579a176ea2f2\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:28.879158 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.878624 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:28.879158 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.878898 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:28.879158 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.878931 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:28.879158 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.879052 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-rk7vq\" (UID: \"8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq" Apr 16 19:54:28.879354 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.879184 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert podName:8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.379162999 +0000 UTC m=+33.793737842 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-rk7vq" (UID: "8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6") : secret "networking-console-plugin-cert" not found Apr 16 19:54:28.879354 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.879226 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs podName:a0b4741a-f02e-48e4-a491-d2ae897b44dd nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.379210192 +0000 UTC m=+33.793785037 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs") pod "router-default-77c5ff64cd-kkjnk" (UID: "a0b4741a-f02e-48e4-a491-d2ae897b44dd") : secret "router-metrics-certs-default" not found Apr 16 19:54:28.881677 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.879223 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chglt\" (UniqueName: \"kubernetes.io/projected/28edf32a-262a-4c91-89da-2c452e8d1152-kube-api-access-chglt\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr" Apr 16 19:54:28.881677 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.879809 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-default-certificate\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:28.881677 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.879841 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert\") pod \"ingress-canary-k262c\" (UID: \"994f79b6-31dc-4b4f-8c42-e6e60bee90cf\") " pod="openshift-ingress-canary/ingress-canary-k262c" Apr 16 19:54:28.881677 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.879875 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/28edf32a-262a-4c91-89da-2c452e8d1152-tmp-dir\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr" Apr 16 19:54:28.881677 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.879916 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4pns\" (UniqueName: \"kubernetes.io/projected/ce138de6-668e-4e27-b7d0-579a176ea2f2-kube-api-access-z4pns\") pod \"console-operator-9d4b6777b-cjrs6\" (UID: \"ce138de6-668e-4e27-b7d0-579a176ea2f2\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:28.881677 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.879971 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxbkg\" (UniqueName: \"kubernetes.io/projected/a0b4741a-f02e-48e4-a491-d2ae897b44dd-kube-api-access-nxbkg\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:28.881677 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.880034 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28edf32a-262a-4c91-89da-2c452e8d1152-config-volume\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr" Apr 16 19:54:28.881677 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.880594 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce138de6-668e-4e27-b7d0-579a176ea2f2-serving-cert\") pod \"console-operator-9d4b6777b-cjrs6\" (UID: \"ce138de6-668e-4e27-b7d0-579a176ea2f2\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:28.881677 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.880717 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:28.881677 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.880780 2569 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert podName:994f79b6-31dc-4b4f-8c42-e6e60bee90cf nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.380763992 +0000 UTC m=+33.795338818 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert") pod "ingress-canary-k262c" (UID: "994f79b6-31dc-4b4f-8c42-e6e60bee90cf") : secret "canary-serving-cert" not found Apr 16 19:54:28.881677 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.880795 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-stats-auth\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:28.882880 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.882854 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-default-certificate\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:28.887951 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.887927 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkl7l\" (UniqueName: \"kubernetes.io/projected/8bf6ef5e-33ca-46d3-84d9-c703a6a9dea4-kube-api-access-nkl7l\") pod \"network-check-source-8894fc9bd-wjmcc\" (UID: \"8bf6ef5e-33ca-46d3-84d9-c703a6a9dea4\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjmcc" Apr 16 19:54:28.888454 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.888375 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2fgl\" (UniqueName: 
\"kubernetes.io/projected/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-kube-api-access-g2fgl\") pod \"ingress-canary-k262c\" (UID: \"994f79b6-31dc-4b4f-8c42-e6e60bee90cf\") " pod="openshift-ingress-canary/ingress-canary-k262c" Apr 16 19:54:28.889191 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.889172 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4pns\" (UniqueName: \"kubernetes.io/projected/ce138de6-668e-4e27-b7d0-579a176ea2f2-kube-api-access-z4pns\") pod \"console-operator-9d4b6777b-cjrs6\" (UID: \"ce138de6-668e-4e27-b7d0-579a176ea2f2\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:28.889916 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.889895 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxbkg\" (UniqueName: \"kubernetes.io/projected/a0b4741a-f02e-48e4-a491-d2ae897b44dd-kube-api-access-nxbkg\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:28.897314 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.897287 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-k4s7x" Apr 16 19:54:28.949482 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.949441 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:28.970264 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.970200 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjmcc" Apr 16 19:54:28.981181 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.981149 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28edf32a-262a-4c91-89da-2c452e8d1152-config-volume\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr" Apr 16 19:54:28.981824 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.981284 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr" Apr 16 19:54:28.981824 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.981348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chglt\" (UniqueName: \"kubernetes.io/projected/28edf32a-262a-4c91-89da-2c452e8d1152-kube-api-access-chglt\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr" Apr 16 19:54:28.981824 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.981401 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/28edf32a-262a-4c91-89da-2c452e8d1152-tmp-dir\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr" Apr 16 19:54:28.981824 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.981710 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/28edf32a-262a-4c91-89da-2c452e8d1152-tmp-dir\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " 
pod="openshift-dns/dns-default-w77xr" Apr 16 19:54:28.981824 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.981800 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28edf32a-262a-4c91-89da-2c452e8d1152-config-volume\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr" Apr 16 19:54:28.981824 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.981819 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:28.982018 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:28.981883 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls podName:28edf32a-262a-4c91-89da-2c452e8d1152 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.481866851 +0000 UTC m=+33.896441678 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls") pod "dns-default-w77xr" (UID: "28edf32a-262a-4c91-89da-2c452e8d1152") : secret "dns-default-metrics-tls" not found Apr 16 19:54:28.989898 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:28.989873 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chglt\" (UniqueName: \"kubernetes.io/projected/28edf32a-262a-4c91-89da-2c452e8d1152-kube-api-access-chglt\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr" Apr 16 19:54:29.186220 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.185507 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqzdl\" (UID: \"04afe8e9-f130-46a2-86e8-733e2a491106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" Apr 16 19:54:29.186220 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.185750 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:29.186220 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.185817 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls podName:04afe8e9-f130-46a2-86e8-733e2a491106 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.185796574 +0000 UTC m=+34.600371420 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gqzdl" (UID: "04afe8e9-f130-46a2-86e8-733e2a491106") : secret "samples-operator-tls" not found
Apr 16 19:54:29.224803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.224723 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-wjmcc"]
Apr 16 19:54:29.226997 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.226970 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hj2q8"]
Apr 16 19:54:29.229482 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:54:29.229447 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bf6ef5e_33ca_46d3_84d9_c703a6a9dea4.slice/crio-a3ed6826dd2e2c19da0629a23f6c5dc3218c83c56afa001ecf8154afefd8fcf3 WatchSource:0}: Error finding container a3ed6826dd2e2c19da0629a23f6c5dc3218c83c56afa001ecf8154afefd8fcf3: Status 404 returned error can't find the container with id a3ed6826dd2e2c19da0629a23f6c5dc3218c83c56afa001ecf8154afefd8fcf3
Apr 16 19:54:29.231592 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:54:29.231553 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaafa3559_b7eb_410f_91aa_abf590bd5c4a.slice/crio-931f4c8209e865584b4c6fa62de627694da5c44bf4749a7499aaf268170312f5 WatchSource:0}: Error finding container 931f4c8209e865584b4c6fa62de627694da5c44bf4749a7499aaf268170312f5: Status 404 returned error can't find the container with id 931f4c8209e865584b4c6fa62de627694da5c44bf4749a7499aaf268170312f5
Apr 16 19:54:29.237277 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.237256 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-k4s7x"]
Apr 16 19:54:29.244760 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.244731 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb"]
Apr 16 19:54:29.246387 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:54:29.246361 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7b46a8f_9a8f_42e0_971b_334f467cc56f.slice/crio-c19586728ce45f5bd409ae6ea04e14a672acd3702d9986f9ea3e26d09260c6e1 WatchSource:0}: Error finding container c19586728ce45f5bd409ae6ea04e14a672acd3702d9986f9ea3e26d09260c6e1: Status 404 returned error can't find the container with id c19586728ce45f5bd409ae6ea04e14a672acd3702d9986f9ea3e26d09260c6e1
Apr 16 19:54:29.246980 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:54:29.246957 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95f11675_707c_4777_8e40_73a4b72aadc9.slice/crio-1e92bb272df190976b493459cd82ba0fd8cf1b1ffd1be561d719a5716ea54a81 WatchSource:0}: Error finding container 1e92bb272df190976b493459cd82ba0fd8cf1b1ffd1be561d719a5716ea54a81: Status 404 returned error can't find the container with id 1e92bb272df190976b493459cd82ba0fd8cf1b1ffd1be561d719a5716ea54a81
Apr 16 19:54:29.259291 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.259251 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq"]
Apr 16 19:54:29.264281 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.261342 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-cjrs6"]
Apr 16 19:54:29.264281 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:54:29.262977 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a0704cd_b28c_4d5b_9e72_79fcd84527b4.slice/crio-cbbac075877882b3c0f3281004563bf9871a13a0029ce2ff82ab379e6cc99707 WatchSource:0}: Error finding container cbbac075877882b3c0f3281004563bf9871a13a0029ce2ff82ab379e6cc99707: Status 404 returned error can't find the container with id cbbac075877882b3c0f3281004563bf9871a13a0029ce2ff82ab379e6cc99707
Apr 16 19:54:29.265957 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:54:29.265932 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce138de6_668e_4e27_b7d0_579a176ea2f2.slice/crio-a97254746e9346e1394e387fd3465f5756ee8fe09628ef15237e111faa6dfd09 WatchSource:0}: Error finding container a97254746e9346e1394e387fd3465f5756ee8fe09628ef15237e111faa6dfd09: Status 404 returned error can't find the container with id a97254746e9346e1394e387fd3465f5756ee8fe09628ef15237e111faa6dfd09
Apr 16 19:54:29.286306 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.286271 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-49lhf\" (UID: \"e85eca51-eae1-4ef0-b008-1a1e1b796b4c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf"
Apr 16 19:54:29.286306 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.286315 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z"
Apr 16 19:54:29.286549 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.286443 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 19:54:29.286549 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.286495 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:54:29.286549 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.286514 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls podName:e85eca51-eae1-4ef0-b008-1a1e1b796b4c nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.286492185 +0000 UTC m=+34.701067025 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-49lhf" (UID: "e85eca51-eae1-4ef0-b008-1a1e1b796b4c") : secret "cluster-monitoring-operator-tls" not found
Apr 16 19:54:29.286549 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.286513 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67db9885f4-znz5z: secret "image-registry-tls" not found
Apr 16 19:54:29.286724 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.286553 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls podName:015a1a89-29e1-449f-b569-19b3cce360b4 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.286544555 +0000 UTC m=+34.701119382 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls") pod "image-registry-67db9885f4-znz5z" (UID: "015a1a89-29e1-449f-b569-19b3cce360b4") : secret "image-registry-tls" not found
Apr 16 19:54:29.386299 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.386263 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hj2q8" event={"ID":"aafa3559-b7eb-410f-91aa-abf590bd5c4a","Type":"ContainerStarted","Data":"931f4c8209e865584b4c6fa62de627694da5c44bf4749a7499aaf268170312f5"}
Apr 16 19:54:29.386882 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.386853 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rk7vq\" (UID: \"8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq"
Apr 16 19:54:29.387017 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.386912 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk"
Apr 16 19:54:29.387017 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.386964 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert\") pod \"ingress-canary-k262c\" (UID: \"994f79b6-31dc-4b4f-8c42-e6e60bee90cf\") " pod="openshift-ingress-canary/ingress-canary-k262c"
Apr 16 19:54:29.387017 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.386985 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 19:54:29.387156 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.387043 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert podName:8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.387024478 +0000 UTC m=+34.801599308 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-rk7vq" (UID: "8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6") : secret "networking-console-plugin-cert" not found
Apr 16 19:54:29.387156 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.387064 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk"
Apr 16 19:54:29.387156 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.387076 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 19:54:29.387156 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.387087 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:54:29.387156 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.387128 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs podName:a0b4741a-f02e-48e4-a491-d2ae897b44dd nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.387111476 +0000 UTC m=+34.801686305 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs") pod "router-default-77c5ff64cd-kkjnk" (UID: "a0b4741a-f02e-48e4-a491-d2ae897b44dd") : secret "router-metrics-certs-default" not found
Apr 16 19:54:29.387156 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.387147 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert podName:994f79b6-31dc-4b4f-8c42-e6e60bee90cf nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.387138038 +0000 UTC m=+34.801712866 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert") pod "ingress-canary-k262c" (UID: "994f79b6-31dc-4b4f-8c42-e6e60bee90cf") : secret "canary-serving-cert" not found
Apr 16 19:54:29.387373 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.387190 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle podName:a0b4741a-f02e-48e4-a491-d2ae897b44dd nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.387178902 +0000 UTC m=+34.801753729 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle") pod "router-default-77c5ff64cd-kkjnk" (UID: "a0b4741a-f02e-48e4-a491-d2ae897b44dd") : configmap references non-existent config key: service-ca.crt
Apr 16 19:54:29.387422 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.387365 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjmcc" event={"ID":"8bf6ef5e-33ca-46d3-84d9-c703a6a9dea4","Type":"ContainerStarted","Data":"a3ed6826dd2e2c19da0629a23f6c5dc3218c83c56afa001ecf8154afefd8fcf3"}
Apr 16 19:54:29.388385 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.388365 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" event={"ID":"ce138de6-668e-4e27-b7d0-579a176ea2f2","Type":"ContainerStarted","Data":"a97254746e9346e1394e387fd3465f5756ee8fe09628ef15237e111faa6dfd09"}
Apr 16 19:54:29.389342 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.389318 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k4s7x" event={"ID":"c7b46a8f-9a8f-42e0-971b-334f467cc56f","Type":"ContainerStarted","Data":"c19586728ce45f5bd409ae6ea04e14a672acd3702d9986f9ea3e26d09260c6e1"}
Apr 16 19:54:29.390188 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.390170 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" event={"ID":"95f11675-707c-4777-8e40-73a4b72aadc9","Type":"ContainerStarted","Data":"1e92bb272df190976b493459cd82ba0fd8cf1b1ffd1be561d719a5716ea54a81"}
Apr 16 19:54:29.391083 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.391065 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" event={"ID":"9a0704cd-b28c-4d5b-9e72-79fcd84527b4","Type":"ContainerStarted","Data":"cbbac075877882b3c0f3281004563bf9871a13a0029ce2ff82ab379e6cc99707"}
Apr 16 19:54:29.488429 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.488350 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr"
Apr 16 19:54:29.488603 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.488514 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:54:29.488645 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.488610 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls podName:28edf32a-262a-4c91-89da-2c452e8d1152 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.488567293 +0000 UTC m=+34.903142132 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls") pod "dns-default-w77xr" (UID: "28edf32a-262a-4c91-89da-2c452e8d1152") : secret "dns-default-metrics-tls" not found
Apr 16 19:54:29.791279 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.790959 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:54:29.791279 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.791272 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs\") pod \"network-metrics-daemon-8jmq5\" (UID: \"10356841-c032-4d12-8328-dc3aeb909c86\") " pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:54:29.791514 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.791210 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:29.791514 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.791374 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret podName:b7a86d58-955a-4af8-9ae0-c6e786f43b28 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:01.79135292 +0000 UTC m=+66.205927759 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret") pod "global-pull-secret-syncer-zkxnw" (UID: "b7a86d58-955a-4af8-9ae0-c6e786f43b28") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:29.791653 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.791632 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:29.791709 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:29.791694 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs podName:10356841-c032-4d12-8328-dc3aeb909c86 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:01.791678225 +0000 UTC m=+66.206253063 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs") pod "network-metrics-daemon-8jmq5" (UID: "10356841-c032-4d12-8328-dc3aeb909c86") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:29.893042 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.892350 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sxx9\" (UniqueName: \"kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9\") pod \"network-check-target-d724r\" (UID: \"a0b5f3c5-7848-4283-b7e8-31a5e5f79888\") " pod="openshift-network-diagnostics/network-check-target-d724r"
Apr 16 19:54:29.898866 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:29.898803 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sxx9\" (UniqueName: \"kubernetes.io/projected/a0b5f3c5-7848-4283-b7e8-31a5e5f79888-kube-api-access-4sxx9\") pod \"network-check-target-d724r\" (UID: \"a0b5f3c5-7848-4283-b7e8-31a5e5f79888\") " pod="openshift-network-diagnostics/network-check-target-d724r"
Apr 16 19:54:30.184962 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.184883 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:54:30.185323 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.185303 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r"
Apr 16 19:54:30.186087 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.186011 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:54:30.191639 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.191604 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qj6k2\""
Apr 16 19:54:30.201548 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.191871 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nnp77\""
Apr 16 19:54:30.201548 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.192031 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 19:54:30.201548 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.192105 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 19:54:30.201548 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.195038 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqzdl\" (UID: \"04afe8e9-f130-46a2-86e8-733e2a491106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl"
Apr 16 19:54:30.201548 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.195142 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 19:54:30.201548 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.201424 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls podName:04afe8e9-f130-46a2-86e8-733e2a491106 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:32.201400203 +0000 UTC m=+36.615975029 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gqzdl" (UID: "04afe8e9-f130-46a2-86e8-733e2a491106") : secret "samples-operator-tls" not found
Apr 16 19:54:30.211557 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.211530 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d724r"
Apr 16 19:54:30.302178 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.301964 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-49lhf\" (UID: \"e85eca51-eae1-4ef0-b008-1a1e1b796b4c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf"
Apr 16 19:54:30.302178 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.302026 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z"
Apr 16 19:54:30.302895 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.302670 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 19:54:30.302895 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.302737 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls podName:e85eca51-eae1-4ef0-b008-1a1e1b796b4c nodeName:}" failed. No retries permitted until 2026-04-16 19:54:32.302717857 +0000 UTC m=+36.717292689 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-49lhf" (UID: "e85eca51-eae1-4ef0-b008-1a1e1b796b4c") : secret "cluster-monitoring-operator-tls" not found
Apr 16 19:54:30.302895 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.302804 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:54:30.302895 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.302815 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67db9885f4-znz5z: secret "image-registry-tls" not found
Apr 16 19:54:30.302895 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.302846 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls podName:015a1a89-29e1-449f-b569-19b3cce360b4 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:32.30283611 +0000 UTC m=+36.717410936 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls") pod "image-registry-67db9885f4-znz5z" (UID: "015a1a89-29e1-449f-b569-19b3cce360b4") : secret "image-registry-tls" not found
Apr 16 19:54:30.400210 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.400165 2569 generic.go:358] "Generic (PLEG): container finished" podID="f9d926bb-1dbb-44e0-981e-4bc43df8b1e0" containerID="f8267856b2634236fb76ec7128f04b5747e9435a3d0de6d2ca89834284235ce7" exitCode=0
Apr 16 19:54:30.400390 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.400247 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2987n" event={"ID":"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0","Type":"ContainerDied","Data":"f8267856b2634236fb76ec7128f04b5747e9435a3d0de6d2ca89834284235ce7"}
Apr 16 19:54:30.404002 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.402944 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rk7vq\" (UID: \"8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq"
Apr 16 19:54:30.404002 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.403004 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk"
Apr 16 19:54:30.404002 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.403054 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert\") pod \"ingress-canary-k262c\" (UID: \"994f79b6-31dc-4b4f-8c42-e6e60bee90cf\") " pod="openshift-ingress-canary/ingress-canary-k262c"
Apr 16 19:54:30.404002 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.403130 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk"
Apr 16 19:54:30.404002 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.403299 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle podName:a0b4741a-f02e-48e4-a491-d2ae897b44dd nodeName:}" failed. No retries permitted until 2026-04-16 19:54:32.403280381 +0000 UTC m=+36.817855220 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle") pod "router-default-77c5ff64cd-kkjnk" (UID: "a0b4741a-f02e-48e4-a491-d2ae897b44dd") : configmap references non-existent config key: service-ca.crt
Apr 16 19:54:30.404002 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.403710 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 19:54:30.404002 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.403784 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs podName:a0b4741a-f02e-48e4-a491-d2ae897b44dd nodeName:}" failed. No retries permitted until 2026-04-16 19:54:32.403760755 +0000 UTC m=+36.818335586 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs") pod "router-default-77c5ff64cd-kkjnk" (UID: "a0b4741a-f02e-48e4-a491-d2ae897b44dd") : secret "router-metrics-certs-default" not found
Apr 16 19:54:30.404002 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.403845 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 19:54:30.404002 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.403879 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert podName:8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:32.403868337 +0000 UTC m=+36.818443170 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-rk7vq" (UID: "8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6") : secret "networking-console-plugin-cert" not found
Apr 16 19:54:30.404002 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.403947 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:54:30.404002 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.403976 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert podName:994f79b6-31dc-4b4f-8c42-e6e60bee90cf nodeName:}" failed. No retries permitted until 2026-04-16 19:54:32.403966319 +0000 UTC m=+36.818541149 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert") pod "ingress-canary-k262c" (UID: "994f79b6-31dc-4b4f-8c42-e6e60bee90cf") : secret "canary-serving-cert" not found
Apr 16 19:54:30.412106 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.412039 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-d724r"]
Apr 16 19:54:30.417736 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:54:30.417701 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0b5f3c5_7848_4283_b7e8_31a5e5f79888.slice/crio-548d2247a2928e2caee57e9e92e4575ecd3d49e291770d2f9065039e1aa62c8b WatchSource:0}: Error finding container 548d2247a2928e2caee57e9e92e4575ecd3d49e291770d2f9065039e1aa62c8b: Status 404 returned error can't find the container with id 548d2247a2928e2caee57e9e92e4575ecd3d49e291770d2f9065039e1aa62c8b
Apr 16 19:54:30.504215 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:30.504174 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr"
Apr 16 19:54:30.506147 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.505477 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:54:30.506147 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:30.505544 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls podName:28edf32a-262a-4c91-89da-2c452e8d1152 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:32.50552367 +0000 UTC m=+36.920098510 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls") pod "dns-default-w77xr" (UID: "28edf32a-262a-4c91-89da-2c452e8d1152") : secret "dns-default-metrics-tls" not found
Apr 16 19:54:31.406696 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:31.405522 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-d724r" event={"ID":"a0b5f3c5-7848-4283-b7e8-31a5e5f79888","Type":"ContainerStarted","Data":"548d2247a2928e2caee57e9e92e4575ecd3d49e291770d2f9065039e1aa62c8b"}
Apr 16 19:54:31.411881 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:31.411843 2569 generic.go:358] "Generic (PLEG): container finished" podID="f9d926bb-1dbb-44e0-981e-4bc43df8b1e0" containerID="a17f3d904f1d60ac94ab79a3d08f1814929a2ef169cfe121b37dd0712ed94a24" exitCode=0
Apr 16 19:54:31.412049 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:31.411906 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2987n" event={"ID":"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0","Type":"ContainerDied","Data":"a17f3d904f1d60ac94ab79a3d08f1814929a2ef169cfe121b37dd0712ed94a24"}
Apr 16 19:54:32.222504 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:32.222465 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqzdl\" (UID: \"04afe8e9-f130-46a2-86e8-733e2a491106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl"
Apr 16 19:54:32.222694 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.222663 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 19:54:32.222773 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.222738 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls podName:04afe8e9-f130-46a2-86e8-733e2a491106 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.222721046 +0000 UTC m=+40.637295877 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gqzdl" (UID: "04afe8e9-f130-46a2-86e8-733e2a491106") : secret "samples-operator-tls" not found
Apr 16 19:54:32.323444 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:32.323345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-49lhf\" (UID: \"e85eca51-eae1-4ef0-b008-1a1e1b796b4c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf"
Apr 16 19:54:32.323444 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:32.323406 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z"
Apr 16 19:54:32.323706 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.323597 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:54:32.323706 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.323615 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67db9885f4-znz5z: secret "image-registry-tls" not found
Apr 16 19:54:32.323706 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.323691 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls podName:015a1a89-29e1-449f-b569-19b3cce360b4 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.32367121 +0000 UTC m=+40.738246051 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls") pod "image-registry-67db9885f4-znz5z" (UID: "015a1a89-29e1-449f-b569-19b3cce360b4") : secret "image-registry-tls" not found
Apr 16 19:54:32.323866 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.323706 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 19:54:32.323866 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.323782 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls podName:e85eca51-eae1-4ef0-b008-1a1e1b796b4c nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.32376463 +0000 UTC m=+40.738339462 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-49lhf" (UID: "e85eca51-eae1-4ef0-b008-1a1e1b796b4c") : secret "cluster-monitoring-operator-tls" not found
Apr 16 19:54:32.424700 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:32.424660 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk"
Apr 16 19:54:32.425179 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.424869 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle podName:a0b4741a-f02e-48e4-a491-d2ae897b44dd nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.424846663 +0000 UTC m=+40.839421502 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle") pod "router-default-77c5ff64cd-kkjnk" (UID: "a0b4741a-f02e-48e4-a491-d2ae897b44dd") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:32.425179 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:32.424917 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rk7vq\" (UID: \"8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq" Apr 16 19:54:32.425179 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:32.424970 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:32.425179 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:32.425007 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert\") pod \"ingress-canary-k262c\" (UID: \"994f79b6-31dc-4b4f-8c42-e6e60bee90cf\") " pod="openshift-ingress-canary/ingress-canary-k262c" Apr 16 19:54:32.425179 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.425020 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:32.425179 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.425108 2569 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert podName:8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.425090423 +0000 UTC m=+40.839665263 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-rk7vq" (UID: "8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6") : secret "networking-console-plugin-cert" not found Apr 16 19:54:32.425179 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.425123 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:32.425179 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.425181 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert podName:994f79b6-31dc-4b4f-8c42-e6e60bee90cf nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.425166346 +0000 UTC m=+40.839741191 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert") pod "ingress-canary-k262c" (UID: "994f79b6-31dc-4b4f-8c42-e6e60bee90cf") : secret "canary-serving-cert" not found Apr 16 19:54:32.425625 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.425123 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:32.425625 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.425233 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs podName:a0b4741a-f02e-48e4-a491-d2ae897b44dd nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:36.425215415 +0000 UTC m=+40.839790242 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs") pod "router-default-77c5ff64cd-kkjnk" (UID: "a0b4741a-f02e-48e4-a491-d2ae897b44dd") : secret "router-metrics-certs-default" not found Apr 16 19:54:32.525948 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:32.525898 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr" Apr 16 19:54:32.526150 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.526126 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:32.526229 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:32.526213 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls podName:28edf32a-262a-4c91-89da-2c452e8d1152 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:36.526193821 +0000 UTC m=+40.940768662 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls") pod "dns-default-w77xr" (UID: "28edf32a-262a-4c91-89da-2c452e8d1152") : secret "dns-default-metrics-tls" not found Apr 16 19:54:36.263274 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.263238 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqzdl\" (UID: \"04afe8e9-f130-46a2-86e8-733e2a491106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" Apr 16 19:54:36.263770 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.263394 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:36.263770 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.263444 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls podName:04afe8e9-f130-46a2-86e8-733e2a491106 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:44.263429955 +0000 UTC m=+48.678004786 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gqzdl" (UID: "04afe8e9-f130-46a2-86e8-733e2a491106") : secret "samples-operator-tls" not found Apr 16 19:54:36.363960 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.363926 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-49lhf\" (UID: \"e85eca51-eae1-4ef0-b008-1a1e1b796b4c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" Apr 16 19:54:36.364120 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.363979 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:36.364175 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.364128 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:36.364212 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.364205 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls podName:e85eca51-eae1-4ef0-b008-1a1e1b796b4c nodeName:}" failed. No retries permitted until 2026-04-16 19:54:44.364184416 +0000 UTC m=+48.778759249 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-49lhf" (UID: "e85eca51-eae1-4ef0-b008-1a1e1b796b4c") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:36.364979 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.364856 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:36.364979 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.364872 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67db9885f4-znz5z: secret "image-registry-tls" not found Apr 16 19:54:36.364979 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.364923 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls podName:015a1a89-29e1-449f-b569-19b3cce360b4 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:44.364908173 +0000 UTC m=+48.779483002 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls") pod "image-registry-67db9885f4-znz5z" (UID: "015a1a89-29e1-449f-b569-19b3cce360b4") : secret "image-registry-tls" not found Apr 16 19:54:36.426362 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.426321 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2987n" event={"ID":"f9d926bb-1dbb-44e0-981e-4bc43df8b1e0","Type":"ContainerStarted","Data":"056a650b33b88909738003d9b419177551b8004bc0ffe73c7cd023d27b51874b"} Apr 16 19:54:36.428035 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.428012 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/0.log" Apr 16 19:54:36.428168 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.428048 2569 generic.go:358] "Generic (PLEG): container finished" podID="ce138de6-668e-4e27-b7d0-579a176ea2f2" containerID="094661d9f835456898c12abb5c7eae54be2f4e4b0f30c805cded9d36397950cb" exitCode=255 Apr 16 19:54:36.428168 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.428078 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" event={"ID":"ce138de6-668e-4e27-b7d0-579a176ea2f2","Type":"ContainerDied","Data":"094661d9f835456898c12abb5c7eae54be2f4e4b0f30c805cded9d36397950cb"} Apr 16 19:54:36.428570 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.428399 2569 scope.go:117] "RemoveContainer" containerID="094661d9f835456898c12abb5c7eae54be2f4e4b0f30c805cded9d36397950cb" Apr 16 19:54:36.429735 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.429689 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k4s7x" 
event={"ID":"c7b46a8f-9a8f-42e0-971b-334f467cc56f","Type":"ContainerStarted","Data":"1940bc45ed6c6eeb5f55afca3036ba236faaa640cd34e7b6fb9fca3f9e8b69c0"} Apr 16 19:54:36.431326 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.431275 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" event={"ID":"95f11675-707c-4777-8e40-73a4b72aadc9","Type":"ContainerStarted","Data":"52d26845b2c6698ef8a034b324678635b24acb4d1e174d5771ede82663831b91"} Apr 16 19:54:36.433188 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.432985 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" event={"ID":"9a0704cd-b28c-4d5b-9e72-79fcd84527b4","Type":"ContainerStarted","Data":"5cdfbeee74934202e9e58d848a97889833bee873fb1842cafb235eb0001e42da"} Apr 16 19:54:36.435249 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.435003 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hj2q8" event={"ID":"aafa3559-b7eb-410f-91aa-abf590bd5c4a","Type":"ContainerStarted","Data":"1d84d3950164453adee9647dfbe4275ac92c47b0f56abfa1c8c21f0379bb236c"} Apr 16 19:54:36.436611 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.436379 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjmcc" event={"ID":"8bf6ef5e-33ca-46d3-84d9-c703a6a9dea4","Type":"ContainerStarted","Data":"b6ad06ed864db5ae5e0c8bcdb0e0ab9f76f2b7d5fc8d906923dff0465b6c98bd"} Apr 16 19:54:36.449337 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.449228 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2987n" podStartSLOduration=9.910586091 podStartE2EDuration="40.449208538s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" 
firstStartedPulling="2026-04-16 19:53:58.728021279 +0000 UTC m=+3.142596106" lastFinishedPulling="2026-04-16 19:54:29.266643709 +0000 UTC m=+33.681218553" observedRunningTime="2026-04-16 19:54:36.447699249 +0000 UTC m=+40.862274118" watchObservedRunningTime="2026-04-16 19:54:36.449208538 +0000 UTC m=+40.863783391" Apr 16 19:54:36.465351 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.465311 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:36.465519 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.465436 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rk7vq\" (UID: \"8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq" Apr 16 19:54:36.465519 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.465495 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:36.465655 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.465542 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert\") pod \"ingress-canary-k262c\" (UID: \"994f79b6-31dc-4b4f-8c42-e6e60bee90cf\") " 
pod="openshift-ingress-canary/ingress-canary-k262c" Apr 16 19:54:36.465757 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.465721 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:36.465813 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.465780 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert podName:994f79b6-31dc-4b4f-8c42-e6e60bee90cf nodeName:}" failed. No retries permitted until 2026-04-16 19:54:44.465763144 +0000 UTC m=+48.880337976 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert") pod "ingress-canary-k262c" (UID: "994f79b6-31dc-4b4f-8c42-e6e60bee90cf") : secret "canary-serving-cert" not found Apr 16 19:54:36.466228 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.466207 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle podName:a0b4741a-f02e-48e4-a491-d2ae897b44dd nodeName:}" failed. No retries permitted until 2026-04-16 19:54:44.46619109 +0000 UTC m=+48.880765921 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle") pod "router-default-77c5ff64cd-kkjnk" (UID: "a0b4741a-f02e-48e4-a491-d2ae897b44dd") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:36.466367 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.466292 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:36.466367 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.466326 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert podName:8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:44.466315531 +0000 UTC m=+48.880890357 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-rk7vq" (UID: "8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6") : secret "networking-console-plugin-cert" not found Apr 16 19:54:36.466477 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.466376 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:36.466477 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.466418 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs podName:a0b4741a-f02e-48e4-a491-d2ae897b44dd nodeName:}" failed. No retries permitted until 2026-04-16 19:54:44.466407671 +0000 UTC m=+48.880982502 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs") pod "router-default-77c5ff64cd-kkjnk" (UID: "a0b4741a-f02e-48e4-a491-d2ae897b44dd") : secret "router-metrics-certs-default" not found Apr 16 19:54:36.467887 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.467843 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" podStartSLOduration=24.948093109 podStartE2EDuration="31.467827013s" podCreationTimestamp="2026-04-16 19:54:05 +0000 UTC" firstStartedPulling="2026-04-16 19:54:29.249113114 +0000 UTC m=+33.663687940" lastFinishedPulling="2026-04-16 19:54:35.768847002 +0000 UTC m=+40.183421844" observedRunningTime="2026-04-16 19:54:36.467017358 +0000 UTC m=+40.881592207" watchObservedRunningTime="2026-04-16 19:54:36.467827013 +0000 UTC m=+40.882401894" Apr 16 19:54:36.490203 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.490141 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hj2q8" podStartSLOduration=24.960125452 podStartE2EDuration="31.490121193s" podCreationTimestamp="2026-04-16 19:54:05 +0000 UTC" firstStartedPulling="2026-04-16 19:54:29.236787957 +0000 UTC m=+33.651362783" lastFinishedPulling="2026-04-16 19:54:35.766783696 +0000 UTC m=+40.181358524" observedRunningTime="2026-04-16 19:54:36.489949129 +0000 UTC m=+40.904523968" watchObservedRunningTime="2026-04-16 19:54:36.490121193 +0000 UTC m=+40.904696050" Apr 16 19:54:36.507042 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.506860 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-k4s7x" podStartSLOduration=24.984615893 podStartE2EDuration="31.506839965s" podCreationTimestamp="2026-04-16 19:54:05 +0000 UTC" firstStartedPulling="2026-04-16 
19:54:29.250409243 +0000 UTC m=+33.664984069" lastFinishedPulling="2026-04-16 19:54:35.772633314 +0000 UTC m=+40.187208141" observedRunningTime="2026-04-16 19:54:36.506415431 +0000 UTC m=+40.920990280" watchObservedRunningTime="2026-04-16 19:54:36.506839965 +0000 UTC m=+40.921414816" Apr 16 19:54:36.547858 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.547802 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjmcc" podStartSLOduration=23.68929984 podStartE2EDuration="30.54778353s" podCreationTimestamp="2026-04-16 19:54:06 +0000 UTC" firstStartedPulling="2026-04-16 19:54:29.236871003 +0000 UTC m=+33.651445829" lastFinishedPulling="2026-04-16 19:54:36.095354684 +0000 UTC m=+40.509929519" observedRunningTime="2026-04-16 19:54:36.546353315 +0000 UTC m=+40.960928166" watchObservedRunningTime="2026-04-16 19:54:36.54778353 +0000 UTC m=+40.962358377" Apr 16 19:54:36.569443 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.567648 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr" Apr 16 19:54:36.569443 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.569095 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:36.569443 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:36.569154 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls podName:28edf32a-262a-4c91-89da-2c452e8d1152 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:44.569135578 +0000 UTC m=+48.983710412 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls") pod "dns-default-w77xr" (UID: "28edf32a-262a-4c91-89da-2c452e8d1152") : secret "dns-default-metrics-tls" not found Apr 16 19:54:36.571674 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:36.571595 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" podStartSLOduration=25.068712283 podStartE2EDuration="31.571561394s" podCreationTimestamp="2026-04-16 19:54:05 +0000 UTC" firstStartedPulling="2026-04-16 19:54:29.266109942 +0000 UTC m=+33.680684768" lastFinishedPulling="2026-04-16 19:54:35.768959047 +0000 UTC m=+40.183533879" observedRunningTime="2026-04-16 19:54:36.569406665 +0000 UTC m=+40.983981516" watchObservedRunningTime="2026-04-16 19:54:36.571561394 +0000 UTC m=+40.986136245" Apr 16 19:54:37.440446 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:37.440407 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-d724r" event={"ID":"a0b5f3c5-7848-4283-b7e8-31a5e5f79888","Type":"ContainerStarted","Data":"423364dea1d3a70273095437294f0dbb6d4e35a57e323eb42db574530e0955ca"} Apr 16 19:54:37.441000 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:37.440981 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-d724r" Apr 16 19:54:37.441920 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:37.441891 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/1.log" Apr 16 19:54:37.442302 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:37.442283 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/0.log" Apr 16 19:54:37.442363 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:37.442324 2569 generic.go:358] "Generic (PLEG): container finished" podID="ce138de6-668e-4e27-b7d0-579a176ea2f2" containerID="0a1cff8fe70c99526eee8a3536863b2b1d33532e2a820b2a4a025834a9a4456f" exitCode=255 Apr 16 19:54:37.442485 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:37.442462 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" event={"ID":"ce138de6-668e-4e27-b7d0-579a176ea2f2","Type":"ContainerDied","Data":"0a1cff8fe70c99526eee8a3536863b2b1d33532e2a820b2a4a025834a9a4456f"} Apr 16 19:54:37.442539 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:37.442506 2569 scope.go:117] "RemoveContainer" containerID="094661d9f835456898c12abb5c7eae54be2f4e4b0f30c805cded9d36397950cb" Apr 16 19:54:37.442670 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:37.442650 2569 scope.go:117] "RemoveContainer" containerID="0a1cff8fe70c99526eee8a3536863b2b1d33532e2a820b2a4a025834a9a4456f" Apr 16 19:54:37.442894 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:37.442870 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-cjrs6_openshift-console-operator(ce138de6-668e-4e27-b7d0-579a176ea2f2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" podUID="ce138de6-668e-4e27-b7d0-579a176ea2f2" Apr 16 19:54:37.468229 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:37.468157 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-d724r" podStartSLOduration=35.556642179 podStartE2EDuration="41.468135925s" podCreationTimestamp="2026-04-16 19:53:56 +0000 
UTC" firstStartedPulling="2026-04-16 19:54:30.420522559 +0000 UTC m=+34.835097391" lastFinishedPulling="2026-04-16 19:54:36.332016299 +0000 UTC m=+40.746591137" observedRunningTime="2026-04-16 19:54:37.467312814 +0000 UTC m=+41.881887654" watchObservedRunningTime="2026-04-16 19:54:37.468135925 +0000 UTC m=+41.882710774" Apr 16 19:54:38.446598 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:38.446548 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/1.log" Apr 16 19:54:38.447042 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:38.446925 2569 scope.go:117] "RemoveContainer" containerID="0a1cff8fe70c99526eee8a3536863b2b1d33532e2a820b2a4a025834a9a4456f" Apr 16 19:54:38.447125 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:38.447106 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-cjrs6_openshift-console-operator(ce138de6-668e-4e27-b7d0-579a176ea2f2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" podUID="ce138de6-668e-4e27-b7d0-579a176ea2f2" Apr 16 19:54:38.950464 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:38.950421 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:38.950464 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:38.950460 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:39.061597 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:39.061548 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tg5c9_448cba83-b62d-4e46-b69b-e948817d0ec5/dns-node-resolver/0.log" Apr 16 19:54:39.449541 
ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:39.449514 2569 scope.go:117] "RemoveContainer" containerID="0a1cff8fe70c99526eee8a3536863b2b1d33532e2a820b2a4a025834a9a4456f" Apr 16 19:54:39.449918 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:39.449737 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-cjrs6_openshift-console-operator(ce138de6-668e-4e27-b7d0-579a176ea2f2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" podUID="ce138de6-668e-4e27-b7d0-579a176ea2f2" Apr 16 19:54:40.062638 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:40.062614 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fphk8_1b5d2585-0759-49e0-8726-9b1f8902ebcf/node-ca/0.log" Apr 16 19:54:44.339181 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:44.339138 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqzdl\" (UID: \"04afe8e9-f130-46a2-86e8-733e2a491106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" Apr 16 19:54:44.339566 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.339267 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:44.339566 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.339333 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls podName:04afe8e9-f130-46a2-86e8-733e2a491106 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:00.339318614 +0000 UTC m=+64.753893445 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gqzdl" (UID: "04afe8e9-f130-46a2-86e8-733e2a491106") : secret "samples-operator-tls" not found Apr 16 19:54:44.439694 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:44.439648 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-49lhf\" (UID: \"e85eca51-eae1-4ef0-b008-1a1e1b796b4c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" Apr 16 19:54:44.439694 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:44.439696 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:54:44.439896 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.439812 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:44.439932 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.439898 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls podName:e85eca51-eae1-4ef0-b008-1a1e1b796b4c nodeName:}" failed. No retries permitted until 2026-04-16 19:55:00.439880997 +0000 UTC m=+64.854455823 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-49lhf" (UID: "e85eca51-eae1-4ef0-b008-1a1e1b796b4c") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:44.439932 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.439816 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:44.439932 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.439913 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67db9885f4-znz5z: secret "image-registry-tls" not found Apr 16 19:54:44.440028 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.439961 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls podName:015a1a89-29e1-449f-b569-19b3cce360b4 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:00.439949712 +0000 UTC m=+64.854524538 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls") pod "image-registry-67db9885f4-znz5z" (UID: "015a1a89-29e1-449f-b569-19b3cce360b4") : secret "image-registry-tls" not found Apr 16 19:54:44.541005 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:44.540964 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rk7vq\" (UID: \"8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq" Apr 16 19:54:44.541005 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:44.541012 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:54:44.541222 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:44.541044 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert\") pod \"ingress-canary-k262c\" (UID: \"994f79b6-31dc-4b4f-8c42-e6e60bee90cf\") " pod="openshift-ingress-canary/ingress-canary-k262c" Apr 16 19:54:44.541222 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:44.541089 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 
19:54:44.541222 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.541116 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:44.541222 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.541171 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:44.541222 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.541191 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert podName:8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:00.54117067 +0000 UTC m=+64.955745496 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-rk7vq" (UID: "8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6") : secret "networking-console-plugin-cert" not found Apr 16 19:54:44.541222 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.541210 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle podName:a0b4741a-f02e-48e4-a491-d2ae897b44dd nodeName:}" failed. No retries permitted until 2026-04-16 19:55:00.541201071 +0000 UTC m=+64.955775897 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle") pod "router-default-77c5ff64cd-kkjnk" (UID: "a0b4741a-f02e-48e4-a491-d2ae897b44dd") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:44.541222 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.541210 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:44.541482 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.541248 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs podName:a0b4741a-f02e-48e4-a491-d2ae897b44dd nodeName:}" failed. No retries permitted until 2026-04-16 19:55:00.541230998 +0000 UTC m=+64.955805824 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs") pod "router-default-77c5ff64cd-kkjnk" (UID: "a0b4741a-f02e-48e4-a491-d2ae897b44dd") : secret "router-metrics-certs-default" not found Apr 16 19:54:44.541482 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.541264 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert podName:994f79b6-31dc-4b4f-8c42-e6e60bee90cf nodeName:}" failed. No retries permitted until 2026-04-16 19:55:00.541257496 +0000 UTC m=+64.955832321 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert") pod "ingress-canary-k262c" (UID: "994f79b6-31dc-4b4f-8c42-e6e60bee90cf") : secret "canary-serving-cert" not found Apr 16 19:54:44.641715 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:44.641626 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr" Apr 16 19:54:44.641838 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.641772 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:44.641838 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:44.641832 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls podName:28edf32a-262a-4c91-89da-2c452e8d1152 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:00.641816521 +0000 UTC m=+65.056391358 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls") pod "dns-default-w77xr" (UID: "28edf32a-262a-4c91-89da-2c452e8d1152") : secret "dns-default-metrics-tls" not found Apr 16 19:54:51.179965 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:51.179929 2569 scope.go:117] "RemoveContainer" containerID="0a1cff8fe70c99526eee8a3536863b2b1d33532e2a820b2a4a025834a9a4456f" Apr 16 19:54:51.479379 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:51.479306 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/2.log" Apr 16 19:54:51.479812 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:51.479796 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/1.log" Apr 16 19:54:51.479859 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:51.479833 2569 generic.go:358] "Generic (PLEG): container finished" podID="ce138de6-668e-4e27-b7d0-579a176ea2f2" containerID="d31aeaf8b47be802dc3f8a1eda92072c61215509f95553345b5c8f5ae06f241e" exitCode=255 Apr 16 19:54:51.479918 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:51.479904 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" event={"ID":"ce138de6-668e-4e27-b7d0-579a176ea2f2","Type":"ContainerDied","Data":"d31aeaf8b47be802dc3f8a1eda92072c61215509f95553345b5c8f5ae06f241e"} Apr 16 19:54:51.479959 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:51.479945 2569 scope.go:117] "RemoveContainer" containerID="0a1cff8fe70c99526eee8a3536863b2b1d33532e2a820b2a4a025834a9a4456f" Apr 16 19:54:51.480310 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:51.480291 2569 scope.go:117] "RemoveContainer" 
containerID="d31aeaf8b47be802dc3f8a1eda92072c61215509f95553345b5c8f5ae06f241e" Apr 16 19:54:51.480530 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:51.480502 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-cjrs6_openshift-console-operator(ce138de6-668e-4e27-b7d0-579a176ea2f2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" podUID="ce138de6-668e-4e27-b7d0-579a176ea2f2" Apr 16 19:54:52.484185 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:52.484154 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/2.log" Apr 16 19:54:54.385250 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:54.385221 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pp78x" Apr 16 19:54:58.949965 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:58.949929 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:58.949965 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:58.949971 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" Apr 16 19:54:58.950456 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:54:58.950309 2569 scope.go:117] "RemoveContainer" containerID="d31aeaf8b47be802dc3f8a1eda92072c61215509f95553345b5c8f5ae06f241e" Apr 16 19:54:58.950504 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:54:58.950488 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-cjrs6_openshift-console-operator(ce138de6-668e-4e27-b7d0-579a176ea2f2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" podUID="ce138de6-668e-4e27-b7d0-579a176ea2f2" Apr 16 19:55:00.384650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.384603 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqzdl\" (UID: \"04afe8e9-f130-46a2-86e8-733e2a491106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" Apr 16 19:55:00.387454 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.387425 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/04afe8e9-f130-46a2-86e8-733e2a491106-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gqzdl\" (UID: \"04afe8e9-f130-46a2-86e8-733e2a491106\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" Apr 16 19:55:00.486076 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.486035 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-49lhf\" (UID: \"e85eca51-eae1-4ef0-b008-1a1e1b796b4c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" Apr 16 19:55:00.486076 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.486082 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " 
pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:55:00.488483 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.488455 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e85eca51-eae1-4ef0-b008-1a1e1b796b4c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-49lhf\" (UID: \"e85eca51-eae1-4ef0-b008-1a1e1b796b4c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" Apr 16 19:55:00.488618 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.488529 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls\") pod \"image-registry-67db9885f4-znz5z\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") " pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:55:00.587231 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.587188 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rk7vq\" (UID: \"8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq" Apr 16 19:55:00.587231 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.587240 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:55:00.587457 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.587397 2569 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert\") pod \"ingress-canary-k262c\" (UID: \"994f79b6-31dc-4b4f-8c42-e6e60bee90cf\") " pod="openshift-ingress-canary/ingress-canary-k262c" Apr 16 19:55:00.587510 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.587483 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:55:00.588106 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.588081 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0b4741a-f02e-48e4-a491-d2ae897b44dd-service-ca-bundle\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:55:00.589795 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.589775 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0b4741a-f02e-48e4-a491-d2ae897b44dd-metrics-certs\") pod \"router-default-77c5ff64cd-kkjnk\" (UID: \"a0b4741a-f02e-48e4-a491-d2ae897b44dd\") " pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:55:00.590058 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.590037 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rk7vq\" (UID: \"8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq" Apr 16 19:55:00.590099 
ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.590037 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/994f79b6-31dc-4b4f-8c42-e6e60bee90cf-cert\") pod \"ingress-canary-k262c\" (UID: \"994f79b6-31dc-4b4f-8c42-e6e60bee90cf\") " pod="openshift-ingress-canary/ingress-canary-k262c" Apr 16 19:55:00.615234 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.615203 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-mh28h\"" Apr 16 19:55:00.623874 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.623848 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" Apr 16 19:55:00.680197 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.680164 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-nmmmf\"" Apr 16 19:55:00.689003 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.688509 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" Apr 16 19:55:00.689804 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.689316 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr" Apr 16 19:55:00.691775 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.691748 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28edf32a-262a-4c91-89da-2c452e8d1152-metrics-tls\") pod \"dns-default-w77xr\" (UID: \"28edf32a-262a-4c91-89da-2c452e8d1152\") " pod="openshift-dns/dns-default-w77xr" Apr 16 19:55:00.718559 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.718522 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lq4lt\"" Apr 16 19:55:00.727385 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.726505 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:55:00.735131 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.735096 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-s6qtx\"" Apr 16 19:55:00.743830 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.743772 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" Apr 16 19:55:00.748951 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.748892 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl"] Apr 16 19:55:00.799397 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.799107 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-x7lzn\"" Apr 16 19:55:00.807503 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.807226 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq" Apr 16 19:55:00.815546 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.815270 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xfkbf\"" Apr 16 19:55:00.823983 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.823904 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k262c" Apr 16 19:55:00.830098 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.829813 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-668h7\"" Apr 16 19:55:00.830098 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.829810 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf"] Apr 16 19:55:00.836860 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:55:00.836685 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode85eca51_eae1_4ef0_b008_1a1e1b796b4c.slice/crio-5fb347de057a84475eaa2310d581507cbc344c225134c87f0023672c1ab73915 WatchSource:0}: Error finding container 5fb347de057a84475eaa2310d581507cbc344c225134c87f0023672c1ab73915: Status 404 returned error can't find the container with id 5fb347de057a84475eaa2310d581507cbc344c225134c87f0023672c1ab73915 Apr 16 19:55:00.838043 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.838018 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-w77xr" Apr 16 19:55:00.904141 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.904042 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67db9885f4-znz5z"] Apr 16 19:55:00.913189 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:55:00.913097 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod015a1a89_29e1_449f_b569_19b3cce360b4.slice/crio-873be18731a2cf480638b7645ab090d3b6f0e9ec0b03c9ae131d0e4e573b6d12 WatchSource:0}: Error finding container 873be18731a2cf480638b7645ab090d3b6f0e9ec0b03c9ae131d0e4e573b6d12: Status 404 returned error can't find the container with id 873be18731a2cf480638b7645ab090d3b6f0e9ec0b03c9ae131d0e4e573b6d12 Apr 16 19:55:00.940943 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.940703 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-77c5ff64cd-kkjnk"] Apr 16 19:55:00.949096 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:55:00.949055 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0b4741a_f02e_48e4_a491_d2ae897b44dd.slice/crio-6098a9856056693cb52eeb87819dfe6810fe3577bbe50a41b47dbba5b8d83fc4 WatchSource:0}: Error finding container 6098a9856056693cb52eeb87819dfe6810fe3577bbe50a41b47dbba5b8d83fc4: Status 404 returned error can't find the container with id 6098a9856056693cb52eeb87819dfe6810fe3577bbe50a41b47dbba5b8d83fc4 Apr 16 19:55:00.988516 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:00.988480 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq"] Apr 16 19:55:00.992144 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:55:00.992102 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ebca5c0_30bf_46c6_a6e8_4cc5860aa2d6.slice/crio-a501c4a7b47762e3e30afec63c866ae2da5207cd4cb2c95767c61a3d0fc3c8a2 WatchSource:0}: Error finding container a501c4a7b47762e3e30afec63c866ae2da5207cd4cb2c95767c61a3d0fc3c8a2: Status 404 returned error can't find the container with id a501c4a7b47762e3e30afec63c866ae2da5207cd4cb2c95767c61a3d0fc3c8a2 Apr 16 19:55:01.011831 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.011804 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k262c"] Apr 16 19:55:01.016627 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:55:01.016594 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod994f79b6_31dc_4b4f_8c42_e6e60bee90cf.slice/crio-d28edbd059c8086d3050f93668f6bbcab50dd69b5671f6364568698e2db8c9fc WatchSource:0}: Error finding container d28edbd059c8086d3050f93668f6bbcab50dd69b5671f6364568698e2db8c9fc: Status 404 returned error can't find the container with id d28edbd059c8086d3050f93668f6bbcab50dd69b5671f6364568698e2db8c9fc Apr 16 19:55:01.046890 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.046859 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w77xr"] Apr 16 19:55:01.049711 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:55:01.049679 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28edf32a_262a_4c91_89da_2c452e8d1152.slice/crio-b251bb113a3a3e6f5c6f2c7125ca9e37d6f7b69d784053ceeafcb7b6907de953 WatchSource:0}: Error finding container b251bb113a3a3e6f5c6f2c7125ca9e37d6f7b69d784053ceeafcb7b6907de953: Status 404 returned error can't find the container with id b251bb113a3a3e6f5c6f2c7125ca9e37d6f7b69d784053ceeafcb7b6907de953 Apr 16 19:55:01.515022 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.514980 2569 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" event={"ID":"a0b4741a-f02e-48e4-a491-d2ae897b44dd","Type":"ContainerStarted","Data":"541a8a6a87b7733e2f04752fbfa8468a9916f917e5a4f0a4be2af24adf57dfda"} Apr 16 19:55:01.515615 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.515030 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" event={"ID":"a0b4741a-f02e-48e4-a491-d2ae897b44dd","Type":"ContainerStarted","Data":"6098a9856056693cb52eeb87819dfe6810fe3577bbe50a41b47dbba5b8d83fc4"} Apr 16 19:55:01.518985 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.518848 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67db9885f4-znz5z" event={"ID":"015a1a89-29e1-449f-b569-19b3cce360b4","Type":"ContainerStarted","Data":"f5734ffac9a380b090584464960e355ca079014b2a60c3fe183ff356452e4a93"} Apr 16 19:55:01.518985 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.518891 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67db9885f4-znz5z" event={"ID":"015a1a89-29e1-449f-b569-19b3cce360b4","Type":"ContainerStarted","Data":"873be18731a2cf480638b7645ab090d3b6f0e9ec0b03c9ae131d0e4e573b6d12"} Apr 16 19:55:01.518985 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.518950 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-67db9885f4-znz5z" Apr 16 19:55:01.521328 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.521293 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" event={"ID":"04afe8e9-f130-46a2-86e8-733e2a491106","Type":"ContainerStarted","Data":"c27bcdd2cb71028dd11a0a119aff0f9e7011eab2895e93259aa160c84dceb13a"} Apr 16 19:55:01.524345 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.524312 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k262c" event={"ID":"994f79b6-31dc-4b4f-8c42-e6e60bee90cf","Type":"ContainerStarted","Data":"d28edbd059c8086d3050f93668f6bbcab50dd69b5671f6364568698e2db8c9fc"}
Apr 16 19:55:01.526246 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.526215 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq" event={"ID":"8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6","Type":"ContainerStarted","Data":"a501c4a7b47762e3e30afec63c866ae2da5207cd4cb2c95767c61a3d0fc3c8a2"}
Apr 16 19:55:01.527712 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.527671 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w77xr" event={"ID":"28edf32a-262a-4c91-89da-2c452e8d1152","Type":"ContainerStarted","Data":"b251bb113a3a3e6f5c6f2c7125ca9e37d6f7b69d784053ceeafcb7b6907de953"}
Apr 16 19:55:01.529521 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.529470 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" event={"ID":"e85eca51-eae1-4ef0-b008-1a1e1b796b4c","Type":"ContainerStarted","Data":"5fb347de057a84475eaa2310d581507cbc344c225134c87f0023672c1ab73915"}
Apr 16 19:55:01.538060 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.537480 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-77c5ff64cd-kkjnk" podStartSLOduration=56.537458838 podStartE2EDuration="56.537458838s" podCreationTimestamp="2026-04-16 19:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:55:01.536183614 +0000 UTC m=+65.950758457" watchObservedRunningTime="2026-04-16 19:55:01.537458838 +0000 UTC m=+65.952033669"
Apr 16 19:55:01.744699 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.744630 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-77c5ff64cd-kkjnk"
Apr 16 19:55:01.748208 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.747969 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-77c5ff64cd-kkjnk"
Apr 16 19:55:01.771884 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.770661 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-67db9885f4-znz5z" podStartSLOduration=65.770640733 podStartE2EDuration="1m5.770640733s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:55:01.567039242 +0000 UTC m=+65.981614092" watchObservedRunningTime="2026-04-16 19:55:01.770640733 +0000 UTC m=+66.185215585"
Apr 16 19:55:01.800725 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.800680 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:55:01.800929 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.800744 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs\") pod \"network-metrics-daemon-8jmq5\" (UID: \"10356841-c032-4d12-8328-dc3aeb909c86\") " pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:55:01.803629 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.803384 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 19:55:01.804064 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.803869 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 19:55:01.815398 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.815330 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10356841-c032-4d12-8328-dc3aeb909c86-metrics-certs\") pod \"network-metrics-daemon-8jmq5\" (UID: \"10356841-c032-4d12-8328-dc3aeb909c86\") " pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:55:01.817415 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:01.817352 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b7a86d58-955a-4af8-9ae0-c6e786f43b28-original-pull-secret\") pod \"global-pull-secret-syncer-zkxnw\" (UID: \"b7a86d58-955a-4af8-9ae0-c6e786f43b28\") " pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:55:02.005126 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:02.004862 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nnp77\""
Apr 16 19:55:02.013494 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:02.013414 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8jmq5"
Apr 16 19:55:02.020940 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:02.020911 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zkxnw"
Apr 16 19:55:02.532637 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:02.532601 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-77c5ff64cd-kkjnk"
Apr 16 19:55:02.534207 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:02.534181 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-77c5ff64cd-kkjnk"
Apr 16 19:55:03.353555 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.353520 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-zdtb2"]
Apr 16 19:55:03.376096 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.376058 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zdtb2"]
Apr 16 19:55:03.376270 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.376226 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.378677 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.378561 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 19:55:03.378921 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.378848 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 19:55:03.379328 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.379301 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-pxnkg\""
Apr 16 19:55:03.515609 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.515516 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bdxd\" (UniqueName: \"kubernetes.io/projected/dc451bbf-6087-4118-8c46-7dd3dde99f7a-kube-api-access-6bdxd\") pod \"insights-runtime-extractor-zdtb2\" (UID: \"dc451bbf-6087-4118-8c46-7dd3dde99f7a\") " pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.515609 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.515587 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dc451bbf-6087-4118-8c46-7dd3dde99f7a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zdtb2\" (UID: \"dc451bbf-6087-4118-8c46-7dd3dde99f7a\") " pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.515826 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.515637 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dc451bbf-6087-4118-8c46-7dd3dde99f7a-data-volume\") pod \"insights-runtime-extractor-zdtb2\" (UID: \"dc451bbf-6087-4118-8c46-7dd3dde99f7a\") " pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.515826 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.515668 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dc451bbf-6087-4118-8c46-7dd3dde99f7a-crio-socket\") pod \"insights-runtime-extractor-zdtb2\" (UID: \"dc451bbf-6087-4118-8c46-7dd3dde99f7a\") " pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.515826 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.515689 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dc451bbf-6087-4118-8c46-7dd3dde99f7a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zdtb2\" (UID: \"dc451bbf-6087-4118-8c46-7dd3dde99f7a\") " pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.616410 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.616373 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dc451bbf-6087-4118-8c46-7dd3dde99f7a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zdtb2\" (UID: \"dc451bbf-6087-4118-8c46-7dd3dde99f7a\") " pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.616410 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.616416 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dc451bbf-6087-4118-8c46-7dd3dde99f7a-data-volume\") pod \"insights-runtime-extractor-zdtb2\" (UID: \"dc451bbf-6087-4118-8c46-7dd3dde99f7a\") " pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.617046 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.616449 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dc451bbf-6087-4118-8c46-7dd3dde99f7a-crio-socket\") pod \"insights-runtime-extractor-zdtb2\" (UID: \"dc451bbf-6087-4118-8c46-7dd3dde99f7a\") " pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.617046 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.616480 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dc451bbf-6087-4118-8c46-7dd3dde99f7a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zdtb2\" (UID: \"dc451bbf-6087-4118-8c46-7dd3dde99f7a\") " pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.617046 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.616568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bdxd\" (UniqueName: \"kubernetes.io/projected/dc451bbf-6087-4118-8c46-7dd3dde99f7a-kube-api-access-6bdxd\") pod \"insights-runtime-extractor-zdtb2\" (UID: \"dc451bbf-6087-4118-8c46-7dd3dde99f7a\") " pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.617046 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.616596 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dc451bbf-6087-4118-8c46-7dd3dde99f7a-crio-socket\") pod \"insights-runtime-extractor-zdtb2\" (UID: \"dc451bbf-6087-4118-8c46-7dd3dde99f7a\") " pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.617046 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.616923 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dc451bbf-6087-4118-8c46-7dd3dde99f7a-data-volume\") pod \"insights-runtime-extractor-zdtb2\" (UID: \"dc451bbf-6087-4118-8c46-7dd3dde99f7a\") " pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.617226 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.617161 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dc451bbf-6087-4118-8c46-7dd3dde99f7a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zdtb2\" (UID: \"dc451bbf-6087-4118-8c46-7dd3dde99f7a\") " pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.618833 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.618811 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dc451bbf-6087-4118-8c46-7dd3dde99f7a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zdtb2\" (UID: \"dc451bbf-6087-4118-8c46-7dd3dde99f7a\") " pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.631133 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.631099 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bdxd\" (UniqueName: \"kubernetes.io/projected/dc451bbf-6087-4118-8c46-7dd3dde99f7a-kube-api-access-6bdxd\") pod \"insights-runtime-extractor-zdtb2\" (UID: \"dc451bbf-6087-4118-8c46-7dd3dde99f7a\") " pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:03.689100 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:03.689067 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zdtb2"
Apr 16 19:55:05.607737 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:05.605295 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8jmq5"]
Apr 16 19:55:05.641277 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:05.641197 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zkxnw"]
Apr 16 19:55:05.643691 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:05.642896 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zdtb2"]
Apr 16 19:55:05.647055 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:55:05.646822 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7a86d58_955a_4af8_9ae0_c6e786f43b28.slice/crio-c65ff4cbb1e88f8746c96d37a5ad70fec954af4e903f85f8e460775081fe8e31 WatchSource:0}: Error finding container c65ff4cbb1e88f8746c96d37a5ad70fec954af4e903f85f8e460775081fe8e31: Status 404 returned error can't find the container with id c65ff4cbb1e88f8746c96d37a5ad70fec954af4e903f85f8e460775081fe8e31
Apr 16 19:55:06.555817 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.555534 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" event={"ID":"04afe8e9-f130-46a2-86e8-733e2a491106","Type":"ContainerStarted","Data":"9ca036bc42cece26faa178998e74670bc623e5ca17b7f149cdc3f31a3a2e639d"}
Apr 16 19:55:06.555817 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.555596 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" event={"ID":"04afe8e9-f130-46a2-86e8-733e2a491106","Type":"ContainerStarted","Data":"598c85bca6784aa3ea5651c4d572127ae0f53a654f3f5d7479ea9ff5d420b10d"}
Apr 16 19:55:06.557925 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.557875 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k262c" event={"ID":"994f79b6-31dc-4b4f-8c42-e6e60bee90cf","Type":"ContainerStarted","Data":"2bd34f9f9c25ff503b8d093a79161736d6edfaaa7dccf618b3c45cda8d4d895b"}
Apr 16 19:55:06.559695 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.559655 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq" event={"ID":"8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6","Type":"ContainerStarted","Data":"58177dcf67aa29d60c096f6305c58860113078b71a1e8f0a6065910486ef48c9"}
Apr 16 19:55:06.561763 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.561692 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w77xr" event={"ID":"28edf32a-262a-4c91-89da-2c452e8d1152","Type":"ContainerStarted","Data":"b8b9365a0c3c19a919cc92193d874eb28af0577370c88ad8e4921a1b3e1cd509"}
Apr 16 19:55:06.561763 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.561722 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w77xr" event={"ID":"28edf32a-262a-4c91-89da-2c452e8d1152","Type":"ContainerStarted","Data":"3babda5881903c780136f6536764ed37f48faf47340e7a3ad52c34ff3f251d18"}
Apr 16 19:55:06.562105 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.562090 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-w77xr"
Apr 16 19:55:06.564210 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.564182 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" event={"ID":"e85eca51-eae1-4ef0-b008-1a1e1b796b4c","Type":"ContainerStarted","Data":"ca7f9df44aaf32d7346e1e68f519f33a3493819dfcd1ab216db5a4a806aadf59"}
Apr 16 19:55:06.565554 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.565522 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8jmq5" event={"ID":"10356841-c032-4d12-8328-dc3aeb909c86","Type":"ContainerStarted","Data":"6279e176e947e1d314ed9746228c631088ee3bbea9e60ed21f1e71a12563180d"}
Apr 16 19:55:06.567112 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.567088 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zdtb2" event={"ID":"dc451bbf-6087-4118-8c46-7dd3dde99f7a","Type":"ContainerStarted","Data":"d33b9261df26c8aba741455537c7b1b55b2d67274dadb1a8ce20e559c31c5a71"}
Apr 16 19:55:06.567224 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.567119 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zdtb2" event={"ID":"dc451bbf-6087-4118-8c46-7dd3dde99f7a","Type":"ContainerStarted","Data":"51c1b2aa8a5b3147ee0fb5b7dc846e045c6e6cb8e77c1e24a6854bc7a92a5544"}
Apr 16 19:55:06.568554 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.568524 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zkxnw" event={"ID":"b7a86d58-955a-4af8-9ae0-c6e786f43b28","Type":"ContainerStarted","Data":"c65ff4cbb1e88f8746c96d37a5ad70fec954af4e903f85f8e460775081fe8e31"}
Apr 16 19:55:06.573765 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.573712 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gqzdl" podStartSLOduration=56.98348918 podStartE2EDuration="1m1.573693094s" podCreationTimestamp="2026-04-16 19:54:05 +0000 UTC" firstStartedPulling="2026-04-16 19:55:00.844688889 +0000 UTC m=+65.259263721" lastFinishedPulling="2026-04-16 19:55:05.434892798 +0000 UTC m=+69.849467635" observedRunningTime="2026-04-16 19:55:06.573556772 +0000 UTC m=+70.988131621" watchObservedRunningTime="2026-04-16 19:55:06.573693094 +0000 UTC m=+70.988267943"
Apr 16 19:55:06.593403 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.593197 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-49lhf" podStartSLOduration=56.999325312 podStartE2EDuration="1m1.593174499s" podCreationTimestamp="2026-04-16 19:54:05 +0000 UTC" firstStartedPulling="2026-04-16 19:55:00.841028346 +0000 UTC m=+65.255603175" lastFinishedPulling="2026-04-16 19:55:05.434877535 +0000 UTC m=+69.849452362" observedRunningTime="2026-04-16 19:55:06.591613894 +0000 UTC m=+71.006188743" watchObservedRunningTime="2026-04-16 19:55:06.593174499 +0000 UTC m=+71.007749348"
Apr 16 19:55:06.617718 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.617658 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-rk7vq" podStartSLOduration=51.182118662 podStartE2EDuration="55.617636728s" podCreationTimestamp="2026-04-16 19:54:11 +0000 UTC" firstStartedPulling="2026-04-16 19:55:00.994034089 +0000 UTC m=+65.408608915" lastFinishedPulling="2026-04-16 19:55:05.429552153 +0000 UTC m=+69.844126981" observedRunningTime="2026-04-16 19:55:06.616050802 +0000 UTC m=+71.030625651" watchObservedRunningTime="2026-04-16 19:55:06.617636728 +0000 UTC m=+71.032211575"
Apr 16 19:55:06.637003 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.636893 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-k262c" podStartSLOduration=34.219753139 podStartE2EDuration="38.63687239s" podCreationTimestamp="2026-04-16 19:54:28 +0000 UTC" firstStartedPulling="2026-04-16 19:55:01.018793222 +0000 UTC m=+65.433368049" lastFinishedPulling="2026-04-16 19:55:05.435912472 +0000 UTC m=+69.850487300" observedRunningTime="2026-04-16 19:55:06.635324024 +0000 UTC m=+71.049898873" watchObservedRunningTime="2026-04-16 19:55:06.63687239 +0000 UTC m=+71.051447238"
Apr 16 19:55:06.651868 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:06.651807 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-w77xr" podStartSLOduration=34.26757485 podStartE2EDuration="38.651784082s" podCreationTimestamp="2026-04-16 19:54:28 +0000 UTC" firstStartedPulling="2026-04-16 19:55:01.051708625 +0000 UTC m=+65.466283451" lastFinishedPulling="2026-04-16 19:55:05.435917853 +0000 UTC m=+69.850492683" observedRunningTime="2026-04-16 19:55:06.650633347 +0000 UTC m=+71.065208191" watchObservedRunningTime="2026-04-16 19:55:06.651784082 +0000 UTC m=+71.066358928"
Apr 16 19:55:08.450154 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:08.450116 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-d724r"
Apr 16 19:55:08.577548 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:08.577505 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zdtb2" event={"ID":"dc451bbf-6087-4118-8c46-7dd3dde99f7a","Type":"ContainerStarted","Data":"c8c593a9f7037b29160215f75760c6df54277b670dc8776df286b4cdafa502a6"}
Apr 16 19:55:08.579389 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:08.579355 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8jmq5" event={"ID":"10356841-c032-4d12-8328-dc3aeb909c86","Type":"ContainerStarted","Data":"905f30e2088b989174fa723539a7165cafe83e07f8c9f2c0157efb516eaa0c0f"}
Apr 16 19:55:08.579389 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:08.579393 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8jmq5" event={"ID":"10356841-c032-4d12-8328-dc3aeb909c86","Type":"ContainerStarted","Data":"afd34a6a4a1b2f14079d12bf93ff08d3b618efd84751116c9025b27c2842a7dd"}
Apr 16 19:55:08.607807 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:08.607754 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8jmq5" podStartSLOduration=70.781452018 podStartE2EDuration="1m12.607731453s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:55:05.625800485 +0000 UTC m=+70.040375324" lastFinishedPulling="2026-04-16 19:55:07.45207993 +0000 UTC m=+71.866654759" observedRunningTime="2026-04-16 19:55:08.607532188 +0000 UTC m=+73.022107037" watchObservedRunningTime="2026-04-16 19:55:08.607731453 +0000 UTC m=+73.022306303"
Apr 16 19:55:10.184167 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:10.184135 2569 scope.go:117] "RemoveContainer" containerID="d31aeaf8b47be802dc3f8a1eda92072c61215509f95553345b5c8f5ae06f241e"
Apr 16 19:55:10.184655 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:55:10.184360 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-cjrs6_openshift-console-operator(ce138de6-668e-4e27-b7d0-579a176ea2f2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" podUID="ce138de6-668e-4e27-b7d0-579a176ea2f2"
Apr 16 19:55:10.587642 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:10.587570 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zdtb2" event={"ID":"dc451bbf-6087-4118-8c46-7dd3dde99f7a","Type":"ContainerStarted","Data":"f5b90f0c6f76fc5b8b73447f29c0bab9daf7071694872120e508875939fbd749"}
Apr 16 19:55:10.589042 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:10.589015 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zkxnw" event={"ID":"b7a86d58-955a-4af8-9ae0-c6e786f43b28","Type":"ContainerStarted","Data":"b46bdea3154d983667be8e57758d3632f81d291cf7648289cbef3f8e4711f225"}
Apr 16 19:55:10.628773 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:10.628716 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-zdtb2" podStartSLOduration=3.092888574 podStartE2EDuration="7.62870157s" podCreationTimestamp="2026-04-16 19:55:03 +0000 UTC" firstStartedPulling="2026-04-16 19:55:05.758799755 +0000 UTC m=+70.173374582" lastFinishedPulling="2026-04-16 19:55:10.294612744 +0000 UTC m=+74.709187578" observedRunningTime="2026-04-16 19:55:10.627436488 +0000 UTC m=+75.042011335" watchObservedRunningTime="2026-04-16 19:55:10.62870157 +0000 UTC m=+75.043276422"
Apr 16 19:55:10.662416 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:10.662359 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-zkxnw" podStartSLOduration=69.006169639 podStartE2EDuration="1m13.662341999s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:55:05.650237447 +0000 UTC m=+70.064812280" lastFinishedPulling="2026-04-16 19:55:10.306409798 +0000 UTC m=+74.720984640" observedRunningTime="2026-04-16 19:55:10.661861172 +0000 UTC m=+75.076436020" watchObservedRunningTime="2026-04-16 19:55:10.662341999 +0000 UTC m=+75.076916846"
Apr 16 19:55:14.607755 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.607640 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l"]
Apr 16 19:55:14.610954 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.610936 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l"
Apr 16 19:55:14.613389 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.613361 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-9bx7b\""
Apr 16 19:55:14.613669 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.613638 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 19:55:14.613669 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.613656 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 16 19:55:14.613927 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.613912 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 16 19:55:14.622893 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.622865 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l"]
Apr 16 19:55:14.663509 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.663477 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2mssn"]
Apr 16 19:55:14.666847 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.666822 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2mssn"
Apr 16 19:55:14.673747 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.673726 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 19:55:14.674009 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.673993 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 19:55:14.674974 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.674961 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 19:55:14.676385 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.676365 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8qnsr\""
Apr 16 19:55:14.722605 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.722544 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e9dd998f-62f0-406e-bf31-cca545dc9b5d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-4h95l\" (UID: \"e9dd998f-62f0-406e-bf31-cca545dc9b5d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l"
Apr 16 19:55:14.722605 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.722608 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj7d6\" (UniqueName: \"kubernetes.io/projected/e9dd998f-62f0-406e-bf31-cca545dc9b5d-kube-api-access-hj7d6\") pod \"openshift-state-metrics-9d44df66c-4h95l\" (UID: \"e9dd998f-62f0-406e-bf31-cca545dc9b5d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l"
Apr 16 19:55:14.722857 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.722708 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9dd998f-62f0-406e-bf31-cca545dc9b5d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-4h95l\" (UID: \"e9dd998f-62f0-406e-bf31-cca545dc9b5d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l"
Apr 16 19:55:14.722857 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.722747 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9dd998f-62f0-406e-bf31-cca545dc9b5d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-4h95l\" (UID: \"e9dd998f-62f0-406e-bf31-cca545dc9b5d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l"
Apr 16 19:55:14.823332 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.823291 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9dd998f-62f0-406e-bf31-cca545dc9b5d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-4h95l\" (UID: \"e9dd998f-62f0-406e-bf31-cca545dc9b5d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l"
Apr 16 19:55:14.823550 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.823351 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9dd998f-62f0-406e-bf31-cca545dc9b5d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-4h95l\" (UID: \"e9dd998f-62f0-406e-bf31-cca545dc9b5d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l"
Apr 16 19:55:14.823550 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.823390 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/52860252-cf2f-4da1-9834-49ba663cc555-node-exporter-accelerators-collector-config\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn"
Apr 16 19:55:14.823550 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:55:14.823445 2569 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 16 19:55:14.823550 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.823438 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/52860252-cf2f-4da1-9834-49ba663cc555-node-exporter-textfile\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn"
Apr 16 19:55:14.823550 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:55:14.823533 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9dd998f-62f0-406e-bf31-cca545dc9b5d-openshift-state-metrics-tls podName:e9dd998f-62f0-406e-bf31-cca545dc9b5d nodeName:}" failed. No retries permitted until 2026-04-16 19:55:15.32351118 +0000 UTC m=+79.738086014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/e9dd998f-62f0-406e-bf31-cca545dc9b5d-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-4h95l" (UID: "e9dd998f-62f0-406e-bf31-cca545dc9b5d") : secret "openshift-state-metrics-tls" not found
Apr 16 19:55:14.823897 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.823616 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/52860252-cf2f-4da1-9834-49ba663cc555-node-exporter-wtmp\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn"
Apr 16 19:55:14.823897 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.823670 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/52860252-cf2f-4da1-9834-49ba663cc555-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn"
Apr 16 19:55:14.823897 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.823709 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4vqt\" (UniqueName: \"kubernetes.io/projected/52860252-cf2f-4da1-9834-49ba663cc555-kube-api-access-c4vqt\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn"
Apr 16 19:55:14.823897 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.823755 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e9dd998f-62f0-406e-bf31-cca545dc9b5d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-4h95l\" (UID: \"e9dd998f-62f0-406e-bf31-cca545dc9b5d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l"
Apr 16 19:55:14.823897 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.823789 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj7d6\" (UniqueName: \"kubernetes.io/projected/e9dd998f-62f0-406e-bf31-cca545dc9b5d-kube-api-access-hj7d6\") pod \"openshift-state-metrics-9d44df66c-4h95l\" (UID: \"e9dd998f-62f0-406e-bf31-cca545dc9b5d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l"
Apr 16 19:55:14.823897 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.823820 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/52860252-cf2f-4da1-9834-49ba663cc555-root\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn"
Apr 16 19:55:14.823897 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.823857 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/52860252-cf2f-4da1-9834-49ba663cc555-metrics-client-ca\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn"
Apr 16 19:55:14.824183 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.823893 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52860252-cf2f-4da1-9834-49ba663cc555-sys\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn"
Apr 16 19:55:14.824183 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.823924 2569 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/52860252-cf2f-4da1-9834-49ba663cc555-node-exporter-tls\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.824278 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.824240 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9dd998f-62f0-406e-bf31-cca545dc9b5d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-4h95l\" (UID: \"e9dd998f-62f0-406e-bf31-cca545dc9b5d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l" Apr 16 19:55:14.826264 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.826238 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e9dd998f-62f0-406e-bf31-cca545dc9b5d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-4h95l\" (UID: \"e9dd998f-62f0-406e-bf31-cca545dc9b5d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l" Apr 16 19:55:14.838753 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.838720 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj7d6\" (UniqueName: \"kubernetes.io/projected/e9dd998f-62f0-406e-bf31-cca545dc9b5d-kube-api-access-hj7d6\") pod \"openshift-state-metrics-9d44df66c-4h95l\" (UID: \"e9dd998f-62f0-406e-bf31-cca545dc9b5d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l" Apr 16 19:55:14.924935 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.924893 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4vqt\" (UniqueName: \"kubernetes.io/projected/52860252-cf2f-4da1-9834-49ba663cc555-kube-api-access-c4vqt\") pod 
\"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.925105 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.924973 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/52860252-cf2f-4da1-9834-49ba663cc555-root\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.925105 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.925006 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/52860252-cf2f-4da1-9834-49ba663cc555-metrics-client-ca\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.925105 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.925043 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52860252-cf2f-4da1-9834-49ba663cc555-sys\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.925105 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.925047 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/52860252-cf2f-4da1-9834-49ba663cc555-root\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.925105 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.925067 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/52860252-cf2f-4da1-9834-49ba663cc555-node-exporter-tls\") pod \"node-exporter-2mssn\" (UID: 
\"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.925374 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.925120 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52860252-cf2f-4da1-9834-49ba663cc555-sys\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.925374 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.925168 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/52860252-cf2f-4da1-9834-49ba663cc555-node-exporter-accelerators-collector-config\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.925374 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.925217 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/52860252-cf2f-4da1-9834-49ba663cc555-node-exporter-textfile\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.925374 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.925259 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/52860252-cf2f-4da1-9834-49ba663cc555-node-exporter-wtmp\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.925374 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.925298 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/52860252-cf2f-4da1-9834-49ba663cc555-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.925702 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.925673 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/52860252-cf2f-4da1-9834-49ba663cc555-node-exporter-wtmp\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.925783 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.925762 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/52860252-cf2f-4da1-9834-49ba663cc555-metrics-client-ca\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.925839 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.925785 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/52860252-cf2f-4da1-9834-49ba663cc555-node-exporter-accelerators-collector-config\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.925962 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.925937 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/52860252-cf2f-4da1-9834-49ba663cc555-node-exporter-textfile\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.928060 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.928036 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/52860252-cf2f-4da1-9834-49ba663cc555-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.928241 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.928219 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/52860252-cf2f-4da1-9834-49ba663cc555-node-exporter-tls\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.932823 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.932796 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4vqt\" (UniqueName: \"kubernetes.io/projected/52860252-cf2f-4da1-9834-49ba663cc555-kube-api-access-c4vqt\") pod \"node-exporter-2mssn\" (UID: \"52860252-cf2f-4da1-9834-49ba663cc555\") " pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.975937 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:14.975900 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2mssn" Apr 16 19:55:14.984694 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:55:14.984659 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52860252_cf2f_4da1_9834_49ba663cc555.slice/crio-91124c1d3775d575abfcab07c6333e5c36df03aecc244ff2b7d7ce9ff8e0289f WatchSource:0}: Error finding container 91124c1d3775d575abfcab07c6333e5c36df03aecc244ff2b7d7ce9ff8e0289f: Status 404 returned error can't find the container with id 91124c1d3775d575abfcab07c6333e5c36df03aecc244ff2b7d7ce9ff8e0289f Apr 16 19:55:15.329789 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.329699 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9dd998f-62f0-406e-bf31-cca545dc9b5d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-4h95l\" (UID: \"e9dd998f-62f0-406e-bf31-cca545dc9b5d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l" Apr 16 19:55:15.332269 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.332234 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9dd998f-62f0-406e-bf31-cca545dc9b5d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-4h95l\" (UID: \"e9dd998f-62f0-406e-bf31-cca545dc9b5d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l" Apr 16 19:55:15.521247 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.521207 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l" Apr 16 19:55:15.604163 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.604064 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2mssn" event={"ID":"52860252-cf2f-4da1-9834-49ba663cc555","Type":"ContainerStarted","Data":"91124c1d3775d575abfcab07c6333e5c36df03aecc244ff2b7d7ce9ff8e0289f"} Apr 16 19:55:15.656964 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.653421 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:55:15.659957 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.659919 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.663147 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.663116 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 19:55:15.663147 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.663141 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 19:55:15.663523 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.663494 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-4ls6g\"" Apr 16 19:55:15.663523 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.663506 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 19:55:15.663823 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.663548 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 19:55:15.663929 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.663856 
2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 19:55:15.663929 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.663872 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 19:55:15.664042 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.663878 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 19:55:15.664042 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.663966 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 19:55:15.664184 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.664161 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 19:55:15.671932 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.671903 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:55:15.803772 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.803741 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l"] Apr 16 19:55:15.806709 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:55:15.806678 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9dd998f_62f0_406e_bf31_cca545dc9b5d.slice/crio-dd31c5880162446d4c4ca37ec3b94b12a6534018813a047e33406b252d882864 WatchSource:0}: Error finding container dd31c5880162446d4c4ca37ec3b94b12a6534018813a047e33406b252d882864: Status 404 returned error can't find the container with id dd31c5880162446d4c4ca37ec3b94b12a6534018813a047e33406b252d882864 Apr 16 19:55:15.834034 
ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.833989 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.834202 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.834041 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.834202 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.834158 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2199094c-50c5-4e79-af74-8d28fb1e8b9d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.834317 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.834205 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-config-volume\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.834317 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.834262 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.834317 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.834299 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2199094c-50c5-4e79-af74-8d28fb1e8b9d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.834471 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.834342 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-web-config\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.834471 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.834379 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.834471 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.834459 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2199094c-50c5-4e79-af74-8d28fb1e8b9d-config-out\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.834637 ip-10-0-139-205 kubenswrapper[2569]: I0416 
19:55:15.834482 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2199094c-50c5-4e79-af74-8d28fb1e8b9d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.834637 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.834512 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.834637 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.834533 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtjhd\" (UniqueName: \"kubernetes.io/projected/2199094c-50c5-4e79-af74-8d28fb1e8b9d-kube-api-access-jtjhd\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.834637 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.834595 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2199094c-50c5-4e79-af74-8d28fb1e8b9d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.935765 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.935728 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2199094c-50c5-4e79-af74-8d28fb1e8b9d-config-out\") pod \"alertmanager-main-0\" (UID: 
\"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.935955 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.935774 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2199094c-50c5-4e79-af74-8d28fb1e8b9d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.935955 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.935800 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.935955 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.935819 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtjhd\" (UniqueName: \"kubernetes.io/projected/2199094c-50c5-4e79-af74-8d28fb1e8b9d-kube-api-access-jtjhd\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.935955 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.935844 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2199094c-50c5-4e79-af74-8d28fb1e8b9d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.935955 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.935893 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.935955 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.935936 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.936276 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.936000 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2199094c-50c5-4e79-af74-8d28fb1e8b9d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.936276 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.936035 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-config-volume\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.936276 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.936067 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:15.936276 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.936105 
2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2199094c-50c5-4e79-af74-8d28fb1e8b9d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.936276 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.936150 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-web-config\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.936276 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.936184 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.936276 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.936224 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2199094c-50c5-4e79-af74-8d28fb1e8b9d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.937646 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.937359 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2199094c-50c5-4e79-af74-8d28fb1e8b9d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.939313 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.939279 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2199094c-50c5-4e79-af74-8d28fb1e8b9d-config-out\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.939442 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.939318 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2199094c-50c5-4e79-af74-8d28fb1e8b9d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.939508 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.939444 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2199094c-50c5-4e79-af74-8d28fb1e8b9d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.939566 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.939505 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-config-volume\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.939703 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.939661 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.940285 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.940250 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.940718 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.940694 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.941129 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.941087 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.941281 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.941262 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.944439 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.944408 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-web-config\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.946222 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.946195 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtjhd\" (UniqueName: \"kubernetes.io/projected/2199094c-50c5-4e79-af74-8d28fb1e8b9d-kube-api-access-jtjhd\") pod \"alertmanager-main-0\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:15.977604 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:15.977365 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:55:16.122253 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.122132 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:55:16.125128 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:55:16.125093 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2199094c_50c5_4e79_af74_8d28fb1e8b9d.slice/crio-4d03e8f65f5dfc55e9c5c18dccc66959dff09454c3dff2df0aeb07e568cbb5ed WatchSource:0}: Error finding container 4d03e8f65f5dfc55e9c5c18dccc66959dff09454c3dff2df0aeb07e568cbb5ed: Status 404 returned error can't find the container with id 4d03e8f65f5dfc55e9c5c18dccc66959dff09454c3dff2df0aeb07e568cbb5ed
Apr 16 19:55:16.609078 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.609022 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2199094c-50c5-4e79-af74-8d28fb1e8b9d","Type":"ContainerStarted","Data":"4d03e8f65f5dfc55e9c5c18dccc66959dff09454c3dff2df0aeb07e568cbb5ed"}
Apr 16 19:55:16.611053 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.611014 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l" event={"ID":"e9dd998f-62f0-406e-bf31-cca545dc9b5d","Type":"ContainerStarted","Data":"35eb06e68f2697372ec51eed00d5192b70b6b7a2db240229b15c4a0cfdeda43b"}
Apr 16 19:55:16.611221 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.611058 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l" event={"ID":"e9dd998f-62f0-406e-bf31-cca545dc9b5d","Type":"ContainerStarted","Data":"76501eca5b649eb53f34819432a8f64e7a9ffe9767d4b7b39a1869ca86397aa8"}
Apr 16 19:55:16.611221 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.611073 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l" event={"ID":"e9dd998f-62f0-406e-bf31-cca545dc9b5d","Type":"ContainerStarted","Data":"dd31c5880162446d4c4ca37ec3b94b12a6534018813a047e33406b252d882864"}
Apr 16 19:55:16.613053 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.613015 2569 generic.go:358] "Generic (PLEG): container finished" podID="52860252-cf2f-4da1-9834-49ba663cc555" containerID="5d3ed9aa1a11c225ae5db3788296092bad60fb3cbd997c327003433215495653" exitCode=0
Apr 16 19:55:16.613197 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.613113 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2mssn" event={"ID":"52860252-cf2f-4da1-9834-49ba663cc555","Type":"ContainerDied","Data":"5d3ed9aa1a11c225ae5db3788296092bad60fb3cbd997c327003433215495653"}
Apr 16 19:55:16.651465 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.651427 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-54f68c57f4-2vg75"]
Apr 16 19:55:16.657452 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.657422 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.661271 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.660233 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 19:55:16.661271 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.660655 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 19:55:16.661271 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.660918 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-mjrtv\""
Apr 16 19:55:16.661271 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.661126 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 19:55:16.661271 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.661248 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 19:55:16.661631 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.661346 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 19:55:16.661631 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.661497 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-b43eteo4h1f70\""
Apr 16 19:55:16.665468 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.665436 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-54f68c57f4-2vg75"]
Apr 16 19:55:16.843118 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.843080 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xq5p\" (UniqueName: \"kubernetes.io/projected/897a6728-45f7-4fd3-9046-d545dc2704e6-kube-api-access-7xq5p\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.843315 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.843138 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.843315 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.843235 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.843315 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.843283 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-thanos-querier-tls\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.843478 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.843321 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/897a6728-45f7-4fd3-9046-d545dc2704e6-metrics-client-ca\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.843478 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.843352 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-grpc-tls\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.843478 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.843437 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.843651 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.843498 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.944166 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.944126 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xq5p\" (UniqueName: \"kubernetes.io/projected/897a6728-45f7-4fd3-9046-d545dc2704e6-kube-api-access-7xq5p\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.944360 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.944181 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.944360 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.944219 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.944360 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.944248 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-thanos-querier-tls\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.944360 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.944279 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/897a6728-45f7-4fd3-9046-d545dc2704e6-metrics-client-ca\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.944360 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.944305 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-grpc-tls\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.944360 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.944346 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.944675 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.944393 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.945365 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.945309 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/897a6728-45f7-4fd3-9046-d545dc2704e6-metrics-client-ca\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.947529 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.947502 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-thanos-querier-tls\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.947733 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.947687 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-grpc-tls\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.947958 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.947912 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.948123 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.948096 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.948218 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.948128 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.948399 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.948375 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/897a6728-45f7-4fd3-9046-d545dc2704e6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.951545 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.951523 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xq5p\" (UniqueName: \"kubernetes.io/projected/897a6728-45f7-4fd3-9046-d545dc2704e6-kube-api-access-7xq5p\") pod \"thanos-querier-54f68c57f4-2vg75\" (UID: \"897a6728-45f7-4fd3-9046-d545dc2704e6\") " pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:16.986413 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:16.986372 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:17.360792 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:17.360699 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-54f68c57f4-2vg75"]
Apr 16 19:55:17.393613 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:55:17.393557 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod897a6728_45f7_4fd3_9046_d545dc2704e6.slice/crio-557233c02f21137d8a1b611a6b311b47aafa864f55147010c9ac7fa29a75ad3f WatchSource:0}: Error finding container 557233c02f21137d8a1b611a6b311b47aafa864f55147010c9ac7fa29a75ad3f: Status 404 returned error can't find the container with id 557233c02f21137d8a1b611a6b311b47aafa864f55147010c9ac7fa29a75ad3f
Apr 16 19:55:17.581710 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:17.581628 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-w77xr"
Apr 16 19:55:17.617550 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:17.617507 2569 generic.go:358] "Generic (PLEG): container finished" podID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerID="539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1" exitCode=0
Apr 16 19:55:17.617745 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:17.617598 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2199094c-50c5-4e79-af74-8d28fb1e8b9d","Type":"ContainerDied","Data":"539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1"}
Apr 16 19:55:17.618981 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:17.618950 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75" event={"ID":"897a6728-45f7-4fd3-9046-d545dc2704e6","Type":"ContainerStarted","Data":"557233c02f21137d8a1b611a6b311b47aafa864f55147010c9ac7fa29a75ad3f"}
Apr 16 19:55:17.621485 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:17.621453 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l" event={"ID":"e9dd998f-62f0-406e-bf31-cca545dc9b5d","Type":"ContainerStarted","Data":"3c7ef8a6edc5fe301ce1883d0d5b1e751ccd1b98578576f5c7d06fdaaf85a336"}
Apr 16 19:55:17.624054 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:17.624022 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2mssn" event={"ID":"52860252-cf2f-4da1-9834-49ba663cc555","Type":"ContainerStarted","Data":"0732d7eccb16bb6002e61e5fbb30bb2b49e5639b12cea6d75f6a0924ad37914d"}
Apr 16 19:55:17.624054 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:17.624049 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2mssn" event={"ID":"52860252-cf2f-4da1-9834-49ba663cc555","Type":"ContainerStarted","Data":"1b15f8fdb667df52a951522024e97dc36d76cdd25e32d2c0670c5ea269390f84"}
Apr 16 19:55:17.671694 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:17.671623 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2mssn" podStartSLOduration=2.9454307 podStartE2EDuration="3.671602704s" podCreationTimestamp="2026-04-16 19:55:14 +0000 UTC" firstStartedPulling="2026-04-16 19:55:14.986321482 +0000 UTC m=+79.400896308" lastFinishedPulling="2026-04-16 19:55:15.712493467 +0000 UTC m=+80.127068312" observedRunningTime="2026-04-16 19:55:17.669719763 +0000 UTC m=+82.084294623" watchObservedRunningTime="2026-04-16 19:55:17.671602704 +0000 UTC m=+82.086177554"
Apr 16 19:55:17.688967 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:17.688904 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4h95l" podStartSLOduration=2.504793951 podStartE2EDuration="3.688887975s" podCreationTimestamp="2026-04-16 19:55:14 +0000 UTC" firstStartedPulling="2026-04-16 19:55:16.023069048 +0000 UTC m=+80.437643880" lastFinishedPulling="2026-04-16 19:55:17.207163075 +0000 UTC m=+81.621737904" observedRunningTime="2026-04-16 19:55:17.687376678 +0000 UTC m=+82.101951529" watchObservedRunningTime="2026-04-16 19:55:17.688887975 +0000 UTC m=+82.103462823"
Apr 16 19:55:20.637190 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:20.637150 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2199094c-50c5-4e79-af74-8d28fb1e8b9d","Type":"ContainerStarted","Data":"74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96"}
Apr 16 19:55:20.637190 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:20.637198 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2199094c-50c5-4e79-af74-8d28fb1e8b9d","Type":"ContainerStarted","Data":"864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b"}
Apr 16 19:55:20.637692 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:20.637212 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2199094c-50c5-4e79-af74-8d28fb1e8b9d","Type":"ContainerStarted","Data":"3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02"}
Apr 16 19:55:20.637692 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:20.637224 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2199094c-50c5-4e79-af74-8d28fb1e8b9d","Type":"ContainerStarted","Data":"18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f"}
Apr 16 19:55:20.637692 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:20.637237 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2199094c-50c5-4e79-af74-8d28fb1e8b9d","Type":"ContainerStarted","Data":"1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24"}
Apr 16 19:55:20.639156 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:20.639126 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75" event={"ID":"897a6728-45f7-4fd3-9046-d545dc2704e6","Type":"ContainerStarted","Data":"0b2fea702ac09dc0e339360476419ac5877b7f8c8fe688d62c865f820739bef5"}
Apr 16 19:55:20.639276 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:20.639162 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75" event={"ID":"897a6728-45f7-4fd3-9046-d545dc2704e6","Type":"ContainerStarted","Data":"5aa62eedbbabf4289ba2656533ec0d5edf36b36900cad6709c09b1e24044b3d5"}
Apr 16 19:55:20.639276 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:20.639173 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75" event={"ID":"897a6728-45f7-4fd3-9046-d545dc2704e6","Type":"ContainerStarted","Data":"81d3e67472a8bc79dc97f4bca9d39e61e760dd19ef8cfcdf6f9260115cd32da7"}
Apr 16 19:55:20.732008 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:20.731966 2569 patch_prober.go:28] interesting pod/image-registry-67db9885f4-znz5z container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 19:55:20.732129 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:20.732027 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-67db9885f4-znz5z" podUID="015a1a89-29e1-449f-b569-19b3cce360b4" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 19:55:21.179546 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:21.179510 2569 scope.go:117] "RemoveContainer" containerID="d31aeaf8b47be802dc3f8a1eda92072c61215509f95553345b5c8f5ae06f241e"
Apr 16 19:55:21.647147 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:21.647071 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/2.log"
Apr 16 19:55:21.647605 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:21.647169 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" event={"ID":"ce138de6-668e-4e27-b7d0-579a176ea2f2","Type":"ContainerStarted","Data":"d8388297f0f53fc8b5c1ad3ef7416c7811ee4f10678fb8eb9cd0a39611af920b"}
Apr 16 19:55:21.647737 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:21.647714 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6"
Apr 16 19:55:21.650963 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:21.650926 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2199094c-50c5-4e79-af74-8d28fb1e8b9d","Type":"ContainerStarted","Data":"e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d"}
Apr 16 19:55:21.654880 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:21.654848 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75" event={"ID":"897a6728-45f7-4fd3-9046-d545dc2704e6","Type":"ContainerStarted","Data":"1f620e24aea48742b8f1e27b707fa2677c59c756966509aefdf6c5fced647164"}
Apr 16 19:55:21.654880 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:21.654885 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75" event={"ID":"897a6728-45f7-4fd3-9046-d545dc2704e6","Type":"ContainerStarted","Data":"3de392900a791f9e06b38425b1c49c6eb7644bf32ab3e80d7ea4ed9a4a4c73cb"}
Apr 16 19:55:21.655108 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:21.654899 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75" event={"ID":"897a6728-45f7-4fd3-9046-d545dc2704e6","Type":"ContainerStarted","Data":"4f58c07404b6cb2cb8273df4ecb62c00806ead55e9e46f11885adf1afee203a1"}
Apr 16 19:55:21.655108 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:21.655017 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:21.710733 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:21.710662 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6" podStartSLOduration=70.209898551 podStartE2EDuration="1m16.710643529s" podCreationTimestamp="2026-04-16 19:54:05 +0000 UTC" firstStartedPulling="2026-04-16 19:54:29.268100899 +0000 UTC m=+33.682675724" lastFinishedPulling="2026-04-16 19:54:35.76884586 +0000 UTC m=+40.183420702" observedRunningTime="2026-04-16 19:55:21.672449397 +0000 UTC m=+86.087024244" watchObservedRunningTime="2026-04-16 19:55:21.710643529 +0000 UTC m=+86.125218377"
Apr 16 19:55:21.711693 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:21.711653 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.172280709 podStartE2EDuration="6.711633479s" podCreationTimestamp="2026-04-16 19:55:15 +0000 UTC" firstStartedPulling="2026-04-16 19:55:16.127019711 +0000 UTC m=+80.541594537" lastFinishedPulling="2026-04-16 19:55:20.666372458 +0000 UTC m=+85.080947307" observedRunningTime="2026-04-16 19:55:21.708353279 +0000 UTC m=+86.122928129" watchObservedRunningTime="2026-04-16 19:55:21.711633479 +0000 UTC m=+86.126208319"
Apr 16 19:55:22.536236 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:22.536201 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-67db9885f4-znz5z"
Apr 16 19:55:22.547071 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:22.547043 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-cjrs6"
Apr 16 19:55:22.556335 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:22.556283 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75" podStartSLOduration=3.28914604 podStartE2EDuration="6.556266585s" podCreationTimestamp="2026-04-16 19:55:16 +0000 UTC" firstStartedPulling="2026-04-16 19:55:17.395675493 +0000 UTC m=+81.810250322" lastFinishedPulling="2026-04-16 19:55:20.662796029 +0000 UTC m=+85.077370867" observedRunningTime="2026-04-16 19:55:21.737947125 +0000 UTC m=+86.152521973" watchObservedRunningTime="2026-04-16 19:55:22.556266585 +0000 UTC m=+86.970841430"
Apr 16 19:55:22.734820 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:22.734781 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-4dmst"]
Apr 16 19:55:22.739127 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:22.739104 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-4dmst"
Apr 16 19:55:22.741554 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:22.741536 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 19:55:22.741670 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:22.741552 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-xsrpg\""
Apr 16 19:55:22.741936 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:22.741921 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 19:55:22.751840 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:22.751812 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-4dmst"]
Apr 16 19:55:22.799500 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:22.799406 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb42q\" (UniqueName: \"kubernetes.io/projected/a6209680-e76b-4b41-a4df-5d5f476a1df3-kube-api-access-lb42q\") pod \"downloads-6bcc868b7-4dmst\" (UID: \"a6209680-e76b-4b41-a4df-5d5f476a1df3\") " pod="openshift-console/downloads-6bcc868b7-4dmst"
Apr 16 19:55:22.900129 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:22.900091 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lb42q\" (UniqueName: \"kubernetes.io/projected/a6209680-e76b-4b41-a4df-5d5f476a1df3-kube-api-access-lb42q\") pod \"downloads-6bcc868b7-4dmst\" (UID: \"a6209680-e76b-4b41-a4df-5d5f476a1df3\") " pod="openshift-console/downloads-6bcc868b7-4dmst"
Apr 16 19:55:22.908448 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:22.908409 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb42q\" (UniqueName: \"kubernetes.io/projected/a6209680-e76b-4b41-a4df-5d5f476a1df3-kube-api-access-lb42q\") pod \"downloads-6bcc868b7-4dmst\" (UID: \"a6209680-e76b-4b41-a4df-5d5f476a1df3\") " pod="openshift-console/downloads-6bcc868b7-4dmst"
Apr 16 19:55:23.048971 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:23.048931 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-4dmst"
Apr 16 19:55:23.174628 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:23.174569 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-4dmst"]
Apr 16 19:55:23.177681 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:55:23.177647 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6209680_e76b_4b41_a4df_5d5f476a1df3.slice/crio-a1b7beafa51e673a17e5750ef648fe5f9ac450269308ee1b4f1d13ab55d53322 WatchSource:0}: Error finding container a1b7beafa51e673a17e5750ef648fe5f9ac450269308ee1b4f1d13ab55d53322: Status 404 returned error can't find the container with id a1b7beafa51e673a17e5750ef648fe5f9ac450269308ee1b4f1d13ab55d53322
Apr 16 19:55:23.662351 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:23.662312 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-4dmst" event={"ID":"a6209680-e76b-4b41-a4df-5d5f476a1df3","Type":"ContainerStarted","Data":"a1b7beafa51e673a17e5750ef648fe5f9ac450269308ee1b4f1d13ab55d53322"}
Apr 16 19:55:25.147026 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:25.146987 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67db9885f4-znz5z"]
Apr 16 19:55:27.666058 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:27.666030 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-54f68c57f4-2vg75"
Apr 16 19:55:32.497752 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.497706 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-dd5f9cd9b-p2f59"]
Apr 16 19:55:32.502568 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.502542 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dd5f9cd9b-p2f59"
Apr 16 19:55:32.505665 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.505609 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 19:55:32.505805 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.505665 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 19:55:32.507740 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.507709 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 19:55:32.507913 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.507886 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-t7pjz\""
Apr 16 19:55:32.508054 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.507884 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 19:55:32.512348 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.512179 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 19:55:32.514250 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.514206 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dd5f9cd9b-p2f59"]
Apr 16 19:55:32.590831 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.590796 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName:
\"kubernetes.io/secret/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-serving-cert\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.591010 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.590854 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-service-ca\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.591010 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.590932 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-oauth-config\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.591010 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.590999 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-oauth-serving-cert\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.591148 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.591049 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srj4d\" (UniqueName: \"kubernetes.io/projected/479262d1-d0da-49ad-b1f4-9c56fe63709c-kube-api-access-srj4d\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.591148 ip-10-0-139-205 kubenswrapper[2569]: 
I0416 19:55:32.591121 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-config\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.691708 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.691673 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-config\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.691883 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.691742 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-serving-cert\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.691883 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.691794 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-service-ca\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.691883 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.691823 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-oauth-config\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" 
Apr 16 19:55:32.691883 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.691860 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-oauth-serving-cert\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.692091 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.691907 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srj4d\" (UniqueName: \"kubernetes.io/projected/479262d1-d0da-49ad-b1f4-9c56fe63709c-kube-api-access-srj4d\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.692570 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.692499 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-config\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.692720 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.692605 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-oauth-serving-cert\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.692720 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.692707 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-service-ca\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " 
pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.695018 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.694986 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-oauth-config\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.695191 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.695178 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-serving-cert\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.702336 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.702306 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srj4d\" (UniqueName: \"kubernetes.io/projected/479262d1-d0da-49ad-b1f4-9c56fe63709c-kube-api-access-srj4d\") pod \"console-dd5f9cd9b-p2f59\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") " pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:32.818106 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:32.818012 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:38.605783 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:38.605756 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dd5f9cd9b-p2f59"] Apr 16 19:55:38.612752 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:55:38.612716 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod479262d1_d0da_49ad_b1f4_9c56fe63709c.slice/crio-41e6a90b7fe3642200badb037041e650ca6f25128496eb05c17c9fe14d57b2cb WatchSource:0}: Error finding container 41e6a90b7fe3642200badb037041e650ca6f25128496eb05c17c9fe14d57b2cb: Status 404 returned error can't find the container with id 41e6a90b7fe3642200badb037041e650ca6f25128496eb05c17c9fe14d57b2cb Apr 16 19:55:38.725175 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:38.725136 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd5f9cd9b-p2f59" event={"ID":"479262d1-d0da-49ad-b1f4-9c56fe63709c","Type":"ContainerStarted","Data":"41e6a90b7fe3642200badb037041e650ca6f25128496eb05c17c9fe14d57b2cb"} Apr 16 19:55:38.726588 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:38.726545 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-4dmst" event={"ID":"a6209680-e76b-4b41-a4df-5d5f476a1df3","Type":"ContainerStarted","Data":"d2fc4e2b64b0fe728745e8fa34288870ef7d79d2a35f3c756c95e29bf02accca"} Apr 16 19:55:38.726771 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:38.726749 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-4dmst" Apr 16 19:55:38.728378 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:38.728351 2569 patch_prober.go:28] interesting pod/downloads-6bcc868b7-4dmst container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.132.0.23:8080/\": dial tcp 
10.132.0.23:8080: connect: connection refused" start-of-body= Apr 16 19:55:38.728505 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:38.728405 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-4dmst" podUID="a6209680-e76b-4b41-a4df-5d5f476a1df3" containerName="download-server" probeResult="failure" output="Get \"http://10.132.0.23:8080/\": dial tcp 10.132.0.23:8080: connect: connection refused" Apr 16 19:55:38.743417 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:38.743306 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-4dmst" podStartSLOduration=1.348203041 podStartE2EDuration="16.743289841s" podCreationTimestamp="2026-04-16 19:55:22 +0000 UTC" firstStartedPulling="2026-04-16 19:55:23.179722736 +0000 UTC m=+87.594297565" lastFinishedPulling="2026-04-16 19:55:38.574809522 +0000 UTC m=+102.989384365" observedRunningTime="2026-04-16 19:55:38.741664534 +0000 UTC m=+103.156239384" watchObservedRunningTime="2026-04-16 19:55:38.743289841 +0000 UTC m=+103.157864753" Apr 16 19:55:39.745432 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:39.745368 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-4dmst" Apr 16 19:55:41.833998 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:41.833956 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69c875bd9d-zwv2w"] Apr 16 19:55:41.854138 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:41.854098 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69c875bd9d-zwv2w"] Apr 16 19:55:41.854330 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:41.854265 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:41.861182 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:41.861155 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 19:55:41.987667 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:41.987619 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-serving-cert\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:41.987851 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:41.987685 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-config\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:41.987851 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:41.987799 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-oauth-config\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:41.987851 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:41.987843 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-oauth-serving-cert\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " 
pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:41.988007 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:41.987926 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-service-ca\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:41.988007 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:41.987948 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-trusted-ca-bundle\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:41.988103 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:41.988041 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mmcq\" (UniqueName: \"kubernetes.io/projected/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-kube-api-access-7mmcq\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.089112 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.089014 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-service-ca\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.089112 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.089070 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-trusted-ca-bundle\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.089360 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.089116 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mmcq\" (UniqueName: \"kubernetes.io/projected/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-kube-api-access-7mmcq\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.089360 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.089182 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-serving-cert\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.089360 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.089216 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-config\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.089360 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.089266 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-oauth-config\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.089360 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.089294 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-oauth-serving-cert\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.090068 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.090040 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-service-ca\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.090229 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.090091 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-config\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.090322 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.090150 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-oauth-serving-cert\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.090322 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.090308 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-trusted-ca-bundle\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.092243 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.092192 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-oauth-config\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.096605 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.092389 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-serving-cert\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.102101 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.102068 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mmcq\" (UniqueName: \"kubernetes.io/projected/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-kube-api-access-7mmcq\") pod \"console-69c875bd9d-zwv2w\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.166340 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.166298 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:55:42.323313 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.323278 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69c875bd9d-zwv2w"] Apr 16 19:55:42.325747 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:55:42.325711 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd45a4281_5995_4b7a_aa2c_88f9d4490bbe.slice/crio-0f3bb8a6d9ff766331e57a8cf8d42afc80be492a2cb633517689dc5320329222 WatchSource:0}: Error finding container 0f3bb8a6d9ff766331e57a8cf8d42afc80be492a2cb633517689dc5320329222: Status 404 returned error can't find the container with id 0f3bb8a6d9ff766331e57a8cf8d42afc80be492a2cb633517689dc5320329222 Apr 16 19:55:42.744231 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.744108 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd5f9cd9b-p2f59" event={"ID":"479262d1-d0da-49ad-b1f4-9c56fe63709c","Type":"ContainerStarted","Data":"f5755746dcf48242d50aecb53b5f20b5001c7f29df13d9036a598ea4e67e2dc5"} Apr 16 19:55:42.745950 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.745910 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69c875bd9d-zwv2w" event={"ID":"d45a4281-5995-4b7a-aa2c-88f9d4490bbe","Type":"ContainerStarted","Data":"ba7eee8102ca05725d0fd7ae93020dea2bea120d95391623e8249a199f241c16"} Apr 16 19:55:42.745950 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.745952 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69c875bd9d-zwv2w" event={"ID":"d45a4281-5995-4b7a-aa2c-88f9d4490bbe","Type":"ContainerStarted","Data":"0f3bb8a6d9ff766331e57a8cf8d42afc80be492a2cb633517689dc5320329222"} Apr 16 19:55:42.759430 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.759372 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-dd5f9cd9b-p2f59" podStartSLOduration=7.259329671 podStartE2EDuration="10.759353178s" podCreationTimestamp="2026-04-16 19:55:32 +0000 UTC" firstStartedPulling="2026-04-16 19:55:38.614981153 +0000 UTC m=+103.029555979" lastFinishedPulling="2026-04-16 19:55:42.115004647 +0000 UTC m=+106.529579486" observedRunningTime="2026-04-16 19:55:42.758864881 +0000 UTC m=+107.173439748" watchObservedRunningTime="2026-04-16 19:55:42.759353178 +0000 UTC m=+107.173928038" Apr 16 19:55:42.774839 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.774774 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69c875bd9d-zwv2w" podStartSLOduration=1.774752979 podStartE2EDuration="1.774752979s" podCreationTimestamp="2026-04-16 19:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:55:42.774024896 +0000 UTC m=+107.188599743" watchObservedRunningTime="2026-04-16 19:55:42.774752979 +0000 UTC m=+107.189327828" Apr 16 19:55:42.819171 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.819129 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:42.819171 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.819179 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:42.824744 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:42.824712 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:43.753780 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:43.753745 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-dd5f9cd9b-p2f59" Apr 16 19:55:50.175497 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.175449 2569 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-67db9885f4-znz5z" podUID="015a1a89-29e1-449f-b569-19b3cce360b4" containerName="registry" containerID="cri-o://f5734ffac9a380b090584464960e355ca079014b2a60c3fe183ff356452e4a93" gracePeriod=30
Apr 16 19:55:50.443713 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.443681 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67db9885f4-znz5z"
Apr 16 19:55:50.572310 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.572267 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-bound-sa-token\") pod \"015a1a89-29e1-449f-b569-19b3cce360b4\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") "
Apr 16 19:55:50.572501 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.572330 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/015a1a89-29e1-449f-b569-19b3cce360b4-ca-trust-extracted\") pod \"015a1a89-29e1-449f-b569-19b3cce360b4\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") "
Apr 16 19:55:50.572501 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.572361 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/015a1a89-29e1-449f-b569-19b3cce360b4-installation-pull-secrets\") pod \"015a1a89-29e1-449f-b569-19b3cce360b4\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") "
Apr 16 19:55:50.572501 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.572397 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls\") pod \"015a1a89-29e1-449f-b569-19b3cce360b4\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") "
Apr 16 19:55:50.572501 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.572466 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/015a1a89-29e1-449f-b569-19b3cce360b4-registry-certificates\") pod \"015a1a89-29e1-449f-b569-19b3cce360b4\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") "
Apr 16 19:55:50.572501 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.572493 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/015a1a89-29e1-449f-b569-19b3cce360b4-trusted-ca\") pod \"015a1a89-29e1-449f-b569-19b3cce360b4\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") "
Apr 16 19:55:50.572771 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.572553 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/015a1a89-29e1-449f-b569-19b3cce360b4-image-registry-private-configuration\") pod \"015a1a89-29e1-449f-b569-19b3cce360b4\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") "
Apr 16 19:55:50.572952 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.572844 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q5lq\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-kube-api-access-2q5lq\") pod \"015a1a89-29e1-449f-b569-19b3cce360b4\" (UID: \"015a1a89-29e1-449f-b569-19b3cce360b4\") "
Apr 16 19:55:50.573040 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.572991 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/015a1a89-29e1-449f-b569-19b3cce360b4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "015a1a89-29e1-449f-b569-19b3cce360b4" (UID: "015a1a89-29e1-449f-b569-19b3cce360b4"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:55:50.573292 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.573271 2569 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/015a1a89-29e1-449f-b569-19b3cce360b4-registry-certificates\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 19:55:50.573381 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.573311 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/015a1a89-29e1-449f-b569-19b3cce360b4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "015a1a89-29e1-449f-b569-19b3cce360b4" (UID: "015a1a89-29e1-449f-b569-19b3cce360b4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:55:50.575396 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.575295 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015a1a89-29e1-449f-b569-19b3cce360b4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "015a1a89-29e1-449f-b569-19b3cce360b4" (UID: "015a1a89-29e1-449f-b569-19b3cce360b4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:55:50.575396 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.575348 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-kube-api-access-2q5lq" (OuterVolumeSpecName: "kube-api-access-2q5lq") pod "015a1a89-29e1-449f-b569-19b3cce360b4" (UID: "015a1a89-29e1-449f-b569-19b3cce360b4"). InnerVolumeSpecName "kube-api-access-2q5lq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:55:50.575568 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.575448 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "015a1a89-29e1-449f-b569-19b3cce360b4" (UID: "015a1a89-29e1-449f-b569-19b3cce360b4"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:55:50.575568 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.575519 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "015a1a89-29e1-449f-b569-19b3cce360b4" (UID: "015a1a89-29e1-449f-b569-19b3cce360b4"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:55:50.575568 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.575557 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015a1a89-29e1-449f-b569-19b3cce360b4-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "015a1a89-29e1-449f-b569-19b3cce360b4" (UID: "015a1a89-29e1-449f-b569-19b3cce360b4"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:55:50.584129 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.584090 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015a1a89-29e1-449f-b569-19b3cce360b4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "015a1a89-29e1-449f-b569-19b3cce360b4" (UID: "015a1a89-29e1-449f-b569-19b3cce360b4"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:55:50.674529 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.674489 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/015a1a89-29e1-449f-b569-19b3cce360b4-trusted-ca\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 19:55:50.674529 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.674522 2569 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/015a1a89-29e1-449f-b569-19b3cce360b4-image-registry-private-configuration\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 19:55:50.674529 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.674534 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2q5lq\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-kube-api-access-2q5lq\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 19:55:50.674786 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.674544 2569 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-bound-sa-token\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 19:55:50.674786 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.674552 2569 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/015a1a89-29e1-449f-b569-19b3cce360b4-ca-trust-extracted\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 19:55:50.674786 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.674562 2569 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/015a1a89-29e1-449f-b569-19b3cce360b4-installation-pull-secrets\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 19:55:50.674786 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.674570 2569 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/015a1a89-29e1-449f-b569-19b3cce360b4-registry-tls\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 19:55:50.772903 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.772820 2569 generic.go:358] "Generic (PLEG): container finished" podID="015a1a89-29e1-449f-b569-19b3cce360b4" containerID="f5734ffac9a380b090584464960e355ca079014b2a60c3fe183ff356452e4a93" exitCode=0
Apr 16 19:55:50.772903 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.772872 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67db9885f4-znz5z" event={"ID":"015a1a89-29e1-449f-b569-19b3cce360b4","Type":"ContainerDied","Data":"f5734ffac9a380b090584464960e355ca079014b2a60c3fe183ff356452e4a93"}
Apr 16 19:55:50.772903 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.772884 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67db9885f4-znz5z"
Apr 16 19:55:50.772903 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.772895 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67db9885f4-znz5z" event={"ID":"015a1a89-29e1-449f-b569-19b3cce360b4","Type":"ContainerDied","Data":"873be18731a2cf480638b7645ab090d3b6f0e9ec0b03c9ae131d0e4e573b6d12"}
Apr 16 19:55:50.773178 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.772910 2569 scope.go:117] "RemoveContainer" containerID="f5734ffac9a380b090584464960e355ca079014b2a60c3fe183ff356452e4a93"
Apr 16 19:55:50.786243 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.786216 2569 scope.go:117] "RemoveContainer" containerID="f5734ffac9a380b090584464960e355ca079014b2a60c3fe183ff356452e4a93"
Apr 16 19:55:50.786629 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:55:50.786570 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5734ffac9a380b090584464960e355ca079014b2a60c3fe183ff356452e4a93\": container with ID starting with f5734ffac9a380b090584464960e355ca079014b2a60c3fe183ff356452e4a93 not found: ID does not exist" containerID="f5734ffac9a380b090584464960e355ca079014b2a60c3fe183ff356452e4a93"
Apr 16 19:55:50.786727 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.786633 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5734ffac9a380b090584464960e355ca079014b2a60c3fe183ff356452e4a93"} err="failed to get container status \"f5734ffac9a380b090584464960e355ca079014b2a60c3fe183ff356452e4a93\": rpc error: code = NotFound desc = could not find container \"f5734ffac9a380b090584464960e355ca079014b2a60c3fe183ff356452e4a93\": container with ID starting with f5734ffac9a380b090584464960e355ca079014b2a60c3fe183ff356452e4a93 not found: ID does not exist"
Apr 16 19:55:50.803543 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.803505 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67db9885f4-znz5z"]
Apr 16 19:55:50.807016 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:50.806989 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-67db9885f4-znz5z"]
Apr 16 19:55:51.777546 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:51.777511 2569 generic.go:358] "Generic (PLEG): container finished" podID="95f11675-707c-4777-8e40-73a4b72aadc9" containerID="52d26845b2c6698ef8a034b324678635b24acb4d1e174d5771ede82663831b91" exitCode=0
Apr 16 19:55:51.778026 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:51.777589 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" event={"ID":"95f11675-707c-4777-8e40-73a4b72aadc9","Type":"ContainerDied","Data":"52d26845b2c6698ef8a034b324678635b24acb4d1e174d5771ede82663831b91"}
Apr 16 19:55:51.778026 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:51.777971 2569 scope.go:117] "RemoveContainer" containerID="52d26845b2c6698ef8a034b324678635b24acb4d1e174d5771ede82663831b91"
Apr 16 19:55:52.166833 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:52.166797 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69c875bd9d-zwv2w"
Apr 16 19:55:52.167032 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:52.166850 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69c875bd9d-zwv2w"
Apr 16 19:55:52.172207 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:52.172176 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69c875bd9d-zwv2w"
Apr 16 19:55:52.185087 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:52.185054 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="015a1a89-29e1-449f-b569-19b3cce360b4" path="/var/lib/kubelet/pods/015a1a89-29e1-449f-b569-19b3cce360b4/volumes"
Apr 16 19:55:52.783076 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:52.783039 2569 generic.go:358] "Generic (PLEG): container finished" podID="c7b46a8f-9a8f-42e0-971b-334f467cc56f" containerID="1940bc45ed6c6eeb5f55afca3036ba236faaa640cd34e7b6fb9fca3f9e8b69c0" exitCode=0
Apr 16 19:55:52.783560 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:52.783119 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k4s7x" event={"ID":"c7b46a8f-9a8f-42e0-971b-334f467cc56f","Type":"ContainerDied","Data":"1940bc45ed6c6eeb5f55afca3036ba236faaa640cd34e7b6fb9fca3f9e8b69c0"}
Apr 16 19:55:52.783666 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:52.783599 2569 scope.go:117] "RemoveContainer" containerID="1940bc45ed6c6eeb5f55afca3036ba236faaa640cd34e7b6fb9fca3f9e8b69c0"
Apr 16 19:55:52.784842 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:52.784809 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vh7nb" event={"ID":"95f11675-707c-4777-8e40-73a4b72aadc9","Type":"ContainerStarted","Data":"ee8d3cc72c05ae0f026ac2d653e8a227bbbfd745ff8cc7f8daba673a67bb7b58"}
Apr 16 19:55:52.788777 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:52.788743 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69c875bd9d-zwv2w"
Apr 16 19:55:52.861515 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:52.861483 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dd5f9cd9b-p2f59"]
Apr 16 19:55:53.789381 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:53.789346 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k4s7x" event={"ID":"c7b46a8f-9a8f-42e0-971b-334f467cc56f","Type":"ContainerStarted","Data":"875ea6a1d74ffba61c633b84d644662266082c8c3676be6a66fc7a585422483d"}
Apr 16 19:55:57.803727 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:57.803684 2569 generic.go:358] "Generic (PLEG): container finished" podID="9a0704cd-b28c-4d5b-9e72-79fcd84527b4" containerID="5cdfbeee74934202e9e58d848a97889833bee873fb1842cafb235eb0001e42da" exitCode=0
Apr 16 19:55:57.804126 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:57.803758 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" event={"ID":"9a0704cd-b28c-4d5b-9e72-79fcd84527b4","Type":"ContainerDied","Data":"5cdfbeee74934202e9e58d848a97889833bee873fb1842cafb235eb0001e42da"}
Apr 16 19:55:57.804126 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:57.804084 2569 scope.go:117] "RemoveContainer" containerID="5cdfbeee74934202e9e58d848a97889833bee873fb1842cafb235eb0001e42da"
Apr 16 19:55:58.808244 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:55:58.808205 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jxvqq" event={"ID":"9a0704cd-b28c-4d5b-9e72-79fcd84527b4","Type":"ContainerStarted","Data":"cbf74c141cf7021db947bd12aeb97856dc2ff62dd5a4b1a9df6117b4bc02463a"}
Apr 16 19:56:17.882811 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:17.882769 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-dd5f9cd9b-p2f59" podUID="479262d1-d0da-49ad-b1f4-9c56fe63709c" containerName="console" containerID="cri-o://f5755746dcf48242d50aecb53b5f20b5001c7f29df13d9036a598ea4e67e2dc5" gracePeriod=15
Apr 16 19:56:18.159112 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.159087 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dd5f9cd9b-p2f59_479262d1-d0da-49ad-b1f4-9c56fe63709c/console/0.log"
Apr 16 19:56:18.159252 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.159153 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dd5f9cd9b-p2f59"
Apr 16 19:56:18.220562 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.220522 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-serving-cert\") pod \"479262d1-d0da-49ad-b1f4-9c56fe63709c\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") "
Apr 16 19:56:18.220757 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.220594 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-oauth-serving-cert\") pod \"479262d1-d0da-49ad-b1f4-9c56fe63709c\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") "
Apr 16 19:56:18.220757 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.220637 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-service-ca\") pod \"479262d1-d0da-49ad-b1f4-9c56fe63709c\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") "
Apr 16 19:56:18.220757 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.220662 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-config\") pod \"479262d1-d0da-49ad-b1f4-9c56fe63709c\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") "
Apr 16 19:56:18.220757 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.220713 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srj4d\" (UniqueName: \"kubernetes.io/projected/479262d1-d0da-49ad-b1f4-9c56fe63709c-kube-api-access-srj4d\") pod \"479262d1-d0da-49ad-b1f4-9c56fe63709c\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") "
Apr 16 19:56:18.220757 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.220751 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-oauth-config\") pod \"479262d1-d0da-49ad-b1f4-9c56fe63709c\" (UID: \"479262d1-d0da-49ad-b1f4-9c56fe63709c\") "
Apr 16 19:56:18.221080 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.221047 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "479262d1-d0da-49ad-b1f4-9c56fe63709c" (UID: "479262d1-d0da-49ad-b1f4-9c56fe63709c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:56:18.221216 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.221074 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-service-ca" (OuterVolumeSpecName: "service-ca") pod "479262d1-d0da-49ad-b1f4-9c56fe63709c" (UID: "479262d1-d0da-49ad-b1f4-9c56fe63709c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:56:18.221216 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.221083 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-config" (OuterVolumeSpecName: "console-config") pod "479262d1-d0da-49ad-b1f4-9c56fe63709c" (UID: "479262d1-d0da-49ad-b1f4-9c56fe63709c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:56:18.222968 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.222938 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479262d1-d0da-49ad-b1f4-9c56fe63709c-kube-api-access-srj4d" (OuterVolumeSpecName: "kube-api-access-srj4d") pod "479262d1-d0da-49ad-b1f4-9c56fe63709c" (UID: "479262d1-d0da-49ad-b1f4-9c56fe63709c"). InnerVolumeSpecName "kube-api-access-srj4d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:56:18.223057 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.222953 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "479262d1-d0da-49ad-b1f4-9c56fe63709c" (UID: "479262d1-d0da-49ad-b1f4-9c56fe63709c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:18.223057 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.223039 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "479262d1-d0da-49ad-b1f4-9c56fe63709c" (UID: "479262d1-d0da-49ad-b1f4-9c56fe63709c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:18.321817 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.321776 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-srj4d\" (UniqueName: \"kubernetes.io/projected/479262d1-d0da-49ad-b1f4-9c56fe63709c-kube-api-access-srj4d\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 19:56:18.321817 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.321813 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-oauth-config\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 19:56:18.322027 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.321829 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-serving-cert\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 19:56:18.322027 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.321844 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-oauth-serving-cert\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 19:56:18.322027 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.321856 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-service-ca\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 19:56:18.322027 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.321868 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/479262d1-d0da-49ad-b1f4-9c56fe63709c-console-config\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 19:56:18.875378 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.875348 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dd5f9cd9b-p2f59_479262d1-d0da-49ad-b1f4-9c56fe63709c/console/0.log"
Apr 16 19:56:18.875557 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.875393 2569 generic.go:358] "Generic (PLEG): container finished" podID="479262d1-d0da-49ad-b1f4-9c56fe63709c" containerID="f5755746dcf48242d50aecb53b5f20b5001c7f29df13d9036a598ea4e67e2dc5" exitCode=2
Apr 16 19:56:18.875557 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.875460 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dd5f9cd9b-p2f59"
Apr 16 19:56:18.875557 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.875489 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd5f9cd9b-p2f59" event={"ID":"479262d1-d0da-49ad-b1f4-9c56fe63709c","Type":"ContainerDied","Data":"f5755746dcf48242d50aecb53b5f20b5001c7f29df13d9036a598ea4e67e2dc5"}
Apr 16 19:56:18.875740 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.875559 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd5f9cd9b-p2f59" event={"ID":"479262d1-d0da-49ad-b1f4-9c56fe63709c","Type":"ContainerDied","Data":"41e6a90b7fe3642200badb037041e650ca6f25128496eb05c17c9fe14d57b2cb"}
Apr 16 19:56:18.875740 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.875593 2569 scope.go:117] "RemoveContainer" containerID="f5755746dcf48242d50aecb53b5f20b5001c7f29df13d9036a598ea4e67e2dc5"
Apr 16 19:56:18.884722 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.884533 2569 scope.go:117] "RemoveContainer" containerID="f5755746dcf48242d50aecb53b5f20b5001c7f29df13d9036a598ea4e67e2dc5"
Apr 16 19:56:18.884956 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:56:18.884843 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5755746dcf48242d50aecb53b5f20b5001c7f29df13d9036a598ea4e67e2dc5\": container with ID starting with f5755746dcf48242d50aecb53b5f20b5001c7f29df13d9036a598ea4e67e2dc5 not found: ID does not exist" containerID="f5755746dcf48242d50aecb53b5f20b5001c7f29df13d9036a598ea4e67e2dc5"
Apr 16 19:56:18.884956 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.884865 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5755746dcf48242d50aecb53b5f20b5001c7f29df13d9036a598ea4e67e2dc5"} err="failed to get container status \"f5755746dcf48242d50aecb53b5f20b5001c7f29df13d9036a598ea4e67e2dc5\": rpc error: code = NotFound desc = could not find container \"f5755746dcf48242d50aecb53b5f20b5001c7f29df13d9036a598ea4e67e2dc5\": container with ID starting with f5755746dcf48242d50aecb53b5f20b5001c7f29df13d9036a598ea4e67e2dc5 not found: ID does not exist"
Apr 16 19:56:18.896224 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.896195 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dd5f9cd9b-p2f59"]
Apr 16 19:56:18.899708 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:18.899680 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-dd5f9cd9b-p2f59"]
Apr 16 19:56:20.187795 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:20.184710 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="479262d1-d0da-49ad-b1f4-9c56fe63709c" path="/var/lib/kubelet/pods/479262d1-d0da-49ad-b1f4-9c56fe63709c/volumes"
Apr 16 19:56:34.950076 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:34.950041 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:56:34.950611 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:34.950541 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="alertmanager" containerID="cri-o://1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24" gracePeriod=120
Apr 16 19:56:34.950711 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:34.950633 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="config-reloader" containerID="cri-o://18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f" gracePeriod=120
Apr 16 19:56:34.950768 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:34.950687 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="kube-rbac-proxy" containerID="cri-o://864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b" gracePeriod=120
Apr 16 19:56:34.950823 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:34.950561 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="prom-label-proxy" containerID="cri-o://e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d" gracePeriod=120
Apr 16 19:56:34.950823 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:34.950635 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="kube-rbac-proxy-metric" containerID="cri-o://74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96" gracePeriod=120
Apr 16 19:56:34.950918 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:34.950633 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="kube-rbac-proxy-web" containerID="cri-o://3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02" gracePeriod=120
Apr 16 19:56:35.937960 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:35.937925 2569 generic.go:358] "Generic (PLEG): container finished" podID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerID="e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d" exitCode=0
Apr 16 19:56:35.937960 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:35.937952 2569 generic.go:358] "Generic (PLEG): container finished" podID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerID="864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b" exitCode=0
Apr 16 19:56:35.937960 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:35.937959 2569 generic.go:358] "Generic (PLEG): container finished" podID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerID="18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f" exitCode=0
Apr 16 19:56:35.937960 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:35.937965 2569 generic.go:358] "Generic (PLEG): container finished" podID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerID="1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24" exitCode=0
Apr 16 19:56:35.938267 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:35.938003 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2199094c-50c5-4e79-af74-8d28fb1e8b9d","Type":"ContainerDied","Data":"e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d"}
Apr 16 19:56:35.938267 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:35.938048 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2199094c-50c5-4e79-af74-8d28fb1e8b9d","Type":"ContainerDied","Data":"864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b"}
Apr 16 19:56:35.938267 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:35.938062 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2199094c-50c5-4e79-af74-8d28fb1e8b9d","Type":"ContainerDied","Data":"18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f"}
Apr 16 19:56:35.938267 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:35.938074 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2199094c-50c5-4e79-af74-8d28fb1e8b9d","Type":"ContainerDied","Data":"1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24"}
Apr 16 19:56:36.201723 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.201653 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:56:36.373755 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.373722 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtjhd\" (UniqueName: \"kubernetes.io/projected/2199094c-50c5-4e79-af74-8d28fb1e8b9d-kube-api-access-jtjhd\") pod \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") "
Apr 16 19:56:36.373928 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.373768 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2199094c-50c5-4e79-af74-8d28fb1e8b9d-metrics-client-ca\") pod \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") "
Apr 16 19:56:36.373928 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.373788 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-cluster-tls-config\") pod \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") "
Apr 16 19:56:36.373928 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.373827 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2199094c-50c5-4e79-af74-8d28fb1e8b9d-tls-assets\") pod \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") "
Apr 16 19:56:36.373928 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.373864 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2199094c-50c5-4e79-af74-8d28fb1e8b9d-alertmanager-trusted-ca-bundle\") pod \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") "
Apr 16 19:56:36.373928 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.373891 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") "
Apr 16 19:56:36.374192 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.373937 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy\") pod \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") "
Apr 16 19:56:36.374192 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.373987 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2199094c-50c5-4e79-af74-8d28fb1e8b9d-config-out\") pod \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") "
Apr 16 19:56:36.374301 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.374214 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2199094c-50c5-4e79-af74-8d28fb1e8b9d-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "2199094c-50c5-4e79-af74-8d28fb1e8b9d" (UID: "2199094c-50c5-4e79-af74-8d28fb1e8b9d"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:56:36.374382 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.374357 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-main-tls\") pod \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") "
Apr 16 19:56:36.374437 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.374382 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2199094c-50c5-4e79-af74-8d28fb1e8b9d-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "2199094c-50c5-4e79-af74-8d28fb1e8b9d" (UID: "2199094c-50c5-4e79-af74-8d28fb1e8b9d"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:56:36.374437 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.374409 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-config-volume\") pod \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") "
Apr 16 19:56:36.374537 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.374444 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2199094c-50c5-4e79-af74-8d28fb1e8b9d-alertmanager-main-db\") pod \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") "
Apr 16 19:56:36.374537 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.374476 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-web-config\") pod \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") "
Apr 16 19:56:36.374537 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.374504 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy-web\") pod \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\" (UID: \"2199094c-50c5-4e79-af74-8d28fb1e8b9d\") "
Apr 16 19:56:36.374823 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.374801 2569 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2199094c-50c5-4e79-af74-8d28fb1e8b9d-metrics-client-ca\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 19:56:36.374911 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.374832 2569
reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2199094c-50c5-4e79-af74-8d28fb1e8b9d-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:56:36.376860 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.376816 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "2199094c-50c5-4e79-af74-8d28fb1e8b9d" (UID: "2199094c-50c5-4e79-af74-8d28fb1e8b9d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:36.376860 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.376822 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2199094c-50c5-4e79-af74-8d28fb1e8b9d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "2199094c-50c5-4e79-af74-8d28fb1e8b9d" (UID: "2199094c-50c5-4e79-af74-8d28fb1e8b9d"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:56:36.377022 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.376928 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2199094c-50c5-4e79-af74-8d28fb1e8b9d-config-out" (OuterVolumeSpecName: "config-out") pod "2199094c-50c5-4e79-af74-8d28fb1e8b9d" (UID: "2199094c-50c5-4e79-af74-8d28fb1e8b9d"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:56:36.377277 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.377206 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2199094c-50c5-4e79-af74-8d28fb1e8b9d-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "2199094c-50c5-4e79-af74-8d28fb1e8b9d" (UID: "2199094c-50c5-4e79-af74-8d28fb1e8b9d"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:56:36.377611 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.377504 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "2199094c-50c5-4e79-af74-8d28fb1e8b9d" (UID: "2199094c-50c5-4e79-af74-8d28fb1e8b9d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:36.377611 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.377519 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-config-volume" (OuterVolumeSpecName: "config-volume") pod "2199094c-50c5-4e79-af74-8d28fb1e8b9d" (UID: "2199094c-50c5-4e79-af74-8d28fb1e8b9d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:36.377611 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.377540 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "2199094c-50c5-4e79-af74-8d28fb1e8b9d" (UID: "2199094c-50c5-4e79-af74-8d28fb1e8b9d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:36.377831 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.377604 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2199094c-50c5-4e79-af74-8d28fb1e8b9d-kube-api-access-jtjhd" (OuterVolumeSpecName: "kube-api-access-jtjhd") pod "2199094c-50c5-4e79-af74-8d28fb1e8b9d" (UID: "2199094c-50c5-4e79-af74-8d28fb1e8b9d"). InnerVolumeSpecName "kube-api-access-jtjhd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:56:36.378832 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.378802 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "2199094c-50c5-4e79-af74-8d28fb1e8b9d" (UID: "2199094c-50c5-4e79-af74-8d28fb1e8b9d"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:36.381411 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.381379 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "2199094c-50c5-4e79-af74-8d28fb1e8b9d" (UID: "2199094c-50c5-4e79-af74-8d28fb1e8b9d"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:36.390274 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.390236 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-web-config" (OuterVolumeSpecName: "web-config") pod "2199094c-50c5-4e79-af74-8d28fb1e8b9d" (UID: "2199094c-50c5-4e79-af74-8d28fb1e8b9d"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:36.475812 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.475726 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2199094c-50c5-4e79-af74-8d28fb1e8b9d-tls-assets\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:56:36.475812 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.475757 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:56:36.475812 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.475768 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:56:36.475812 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.475778 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2199094c-50c5-4e79-af74-8d28fb1e8b9d-config-out\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:56:36.475812 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.475787 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-main-tls\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:56:36.475812 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.475796 2569 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-config-volume\") on node 
\"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:56:36.475812 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.475806 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2199094c-50c5-4e79-af74-8d28fb1e8b9d-alertmanager-main-db\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:56:36.475812 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.475814 2569 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-web-config\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:56:36.476142 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.475825 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:56:36.476142 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.475834 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jtjhd\" (UniqueName: \"kubernetes.io/projected/2199094c-50c5-4e79-af74-8d28fb1e8b9d-kube-api-access-jtjhd\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:56:36.476142 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.475843 2569 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2199094c-50c5-4e79-af74-8d28fb1e8b9d-cluster-tls-config\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:56:36.943755 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.943721 2569 generic.go:358] "Generic (PLEG): container finished" podID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerID="74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96" exitCode=0 Apr 16 19:56:36.943755 ip-10-0-139-205 
kubenswrapper[2569]: I0416 19:56:36.943750 2569 generic.go:358] "Generic (PLEG): container finished" podID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerID="3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02" exitCode=0 Apr 16 19:56:36.943955 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.943814 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2199094c-50c5-4e79-af74-8d28fb1e8b9d","Type":"ContainerDied","Data":"74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96"} Apr 16 19:56:36.943955 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.943827 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:36.943955 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.943854 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2199094c-50c5-4e79-af74-8d28fb1e8b9d","Type":"ContainerDied","Data":"3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02"} Apr 16 19:56:36.943955 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.943870 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2199094c-50c5-4e79-af74-8d28fb1e8b9d","Type":"ContainerDied","Data":"4d03e8f65f5dfc55e9c5c18dccc66959dff09454c3dff2df0aeb07e568cbb5ed"} Apr 16 19:56:36.943955 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.943885 2569 scope.go:117] "RemoveContainer" containerID="e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d" Apr 16 19:56:36.952540 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.952515 2569 scope.go:117] "RemoveContainer" containerID="74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96" Apr 16 19:56:36.959828 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.959804 2569 scope.go:117] "RemoveContainer" 
containerID="864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b" Apr 16 19:56:36.966505 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.966476 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:56:36.968595 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.968548 2569 scope.go:117] "RemoveContainer" containerID="3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02" Apr 16 19:56:36.969862 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.969838 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:56:36.976124 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.976104 2569 scope.go:117] "RemoveContainer" containerID="18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f" Apr 16 19:56:36.983878 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.983850 2569 scope.go:117] "RemoveContainer" containerID="1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24" Apr 16 19:56:36.991557 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.991528 2569 scope.go:117] "RemoveContainer" containerID="539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1" Apr 16 19:56:36.993038 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993012 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:56:36.993369 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993354 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="kube-rbac-proxy-metric" Apr 16 19:56:36.993441 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993381 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="kube-rbac-proxy-metric" Apr 16 19:56:36.993441 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993391 2569 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="prom-label-proxy" Apr 16 19:56:36.993441 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993396 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="prom-label-proxy" Apr 16 19:56:36.993441 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993412 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="015a1a89-29e1-449f-b569-19b3cce360b4" containerName="registry" Apr 16 19:56:36.993441 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993420 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="015a1a89-29e1-449f-b569-19b3cce360b4" containerName="registry" Apr 16 19:56:36.993441 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993430 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="alertmanager" Apr 16 19:56:36.993441 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993435 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="alertmanager" Apr 16 19:56:36.993441 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993441 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="kube-rbac-proxy" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993446 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="kube-rbac-proxy" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993460 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="init-config-reloader" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993465 
2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="init-config-reloader" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993470 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="config-reloader" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993476 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="config-reloader" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993488 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="479262d1-d0da-49ad-b1f4-9c56fe63709c" containerName="console" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993495 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="479262d1-d0da-49ad-b1f4-9c56fe63709c" containerName="console" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993502 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="kube-rbac-proxy-web" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993507 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="kube-rbac-proxy-web" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993557 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="015a1a89-29e1-449f-b569-19b3cce360b4" containerName="registry" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993568 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="kube-rbac-proxy-metric" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993593 
2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="prom-label-proxy" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993601 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="alertmanager" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993609 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="kube-rbac-proxy-web" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993615 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="kube-rbac-proxy" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993622 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" containerName="config-reloader" Apr 16 19:56:36.993778 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.993628 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="479262d1-d0da-49ad-b1f4-9c56fe63709c" containerName="console" Apr 16 19:56:36.998833 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.998805 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:36.999264 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.999218 2569 scope.go:117] "RemoveContainer" containerID="e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d" Apr 16 19:56:36.999586 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:56:36.999551 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d\": container with ID starting with e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d not found: ID does not exist" containerID="e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d" Apr 16 19:56:36.999648 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.999602 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d"} err="failed to get container status \"e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d\": rpc error: code = NotFound desc = could not find container \"e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d\": container with ID starting with e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d not found: ID does not exist" Apr 16 19:56:36.999648 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.999627 2569 scope.go:117] "RemoveContainer" containerID="74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96" Apr 16 19:56:36.999968 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:56:36.999948 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96\": container with ID starting with 74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96 not found: ID does not exist" 
containerID="74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96" Apr 16 19:56:37.000012 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.999975 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96"} err="failed to get container status \"74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96\": rpc error: code = NotFound desc = could not find container \"74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96\": container with ID starting with 74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96 not found: ID does not exist" Apr 16 19:56:37.000012 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:36.999993 2569 scope.go:117] "RemoveContainer" containerID="864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b" Apr 16 19:56:37.000245 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:56:37.000223 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b\": container with ID starting with 864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b not found: ID does not exist" containerID="864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b" Apr 16 19:56:37.000337 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.000251 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b"} err="failed to get container status \"864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b\": rpc error: code = NotFound desc = could not find container \"864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b\": container with ID starting with 864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b not found: ID does not exist" Apr 16 
19:56:37.000337 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.000271 2569 scope.go:117] "RemoveContainer" containerID="3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02" Apr 16 19:56:37.000540 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:56:37.000524 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02\": container with ID starting with 3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02 not found: ID does not exist" containerID="3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02" Apr 16 19:56:37.000620 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.000542 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02"} err="failed to get container status \"3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02\": rpc error: code = NotFound desc = could not find container \"3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02\": container with ID starting with 3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02 not found: ID does not exist" Apr 16 19:56:37.000620 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.000555 2569 scope.go:117] "RemoveContainer" containerID="18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f" Apr 16 19:56:37.000821 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:56:37.000804 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f\": container with ID starting with 18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f not found: ID does not exist" containerID="18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f" Apr 16 19:56:37.000868 
ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.000825 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f"} err="failed to get container status \"18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f\": rpc error: code = NotFound desc = could not find container \"18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f\": container with ID starting with 18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f not found: ID does not exist" Apr 16 19:56:37.000868 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.000838 2569 scope.go:117] "RemoveContainer" containerID="1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24" Apr 16 19:56:37.001108 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.001088 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 19:56:37.001108 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:56:37.001097 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24\": container with ID starting with 1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24 not found: ID does not exist" containerID="1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24" Apr 16 19:56:37.001265 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.001134 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 19:56:37.001265 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.001182 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 19:56:37.001265 ip-10-0-139-205 kubenswrapper[2569]: I0416 
19:56:37.001186 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 19:56:37.001265 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.001127 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24"} err="failed to get container status \"1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24\": rpc error: code = NotFound desc = could not find container \"1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24\": container with ID starting with 1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24 not found: ID does not exist" Apr 16 19:56:37.001265 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.001235 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 19:56:37.001265 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.001249 2569 scope.go:117] "RemoveContainer" containerID="539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1" Apr 16 19:56:37.001265 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.001134 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 19:56:37.001650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.001270 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-4ls6g\"" Apr 16 19:56:37.001650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.001261 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 19:56:37.001650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.001376 2569 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 19:56:37.001650 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:56:37.001604 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1\": container with ID starting with 539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1 not found: ID does not exist" containerID="539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1" Apr 16 19:56:37.001650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.001623 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1"} err="failed to get container status \"539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1\": rpc error: code = NotFound desc = could not find container \"539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1\": container with ID starting with 539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1 not found: ID does not exist" Apr 16 19:56:37.001650 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.001638 2569 scope.go:117] "RemoveContainer" containerID="e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d" Apr 16 19:56:37.001928 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.001892 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d"} err="failed to get container status \"e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d\": rpc error: code = NotFound desc = could not find container \"e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d\": container with ID starting with e3f3263612549005b5c5982d6d074d9658e3fb5434c54414e6732077ff545f3d not found: ID does not 
exist" Apr 16 19:56:37.001928 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.001909 2569 scope.go:117] "RemoveContainer" containerID="74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96" Apr 16 19:56:37.002169 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.002147 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96"} err="failed to get container status \"74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96\": rpc error: code = NotFound desc = could not find container \"74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96\": container with ID starting with 74b3892482210dea89d41b58cfc26ece81e809882f9cf150f625b45daa868f96 not found: ID does not exist" Apr 16 19:56:37.002242 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.002171 2569 scope.go:117] "RemoveContainer" containerID="864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b" Apr 16 19:56:37.002443 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.002419 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b"} err="failed to get container status \"864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b\": rpc error: code = NotFound desc = could not find container \"864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b\": container with ID starting with 864ab27ce6c9c405c4a635ba8188a4bdb1258524219f66cae0a3dbc3f8466c2b not found: ID does not exist" Apr 16 19:56:37.002498 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.002444 2569 scope.go:117] "RemoveContainer" containerID="3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02" Apr 16 19:56:37.002841 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.002807 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02"} err="failed to get container status \"3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02\": rpc error: code = NotFound desc = could not find container \"3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02\": container with ID starting with 3356f39cf90c31d560394860672743a2cc1f936b1e837a2cf854838d6d358b02 not found: ID does not exist" Apr 16 19:56:37.002841 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.002840 2569 scope.go:117] "RemoveContainer" containerID="18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f" Apr 16 19:56:37.003109 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.003091 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f"} err="failed to get container status \"18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f\": rpc error: code = NotFound desc = could not find container \"18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f\": container with ID starting with 18e539f638a6c0db7bd6b4eb443edd4af241e73bbaca79fe7d47b0c0e2c7e49f not found: ID does not exist" Apr 16 19:56:37.003164 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.003109 2569 scope.go:117] "RemoveContainer" containerID="1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24" Apr 16 19:56:37.003354 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.003335 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24"} err="failed to get container status \"1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24\": rpc error: code = NotFound desc = could not find container \"1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24\": container with ID starting with 
1ef9dd2cb8e49ab20e68900f3c094449d2c8cdf2a0846bbd5d24021a99348e24 not found: ID does not exist" Apr 16 19:56:37.003407 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.003356 2569 scope.go:117] "RemoveContainer" containerID="539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1" Apr 16 19:56:37.003632 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.003609 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1"} err="failed to get container status \"539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1\": rpc error: code = NotFound desc = could not find container \"539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1\": container with ID starting with 539ee2a671e9ab067cf401213bd5f07afd47c6302b5699fd3dc9d7b63f665ee1 not found: ID does not exist" Apr 16 19:56:37.007159 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.006919 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 19:56:37.007633 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.007614 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:56:37.080034 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.079990 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bf94de88-5e28-47c2-b79a-0d38938b1c5c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.080209 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.080047 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.080209 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.080082 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf94de88-5e28-47c2-b79a-0d38938b1c5c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.080209 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.080134 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bf94de88-5e28-47c2-b79a-0d38938b1c5c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.080209 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.080165 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.080209 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.080189 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-config-volume\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.080369 ip-10-0-139-205 kubenswrapper[2569]: I0416 
19:56:37.080274 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bf94de88-5e28-47c2-b79a-0d38938b1c5c-config-out\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.080369 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.080317 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.080369 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.080343 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-web-config\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.080458 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.080369 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.080458 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.080390 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf94de88-5e28-47c2-b79a-0d38938b1c5c-alertmanager-trusted-ca-bundle\") 
pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.080458 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.080421 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njs66\" (UniqueName: \"kubernetes.io/projected/bf94de88-5e28-47c2-b79a-0d38938b1c5c-kube-api-access-njs66\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.080458 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.080454 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.181485 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.181434 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-web-config\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.181485 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.181488 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.181731 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.181517 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf94de88-5e28-47c2-b79a-0d38938b1c5c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.181731 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.181537 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njs66\" (UniqueName: \"kubernetes.io/projected/bf94de88-5e28-47c2-b79a-0d38938b1c5c-kube-api-access-njs66\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.181731 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.181569 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.181731 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.181627 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bf94de88-5e28-47c2-b79a-0d38938b1c5c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.181731 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.181652 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.181731 ip-10-0-139-205 
kubenswrapper[2569]: I0416 19:56:37.181668 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf94de88-5e28-47c2-b79a-0d38938b1c5c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.181731 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.181693 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bf94de88-5e28-47c2-b79a-0d38938b1c5c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.181731 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.181718 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.182168 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.181752 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-config-volume\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.182168 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.181786 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bf94de88-5e28-47c2-b79a-0d38938b1c5c-config-out\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
19:56:37.182168 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.181827 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.182314 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.182238 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bf94de88-5e28-47c2-b79a-0d38938b1c5c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.182645 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.182618 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf94de88-5e28-47c2-b79a-0d38938b1c5c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.183458 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.183391 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf94de88-5e28-47c2-b79a-0d38938b1c5c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.184744 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.184720 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-secret-alertmanager-kube-rbac-proxy-metric\") 
pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.185324 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.185278 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-web-config\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.185424 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.185375 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.185424 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.185406 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-config-volume\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.185599 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.185554 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.185737 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.185715 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bf94de88-5e28-47c2-b79a-0d38938b1c5c-config-out\") pod 
\"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.185798 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.185771 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bf94de88-5e28-47c2-b79a-0d38938b1c5c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.185856 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.185841 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.186422 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.186402 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bf94de88-5e28-47c2-b79a-0d38938b1c5c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.189648 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.189624 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njs66\" (UniqueName: \"kubernetes.io/projected/bf94de88-5e28-47c2-b79a-0d38938b1c5c-kube-api-access-njs66\") pod \"alertmanager-main-0\" (UID: \"bf94de88-5e28-47c2-b79a-0d38938b1c5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.311498 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.311405 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:37.442121 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.442096 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:56:37.444302 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:56:37.444270 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf94de88_5e28_47c2_b79a_0d38938b1c5c.slice/crio-d40ce6b0beaff1c07a247133e7235f5b86b3635b78f51a616bfb06df5c9c79a5 WatchSource:0}: Error finding container d40ce6b0beaff1c07a247133e7235f5b86b3635b78f51a616bfb06df5c9c79a5: Status 404 returned error can't find the container with id d40ce6b0beaff1c07a247133e7235f5b86b3635b78f51a616bfb06df5c9c79a5 Apr 16 19:56:37.951629 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.951590 2569 generic.go:358] "Generic (PLEG): container finished" podID="bf94de88-5e28-47c2-b79a-0d38938b1c5c" containerID="7b9162db14b5f92dd4bed340d9c03f417e9a792992930975c4b8abe11aa159a7" exitCode=0 Apr 16 19:56:37.951803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.951672 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf94de88-5e28-47c2-b79a-0d38938b1c5c","Type":"ContainerDied","Data":"7b9162db14b5f92dd4bed340d9c03f417e9a792992930975c4b8abe11aa159a7"} Apr 16 19:56:37.951803 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:37.951706 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf94de88-5e28-47c2-b79a-0d38938b1c5c","Type":"ContainerStarted","Data":"d40ce6b0beaff1c07a247133e7235f5b86b3635b78f51a616bfb06df5c9c79a5"} Apr 16 19:56:38.184030 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.184002 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2199094c-50c5-4e79-af74-8d28fb1e8b9d" 
path="/var/lib/kubelet/pods/2199094c-50c5-4e79-af74-8d28fb1e8b9d/volumes" Apr 16 19:56:38.958917 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.958880 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf94de88-5e28-47c2-b79a-0d38938b1c5c","Type":"ContainerStarted","Data":"fbec20dbf73de07e90362fec12d6c2f4af920dc688e564f3211b1ef4a226fa50"} Apr 16 19:56:38.958917 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.958923 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf94de88-5e28-47c2-b79a-0d38938b1c5c","Type":"ContainerStarted","Data":"f3d70bcb357e1bc3674e464c240d937143b4bb50765516c295253ecbfb88f79f"} Apr 16 19:56:38.959470 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.958934 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf94de88-5e28-47c2-b79a-0d38938b1c5c","Type":"ContainerStarted","Data":"7de47f199cb654070fbe43802f2ba59d1a89f46d16962746347d4ae09b32e0e0"} Apr 16 19:56:38.959470 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.958944 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf94de88-5e28-47c2-b79a-0d38938b1c5c","Type":"ContainerStarted","Data":"1b73ce852b1650496bdb7562ef4f00b0fd63d29fb0a451d3204a1cc406788555"} Apr 16 19:56:38.959470 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.958953 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bf94de88-5e28-47c2-b79a-0d38938b1c5c","Type":"ContainerStarted","Data":"8aa49eeb560550fad55093df1409b2641dea969212c9b2c6f7954494128c80f0"} Apr 16 19:56:38.959470 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.958962 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"bf94de88-5e28-47c2-b79a-0d38938b1c5c","Type":"ContainerStarted","Data":"8749d3ffca20f9ddcd6800eb47bbd65942229e338fc1adc5ab156590fdb72a06"} Apr 16 19:56:38.965473 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.965439 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7b9785d66f-s2bwr"] Apr 16 19:56:38.969170 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.969145 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:38.971167 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.971140 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 19:56:38.971735 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.971718 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 19:56:38.971812 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.971762 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 19:56:38.971866 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.971852 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 19:56:38.972112 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.972096 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 19:56:38.972376 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.972357 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-wwsws\"" Apr 16 19:56:38.979372 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.979343 2569 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 19:56:38.979642 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.979620 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7b9785d66f-s2bwr"] Apr 16 19:56:38.992304 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:38.992247 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.992229199 podStartE2EDuration="2.992229199s" podCreationTimestamp="2026-04-16 19:56:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:56:38.990757179 +0000 UTC m=+163.405332051" watchObservedRunningTime="2026-04-16 19:56:38.992229199 +0000 UTC m=+163.406804048" Apr 16 19:56:39.105948 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.105897 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/440343c1-f829-47fe-9627-0e58df180985-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.106161 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.105984 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/440343c1-f829-47fe-9627-0e58df180985-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.106161 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.106047 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/440343c1-f829-47fe-9627-0e58df180985-secret-telemeter-client\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.106161 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.106098 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/440343c1-f829-47fe-9627-0e58df180985-serving-certs-ca-bundle\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.106161 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.106127 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/440343c1-f829-47fe-9627-0e58df180985-metrics-client-ca\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.106355 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.106200 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/440343c1-f829-47fe-9627-0e58df180985-telemeter-client-tls\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.106355 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.106229 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/440343c1-f829-47fe-9627-0e58df180985-federate-client-tls\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.106441 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.106419 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd7cr\" (UniqueName: \"kubernetes.io/projected/440343c1-f829-47fe-9627-0e58df180985-kube-api-access-pd7cr\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.207120 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.207085 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/440343c1-f829-47fe-9627-0e58df180985-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.207215 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.207138 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/440343c1-f829-47fe-9627-0e58df180985-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.207215 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.207169 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/440343c1-f829-47fe-9627-0e58df180985-secret-telemeter-client\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: 
\"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.207215 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.207191 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/440343c1-f829-47fe-9627-0e58df180985-serving-certs-ca-bundle\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.207362 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.207213 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/440343c1-f829-47fe-9627-0e58df180985-metrics-client-ca\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.207362 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.207250 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/440343c1-f829-47fe-9627-0e58df180985-telemeter-client-tls\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.207362 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.207268 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/440343c1-f829-47fe-9627-0e58df180985-federate-client-tls\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.207362 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.207312 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pd7cr\" (UniqueName: \"kubernetes.io/projected/440343c1-f829-47fe-9627-0e58df180985-kube-api-access-pd7cr\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.208084 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.208048 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/440343c1-f829-47fe-9627-0e58df180985-metrics-client-ca\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.208218 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.208169 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/440343c1-f829-47fe-9627-0e58df180985-serving-certs-ca-bundle\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.208286 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.208242 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/440343c1-f829-47fe-9627-0e58df180985-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.210203 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.210144 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/440343c1-f829-47fe-9627-0e58df180985-federate-client-tls\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: 
\"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.210300 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.210253 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/440343c1-f829-47fe-9627-0e58df180985-telemeter-client-tls\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.210300 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.210275 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/440343c1-f829-47fe-9627-0e58df180985-secret-telemeter-client\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.210373 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.210331 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/440343c1-f829-47fe-9627-0e58df180985-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.214425 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.214402 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd7cr\" (UniqueName: \"kubernetes.io/projected/440343c1-f829-47fe-9627-0e58df180985-kube-api-access-pd7cr\") pod \"telemeter-client-7b9785d66f-s2bwr\" (UID: \"440343c1-f829-47fe-9627-0e58df180985\") " pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.281829 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.281797 2569 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" Apr 16 19:56:39.411612 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.411479 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7b9785d66f-s2bwr"] Apr 16 19:56:39.418345 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:56:39.418314 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod440343c1_f829_47fe_9627_0e58df180985.slice/crio-f110a43b4edd61decb7b986767ce9524e557058d5944846c0bdf4ccff761ffff WatchSource:0}: Error finding container f110a43b4edd61decb7b986767ce9524e557058d5944846c0bdf4ccff761ffff: Status 404 returned error can't find the container with id f110a43b4edd61decb7b986767ce9524e557058d5944846c0bdf4ccff761ffff Apr 16 19:56:39.964211 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:39.964175 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" event={"ID":"440343c1-f829-47fe-9627-0e58df180985","Type":"ContainerStarted","Data":"f110a43b4edd61decb7b986767ce9524e557058d5944846c0bdf4ccff761ffff"} Apr 16 19:56:41.973820 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:41.973782 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" event={"ID":"440343c1-f829-47fe-9627-0e58df180985","Type":"ContainerStarted","Data":"7fe6916bff6baefa0f77ac8e4963b26b612513b5aebf615bd985f48f9bd623cb"} Apr 16 19:56:41.973820 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:41.973821 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" event={"ID":"440343c1-f829-47fe-9627-0e58df180985","Type":"ContainerStarted","Data":"145176053b7dd05d28cbaa1caf9b850b2370d64b62978dc3d43f0ae49f3ff732"} Apr 16 19:56:41.974296 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:41.973831 
2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" event={"ID":"440343c1-f829-47fe-9627-0e58df180985","Type":"ContainerStarted","Data":"f578aa5b5a852e1125c292e5eaa1f4b9c06596ac8754ca8fc24fb1e080f139db"} Apr 16 19:56:42.007506 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.007439 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7b9785d66f-s2bwr" podStartSLOduration=2.326602402 podStartE2EDuration="4.0074252s" podCreationTimestamp="2026-04-16 19:56:38 +0000 UTC" firstStartedPulling="2026-04-16 19:56:39.420295313 +0000 UTC m=+163.834870139" lastFinishedPulling="2026-04-16 19:56:41.101118107 +0000 UTC m=+165.515692937" observedRunningTime="2026-04-16 19:56:42.00640354 +0000 UTC m=+166.420978387" watchObservedRunningTime="2026-04-16 19:56:42.0074252 +0000 UTC m=+166.422000047" Apr 16 19:56:42.688474 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.687910 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c6848c758-ntsrd"] Apr 16 19:56:42.692049 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.692024 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.702929 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.702899 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c6848c758-ntsrd"] Apr 16 19:56:42.844219 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.844178 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-trusted-ca-bundle\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.844219 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.844228 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-oauth-serving-cert\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.844527 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.844245 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96q5r\" (UniqueName: \"kubernetes.io/projected/877ee0d8-4699-492b-990c-9ccaf9c8452a-kube-api-access-96q5r\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.844527 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.844277 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-service-ca\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 
19:56:42.844527 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.844322 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-serving-cert\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.844527 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.844392 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-config\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.844527 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.844452 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-oauth-config\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.945147 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.945046 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-trusted-ca-bundle\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.945147 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.945098 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-oauth-serving-cert\") pod 
\"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.945147 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.945144 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96q5r\" (UniqueName: \"kubernetes.io/projected/877ee0d8-4699-492b-990c-9ccaf9c8452a-kube-api-access-96q5r\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.945432 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.945174 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-service-ca\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.945432 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.945241 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-serving-cert\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.945432 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.945286 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-config\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.945432 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.945319 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-oauth-config\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.946101 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.946067 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-oauth-serving-cert\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.946239 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.946115 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-service-ca\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.946239 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.946124 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-config\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.946239 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.946173 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-trusted-ca-bundle\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.947932 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.947911 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-oauth-config\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.948066 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.948048 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-serving-cert\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:42.953679 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:42.953657 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96q5r\" (UniqueName: \"kubernetes.io/projected/877ee0d8-4699-492b-990c-9ccaf9c8452a-kube-api-access-96q5r\") pod \"console-c6848c758-ntsrd\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:43.004505 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:43.004449 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:43.132970 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:43.132929 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c6848c758-ntsrd"] Apr 16 19:56:43.982900 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:43.982862 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6848c758-ntsrd" event={"ID":"877ee0d8-4699-492b-990c-9ccaf9c8452a","Type":"ContainerStarted","Data":"4883410531ecf436337808109a5c383652539a2b0a1bd50f1f5177ce154e0ab2"} Apr 16 19:56:43.982900 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:43.982902 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6848c758-ntsrd" event={"ID":"877ee0d8-4699-492b-990c-9ccaf9c8452a","Type":"ContainerStarted","Data":"292906d3b49dbdae07403bcc74713fc83cace3f3a0a7087ca023c0a9e34f4a10"} Apr 16 19:56:44.002177 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:44.002110 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c6848c758-ntsrd" podStartSLOduration=2.002091411 podStartE2EDuration="2.002091411s" podCreationTimestamp="2026-04-16 19:56:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:56:43.999925392 +0000 UTC m=+168.414500241" watchObservedRunningTime="2026-04-16 19:56:44.002091411 +0000 UTC m=+168.416666261" Apr 16 19:56:53.005638 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:53.005504 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:53.005638 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:53.005565 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:53.010514 ip-10-0-139-205 kubenswrapper[2569]: 
I0416 19:56:53.010484 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:53.019039 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:53.019011 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:56:53.075553 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:56:53.075515 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69c875bd9d-zwv2w"] Apr 16 19:57:18.099174 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.099110 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-69c875bd9d-zwv2w" podUID="d45a4281-5995-4b7a-aa2c-88f9d4490bbe" containerName="console" containerID="cri-o://ba7eee8102ca05725d0fd7ae93020dea2bea120d95391623e8249a199f241c16" gracePeriod=15 Apr 16 19:57:18.332676 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.332650 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69c875bd9d-zwv2w_d45a4281-5995-4b7a-aa2c-88f9d4490bbe/console/0.log" Apr 16 19:57:18.332810 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.332715 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:57:18.458363 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.458323 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-service-ca\") pod \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " Apr 16 19:57:18.458557 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.458369 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-trusted-ca-bundle\") pod \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " Apr 16 19:57:18.458557 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.458400 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-oauth-config\") pod \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " Apr 16 19:57:18.458557 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.458426 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-config\") pod \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " Apr 16 19:57:18.458557 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.458448 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-oauth-serving-cert\") pod \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " Apr 16 19:57:18.458557 ip-10-0-139-205 
kubenswrapper[2569]: I0416 19:57:18.458468 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mmcq\" (UniqueName: \"kubernetes.io/projected/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-kube-api-access-7mmcq\") pod \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " Apr 16 19:57:18.458557 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.458525 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-serving-cert\") pod \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\" (UID: \"d45a4281-5995-4b7a-aa2c-88f9d4490bbe\") " Apr 16 19:57:18.458884 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.458851 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-service-ca" (OuterVolumeSpecName: "service-ca") pod "d45a4281-5995-4b7a-aa2c-88f9d4490bbe" (UID: "d45a4281-5995-4b7a-aa2c-88f9d4490bbe"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:57:18.458933 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.458878 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-config" (OuterVolumeSpecName: "console-config") pod "d45a4281-5995-4b7a-aa2c-88f9d4490bbe" (UID: "d45a4281-5995-4b7a-aa2c-88f9d4490bbe"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:57:18.459074 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.459050 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d45a4281-5995-4b7a-aa2c-88f9d4490bbe" (UID: "d45a4281-5995-4b7a-aa2c-88f9d4490bbe"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:57:18.459166 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.459142 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d45a4281-5995-4b7a-aa2c-88f9d4490bbe" (UID: "d45a4281-5995-4b7a-aa2c-88f9d4490bbe"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:57:18.460791 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.460760 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d45a4281-5995-4b7a-aa2c-88f9d4490bbe" (UID: "d45a4281-5995-4b7a-aa2c-88f9d4490bbe"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:57:18.460902 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.460783 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-kube-api-access-7mmcq" (OuterVolumeSpecName: "kube-api-access-7mmcq") pod "d45a4281-5995-4b7a-aa2c-88f9d4490bbe" (UID: "d45a4281-5995-4b7a-aa2c-88f9d4490bbe"). InnerVolumeSpecName "kube-api-access-7mmcq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:57:18.460902 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.460809 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d45a4281-5995-4b7a-aa2c-88f9d4490bbe" (UID: "d45a4281-5995-4b7a-aa2c-88f9d4490bbe"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:57:18.559652 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.559608 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-service-ca\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:57:18.559652 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.559645 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-trusted-ca-bundle\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:57:18.559652 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.559658 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-oauth-config\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:57:18.559913 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.559672 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-config\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:57:18.559913 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.559684 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-oauth-serving-cert\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:57:18.559913 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.559696 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7mmcq\" (UniqueName: \"kubernetes.io/projected/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-kube-api-access-7mmcq\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:57:18.559913 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:18.559708 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d45a4281-5995-4b7a-aa2c-88f9d4490bbe-console-serving-cert\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:57:19.098992 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:19.098963 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69c875bd9d-zwv2w_d45a4281-5995-4b7a-aa2c-88f9d4490bbe/console/0.log" Apr 16 19:57:19.099165 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:19.099006 2569 generic.go:358] "Generic (PLEG): container finished" podID="d45a4281-5995-4b7a-aa2c-88f9d4490bbe" containerID="ba7eee8102ca05725d0fd7ae93020dea2bea120d95391623e8249a199f241c16" exitCode=2 Apr 16 19:57:19.099165 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:19.099040 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69c875bd9d-zwv2w" event={"ID":"d45a4281-5995-4b7a-aa2c-88f9d4490bbe","Type":"ContainerDied","Data":"ba7eee8102ca05725d0fd7ae93020dea2bea120d95391623e8249a199f241c16"} Apr 16 19:57:19.099165 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:19.099091 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69c875bd9d-zwv2w" event={"ID":"d45a4281-5995-4b7a-aa2c-88f9d4490bbe","Type":"ContainerDied","Data":"0f3bb8a6d9ff766331e57a8cf8d42afc80be492a2cb633517689dc5320329222"} Apr 16 19:57:19.099165 
ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:19.099111 2569 scope.go:117] "RemoveContainer" containerID="ba7eee8102ca05725d0fd7ae93020dea2bea120d95391623e8249a199f241c16" Apr 16 19:57:19.099165 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:19.099114 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69c875bd9d-zwv2w" Apr 16 19:57:19.108390 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:19.108360 2569 scope.go:117] "RemoveContainer" containerID="ba7eee8102ca05725d0fd7ae93020dea2bea120d95391623e8249a199f241c16" Apr 16 19:57:19.108802 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:57:19.108779 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba7eee8102ca05725d0fd7ae93020dea2bea120d95391623e8249a199f241c16\": container with ID starting with ba7eee8102ca05725d0fd7ae93020dea2bea120d95391623e8249a199f241c16 not found: ID does not exist" containerID="ba7eee8102ca05725d0fd7ae93020dea2bea120d95391623e8249a199f241c16" Apr 16 19:57:19.108883 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:19.108812 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba7eee8102ca05725d0fd7ae93020dea2bea120d95391623e8249a199f241c16"} err="failed to get container status \"ba7eee8102ca05725d0fd7ae93020dea2bea120d95391623e8249a199f241c16\": rpc error: code = NotFound desc = could not find container \"ba7eee8102ca05725d0fd7ae93020dea2bea120d95391623e8249a199f241c16\": container with ID starting with ba7eee8102ca05725d0fd7ae93020dea2bea120d95391623e8249a199f241c16 not found: ID does not exist" Apr 16 19:57:19.119879 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:19.119838 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69c875bd9d-zwv2w"] Apr 16 19:57:19.123679 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:19.123646 2569 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-69c875bd9d-zwv2w"] Apr 16 19:57:20.183406 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:20.183373 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d45a4281-5995-4b7a-aa2c-88f9d4490bbe" path="/var/lib/kubelet/pods/d45a4281-5995-4b7a-aa2c-88f9d4490bbe/volumes" Apr 16 19:57:50.948310 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:50.948270 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69f599566b-xl9vs"] Apr 16 19:57:50.948774 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:50.948686 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d45a4281-5995-4b7a-aa2c-88f9d4490bbe" containerName="console" Apr 16 19:57:50.948774 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:50.948701 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d45a4281-5995-4b7a-aa2c-88f9d4490bbe" containerName="console" Apr 16 19:57:50.948846 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:50.948776 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d45a4281-5995-4b7a-aa2c-88f9d4490bbe" containerName="console" Apr 16 19:57:50.950727 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:50.950701 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:50.963743 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:50.963711 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69f599566b-xl9vs"] Apr 16 19:57:51.037207 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.037159 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-config\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.037387 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.037219 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-service-ca\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.037387 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.037327 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-oauth-config\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.037387 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.037378 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-oauth-serving-cert\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 
19:57:51.037527 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.037404 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ntql\" (UniqueName: \"kubernetes.io/projected/b8954c86-0e88-4680-a5f7-71e0d4810ed6-kube-api-access-2ntql\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.037527 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.037426 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-serving-cert\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.037670 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.037555 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-trusted-ca-bundle\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.138196 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.138140 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-config\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.138196 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.138205 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-service-ca\") pod 
\"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.138479 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.138233 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-oauth-config\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.138479 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.138260 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-oauth-serving-cert\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.138479 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.138275 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ntql\" (UniqueName: \"kubernetes.io/projected/b8954c86-0e88-4680-a5f7-71e0d4810ed6-kube-api-access-2ntql\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.138479 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.138297 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-serving-cert\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.138479 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.138329 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-trusted-ca-bundle\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.139069 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.139042 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-service-ca\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.139175 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.139042 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-config\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.139175 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.139119 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-oauth-serving-cert\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.139341 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.139322 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-trusted-ca-bundle\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.140907 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.140880 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-oauth-config\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.141006 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.140974 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-serving-cert\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.147489 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.147456 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ntql\" (UniqueName: \"kubernetes.io/projected/b8954c86-0e88-4680-a5f7-71e0d4810ed6-kube-api-access-2ntql\") pod \"console-69f599566b-xl9vs\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.262896 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.262792 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:57:51.391315 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:51.391187 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69f599566b-xl9vs"] Apr 16 19:57:51.394303 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:57:51.394274 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8954c86_0e88_4680_a5f7_71e0d4810ed6.slice/crio-452dc19e3987a033062954f5f61d48f7149411410d7291e2a1bb0806df6e0eb7 WatchSource:0}: Error finding container 452dc19e3987a033062954f5f61d48f7149411410d7291e2a1bb0806df6e0eb7: Status 404 returned error can't find the container with id 452dc19e3987a033062954f5f61d48f7149411410d7291e2a1bb0806df6e0eb7 Apr 16 19:57:52.200538 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:52.200497 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69f599566b-xl9vs" event={"ID":"b8954c86-0e88-4680-a5f7-71e0d4810ed6","Type":"ContainerStarted","Data":"1b5578adc71fd18079722d3e0e9ef2322114ee3e3aa3350c7a207b0a04fb6932"} Apr 16 19:57:52.200538 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:52.200534 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69f599566b-xl9vs" event={"ID":"b8954c86-0e88-4680-a5f7-71e0d4810ed6","Type":"ContainerStarted","Data":"452dc19e3987a033062954f5f61d48f7149411410d7291e2a1bb0806df6e0eb7"} Apr 16 19:57:52.219670 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:57:52.219619 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69f599566b-xl9vs" podStartSLOduration=2.219601029 podStartE2EDuration="2.219601029s" podCreationTimestamp="2026-04-16 19:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:57:52.216949585 +0000 UTC 
m=+236.631524433" watchObservedRunningTime="2026-04-16 19:57:52.219601029 +0000 UTC m=+236.634175869" Apr 16 19:58:01.263400 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:01.263356 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:58:01.263400 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:01.263403 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:58:01.268214 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:01.268184 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:58:02.235704 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:02.235673 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69f599566b-xl9vs" Apr 16 19:58:02.286691 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:02.286651 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c6848c758-ntsrd"] Apr 16 19:58:27.308996 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.308891 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c6848c758-ntsrd" podUID="877ee0d8-4699-492b-990c-9ccaf9c8452a" containerName="console" containerID="cri-o://4883410531ecf436337808109a5c383652539a2b0a1bd50f1f5177ce154e0ab2" gracePeriod=15 Apr 16 19:58:27.549308 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.549281 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c6848c758-ntsrd_877ee0d8-4699-492b-990c-9ccaf9c8452a/console/0.log" Apr 16 19:58:27.549448 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.549347 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:58:27.568123 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.568039 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-oauth-config\") pod \"877ee0d8-4699-492b-990c-9ccaf9c8452a\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " Apr 16 19:58:27.568123 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.568098 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-config\") pod \"877ee0d8-4699-492b-990c-9ccaf9c8452a\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " Apr 16 19:58:27.568342 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.568139 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-trusted-ca-bundle\") pod \"877ee0d8-4699-492b-990c-9ccaf9c8452a\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " Apr 16 19:58:27.568342 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.568163 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-service-ca\") pod \"877ee0d8-4699-492b-990c-9ccaf9c8452a\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " Apr 16 19:58:27.568342 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.568203 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-oauth-serving-cert\") pod \"877ee0d8-4699-492b-990c-9ccaf9c8452a\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " Apr 16 19:58:27.568545 ip-10-0-139-205 
kubenswrapper[2569]: I0416 19:58:27.568475 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96q5r\" (UniqueName: \"kubernetes.io/projected/877ee0d8-4699-492b-990c-9ccaf9c8452a-kube-api-access-96q5r\") pod \"877ee0d8-4699-492b-990c-9ccaf9c8452a\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " Apr 16 19:58:27.568545 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.568522 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-serving-cert\") pod \"877ee0d8-4699-492b-990c-9ccaf9c8452a\" (UID: \"877ee0d8-4699-492b-990c-9ccaf9c8452a\") " Apr 16 19:58:27.568746 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.568614 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-config" (OuterVolumeSpecName: "console-config") pod "877ee0d8-4699-492b-990c-9ccaf9c8452a" (UID: "877ee0d8-4699-492b-990c-9ccaf9c8452a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:58:27.568746 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.568656 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "877ee0d8-4699-492b-990c-9ccaf9c8452a" (UID: "877ee0d8-4699-492b-990c-9ccaf9c8452a"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:58:27.568746 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.568692 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-service-ca" (OuterVolumeSpecName: "service-ca") pod "877ee0d8-4699-492b-990c-9ccaf9c8452a" (UID: "877ee0d8-4699-492b-990c-9ccaf9c8452a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:58:27.569088 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.569061 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "877ee0d8-4699-492b-990c-9ccaf9c8452a" (UID: "877ee0d8-4699-492b-990c-9ccaf9c8452a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:58:27.569196 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.569068 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-config\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:58:27.569196 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.569115 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-service-ca\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:58:27.569196 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.569133 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-oauth-serving-cert\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:58:27.570869 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.570674 2569 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "877ee0d8-4699-492b-990c-9ccaf9c8452a" (UID: "877ee0d8-4699-492b-990c-9ccaf9c8452a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:58:27.572634 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.572609 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877ee0d8-4699-492b-990c-9ccaf9c8452a-kube-api-access-96q5r" (OuterVolumeSpecName: "kube-api-access-96q5r") pod "877ee0d8-4699-492b-990c-9ccaf9c8452a" (UID: "877ee0d8-4699-492b-990c-9ccaf9c8452a"). InnerVolumeSpecName "kube-api-access-96q5r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:58:27.572742 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.572607 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "877ee0d8-4699-492b-990c-9ccaf9c8452a" (UID: "877ee0d8-4699-492b-990c-9ccaf9c8452a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:58:27.670496 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.670462 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-96q5r\" (UniqueName: \"kubernetes.io/projected/877ee0d8-4699-492b-990c-9ccaf9c8452a-kube-api-access-96q5r\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:58:27.670496 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.670491 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-serving-cert\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:58:27.670496 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.670501 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/877ee0d8-4699-492b-990c-9ccaf9c8452a-console-oauth-config\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:58:27.670774 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:27.670512 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877ee0d8-4699-492b-990c-9ccaf9c8452a-trusted-ca-bundle\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:58:28.312664 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:28.312635 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c6848c758-ntsrd_877ee0d8-4699-492b-990c-9ccaf9c8452a/console/0.log" Apr 16 19:58:28.313058 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:28.312678 2569 generic.go:358] "Generic (PLEG): container finished" podID="877ee0d8-4699-492b-990c-9ccaf9c8452a" containerID="4883410531ecf436337808109a5c383652539a2b0a1bd50f1f5177ce154e0ab2" exitCode=2 Apr 16 19:58:28.313058 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:28.312770 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c6848c758-ntsrd" Apr 16 19:58:28.313058 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:28.312769 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6848c758-ntsrd" event={"ID":"877ee0d8-4699-492b-990c-9ccaf9c8452a","Type":"ContainerDied","Data":"4883410531ecf436337808109a5c383652539a2b0a1bd50f1f5177ce154e0ab2"} Apr 16 19:58:28.313058 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:28.312811 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6848c758-ntsrd" event={"ID":"877ee0d8-4699-492b-990c-9ccaf9c8452a","Type":"ContainerDied","Data":"292906d3b49dbdae07403bcc74713fc83cace3f3a0a7087ca023c0a9e34f4a10"} Apr 16 19:58:28.313058 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:28.312828 2569 scope.go:117] "RemoveContainer" containerID="4883410531ecf436337808109a5c383652539a2b0a1bd50f1f5177ce154e0ab2" Apr 16 19:58:28.321359 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:28.321338 2569 scope.go:117] "RemoveContainer" containerID="4883410531ecf436337808109a5c383652539a2b0a1bd50f1f5177ce154e0ab2" Apr 16 19:58:28.321907 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:58:28.321883 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4883410531ecf436337808109a5c383652539a2b0a1bd50f1f5177ce154e0ab2\": container with ID starting with 4883410531ecf436337808109a5c383652539a2b0a1bd50f1f5177ce154e0ab2 not found: ID does not exist" containerID="4883410531ecf436337808109a5c383652539a2b0a1bd50f1f5177ce154e0ab2" Apr 16 19:58:28.321978 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:28.321919 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4883410531ecf436337808109a5c383652539a2b0a1bd50f1f5177ce154e0ab2"} err="failed to get container status \"4883410531ecf436337808109a5c383652539a2b0a1bd50f1f5177ce154e0ab2\": rpc error: code = 
NotFound desc = could not find container \"4883410531ecf436337808109a5c383652539a2b0a1bd50f1f5177ce154e0ab2\": container with ID starting with 4883410531ecf436337808109a5c383652539a2b0a1bd50f1f5177ce154e0ab2 not found: ID does not exist" Apr 16 19:58:28.327491 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:28.327460 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c6848c758-ntsrd"] Apr 16 19:58:28.331331 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:28.331305 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c6848c758-ntsrd"] Apr 16 19:58:30.182966 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:30.182930 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="877ee0d8-4699-492b-990c-9ccaf9c8452a" path="/var/lib/kubelet/pods/877ee0d8-4699-492b-990c-9ccaf9c8452a/volumes" Apr 16 19:58:56.031158 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:56.031126 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/2.log" Apr 16 19:58:56.031741 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:56.031136 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/2.log" Apr 16 19:58:56.041294 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:58:56.041268 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 19:59:07.271419 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.271388 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj"] Apr 16 19:59:07.272746 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.271784 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="877ee0d8-4699-492b-990c-9ccaf9c8452a" 
containerName="console" Apr 16 19:59:07.272746 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.271798 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="877ee0d8-4699-492b-990c-9ccaf9c8452a" containerName="console" Apr 16 19:59:07.272746 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.271863 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="877ee0d8-4699-492b-990c-9ccaf9c8452a" containerName="console" Apr 16 19:59:07.273859 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.273841 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" Apr 16 19:59:07.275935 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.275911 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 19:59:07.276071 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.275968 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 19:59:07.276297 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.276281 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-cpnvw\"" Apr 16 19:59:07.282796 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.282767 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj"] Apr 16 19:59:07.408332 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.408282 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh665\" (UniqueName: \"kubernetes.io/projected/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-kube-api-access-qh665\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj\" (UID: \"860aa8c7-5fd8-44ae-9652-0aa55da5ff12\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" Apr 16 19:59:07.408539 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.408371 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj\" (UID: \"860aa8c7-5fd8-44ae-9652-0aa55da5ff12\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" Apr 16 19:59:07.408539 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.408396 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj\" (UID: \"860aa8c7-5fd8-44ae-9652-0aa55da5ff12\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" Apr 16 19:59:07.508959 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.508919 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj\" (UID: \"860aa8c7-5fd8-44ae-9652-0aa55da5ff12\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" Apr 16 19:59:07.508959 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.508963 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj\" (UID: \"860aa8c7-5fd8-44ae-9652-0aa55da5ff12\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" Apr 16 19:59:07.509165 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.509012 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qh665\" (UniqueName: \"kubernetes.io/projected/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-kube-api-access-qh665\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj\" (UID: \"860aa8c7-5fd8-44ae-9652-0aa55da5ff12\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" Apr 16 19:59:07.509358 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.509337 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj\" (UID: \"860aa8c7-5fd8-44ae-9652-0aa55da5ff12\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" Apr 16 19:59:07.509397 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.509354 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj\" (UID: \"860aa8c7-5fd8-44ae-9652-0aa55da5ff12\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" Apr 16 19:59:07.517417 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.517385 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh665\" (UniqueName: \"kubernetes.io/projected/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-kube-api-access-qh665\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj\" (UID: \"860aa8c7-5fd8-44ae-9652-0aa55da5ff12\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" Apr 16 19:59:07.584690 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.584593 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" Apr 16 19:59:07.709525 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.709500 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj"] Apr 16 19:59:07.711803 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:59:07.711769 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod860aa8c7_5fd8_44ae_9652_0aa55da5ff12.slice/crio-146ffbbc527aadb031b7c14ffb44ac3d744b961d538d0fd343c151dd77219010 WatchSource:0}: Error finding container 146ffbbc527aadb031b7c14ffb44ac3d744b961d538d0fd343c151dd77219010: Status 404 returned error can't find the container with id 146ffbbc527aadb031b7c14ffb44ac3d744b961d538d0fd343c151dd77219010 Apr 16 19:59:07.713701 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:07.713681 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:59:08.442993 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:08.442946 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" event={"ID":"860aa8c7-5fd8-44ae-9652-0aa55da5ff12","Type":"ContainerStarted","Data":"146ffbbc527aadb031b7c14ffb44ac3d744b961d538d0fd343c151dd77219010"} Apr 16 19:59:13.461724 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:13.461687 2569 generic.go:358] "Generic (PLEG): container finished" podID="860aa8c7-5fd8-44ae-9652-0aa55da5ff12" containerID="4288406ab60bd29ff17a099d087ffe23947024ca766669d918e7d2a94e7e6ce7" exitCode=0 Apr 16 19:59:13.462114 
ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:13.461783 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" event={"ID":"860aa8c7-5fd8-44ae-9652-0aa55da5ff12","Type":"ContainerDied","Data":"4288406ab60bd29ff17a099d087ffe23947024ca766669d918e7d2a94e7e6ce7"} Apr 16 19:59:16.473314 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:16.473273 2569 generic.go:358] "Generic (PLEG): container finished" podID="860aa8c7-5fd8-44ae-9652-0aa55da5ff12" containerID="cd691c11b315fb73e3769030f814bafa1081bba9b63f3b2ff6792d386962b779" exitCode=0 Apr 16 19:59:16.473738 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:16.473361 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" event={"ID":"860aa8c7-5fd8-44ae-9652-0aa55da5ff12","Type":"ContainerDied","Data":"cd691c11b315fb73e3769030f814bafa1081bba9b63f3b2ff6792d386962b779"} Apr 16 19:59:24.500455 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:24.500412 2569 generic.go:358] "Generic (PLEG): container finished" podID="860aa8c7-5fd8-44ae-9652-0aa55da5ff12" containerID="a8af15342ce8b7ad8c19ff7c145d95cab8fad3884013866b6acf3fdcd70fec30" exitCode=0 Apr 16 19:59:24.500947 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:24.500507 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" event={"ID":"860aa8c7-5fd8-44ae-9652-0aa55da5ff12","Type":"ContainerDied","Data":"a8af15342ce8b7ad8c19ff7c145d95cab8fad3884013866b6acf3fdcd70fec30"} Apr 16 19:59:25.633749 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:25.633721 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" Apr 16 19:59:25.690663 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:25.690623 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh665\" (UniqueName: \"kubernetes.io/projected/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-kube-api-access-qh665\") pod \"860aa8c7-5fd8-44ae-9652-0aa55da5ff12\" (UID: \"860aa8c7-5fd8-44ae-9652-0aa55da5ff12\") " Apr 16 19:59:25.690836 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:25.690704 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-util\") pod \"860aa8c7-5fd8-44ae-9652-0aa55da5ff12\" (UID: \"860aa8c7-5fd8-44ae-9652-0aa55da5ff12\") " Apr 16 19:59:25.690836 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:25.690747 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-bundle\") pod \"860aa8c7-5fd8-44ae-9652-0aa55da5ff12\" (UID: \"860aa8c7-5fd8-44ae-9652-0aa55da5ff12\") " Apr 16 19:59:25.691304 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:25.691272 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-bundle" (OuterVolumeSpecName: "bundle") pod "860aa8c7-5fd8-44ae-9652-0aa55da5ff12" (UID: "860aa8c7-5fd8-44ae-9652-0aa55da5ff12"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:59:25.692982 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:25.692956 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-kube-api-access-qh665" (OuterVolumeSpecName: "kube-api-access-qh665") pod "860aa8c7-5fd8-44ae-9652-0aa55da5ff12" (UID: "860aa8c7-5fd8-44ae-9652-0aa55da5ff12"). InnerVolumeSpecName "kube-api-access-qh665". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:59:25.695967 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:25.695933 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-util" (OuterVolumeSpecName: "util") pod "860aa8c7-5fd8-44ae-9652-0aa55da5ff12" (UID: "860aa8c7-5fd8-44ae-9652-0aa55da5ff12"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:59:25.791391 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:25.791308 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-util\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:59:25.791391 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:25.791339 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-bundle\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:59:25.791391 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:25.791348 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qh665\" (UniqueName: \"kubernetes.io/projected/860aa8c7-5fd8-44ae-9652-0aa55da5ff12-kube-api-access-qh665\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 19:59:26.509148 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:26.509002 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" event={"ID":"860aa8c7-5fd8-44ae-9652-0aa55da5ff12","Type":"ContainerDied","Data":"146ffbbc527aadb031b7c14ffb44ac3d744b961d538d0fd343c151dd77219010"} Apr 16 19:59:26.509148 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:26.509042 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="146ffbbc527aadb031b7c14ffb44ac3d744b961d538d0fd343c151dd77219010" Apr 16 19:59:26.509148 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:26.509080 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cg6vgj" Apr 16 19:59:28.855449 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:28.855412 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq"] Apr 16 19:59:28.855919 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:28.855813 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="860aa8c7-5fd8-44ae-9652-0aa55da5ff12" containerName="util" Apr 16 19:59:28.855919 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:28.855826 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="860aa8c7-5fd8-44ae-9652-0aa55da5ff12" containerName="util" Apr 16 19:59:28.855919 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:28.855839 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="860aa8c7-5fd8-44ae-9652-0aa55da5ff12" containerName="extract" Apr 16 19:59:28.855919 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:28.855845 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="860aa8c7-5fd8-44ae-9652-0aa55da5ff12" containerName="extract" Apr 16 19:59:28.855919 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:28.855853 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="860aa8c7-5fd8-44ae-9652-0aa55da5ff12" containerName="pull" Apr 16 19:59:28.855919 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:28.855858 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="860aa8c7-5fd8-44ae-9652-0aa55da5ff12" containerName="pull" Apr 16 19:59:28.856103 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:28.855939 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="860aa8c7-5fd8-44ae-9652-0aa55da5ff12" containerName="extract" Apr 16 19:59:28.905404 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:28.905363 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq"] Apr 16 19:59:28.905595 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:28.905497 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq" Apr 16 19:59:28.908089 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:28.908065 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 19:59:28.908217 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:28.908113 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 19:59:28.908217 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:28.908132 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 19:59:28.908217 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:28.908143 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-mldhm\"" Apr 16 19:59:29.020724 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:29.020687 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/18fb5fdc-d533-470e-910d-8ff8a0cfbce3-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-sscpq\" (UID: \"18fb5fdc-d533-470e-910d-8ff8a0cfbce3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq" Apr 16 19:59:29.020931 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:29.020750 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsjd8\" (UniqueName: \"kubernetes.io/projected/18fb5fdc-d533-470e-910d-8ff8a0cfbce3-kube-api-access-tsjd8\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-sscpq\" (UID: \"18fb5fdc-d533-470e-910d-8ff8a0cfbce3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq" Apr 16 19:59:29.121768 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:29.121688 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/18fb5fdc-d533-470e-910d-8ff8a0cfbce3-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-sscpq\" (UID: \"18fb5fdc-d533-470e-910d-8ff8a0cfbce3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq" Apr 16 19:59:29.121892 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:29.121765 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsjd8\" (UniqueName: \"kubernetes.io/projected/18fb5fdc-d533-470e-910d-8ff8a0cfbce3-kube-api-access-tsjd8\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-sscpq\" (UID: \"18fb5fdc-d533-470e-910d-8ff8a0cfbce3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq" Apr 16 19:59:29.124137 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:29.124116 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/18fb5fdc-d533-470e-910d-8ff8a0cfbce3-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-sscpq\" (UID: 
\"18fb5fdc-d533-470e-910d-8ff8a0cfbce3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq" Apr 16 19:59:29.129887 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:29.129859 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsjd8\" (UniqueName: \"kubernetes.io/projected/18fb5fdc-d533-470e-910d-8ff8a0cfbce3-kube-api-access-tsjd8\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-sscpq\" (UID: \"18fb5fdc-d533-470e-910d-8ff8a0cfbce3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq" Apr 16 19:59:29.216399 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:29.216349 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq" Apr 16 19:59:29.353068 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:29.353029 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq"] Apr 16 19:59:29.356436 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:59:29.356403 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18fb5fdc_d533_470e_910d_8ff8a0cfbce3.slice/crio-1b3f17f1077bb084cd7fa2da419efed0f298ae8e5564028b2a983525349aa6d8 WatchSource:0}: Error finding container 1b3f17f1077bb084cd7fa2da419efed0f298ae8e5564028b2a983525349aa6d8: Status 404 returned error can't find the container with id 1b3f17f1077bb084cd7fa2da419efed0f298ae8e5564028b2a983525349aa6d8 Apr 16 19:59:29.519881 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:29.519845 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq" event={"ID":"18fb5fdc-d533-470e-910d-8ff8a0cfbce3","Type":"ContainerStarted","Data":"1b3f17f1077bb084cd7fa2da419efed0f298ae8e5564028b2a983525349aa6d8"} Apr 16 19:59:33.511794 ip-10-0-139-205 kubenswrapper[2569]: 
I0416 19:59:33.511753 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v"] Apr 16 19:59:33.515191 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.515171 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:33.517615 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.517566 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 19:59:33.517731 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.517566 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-5b9qt\"" Apr 16 19:59:33.517999 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.517982 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 19:59:33.526456 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.526429 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v"] Apr 16 19:59:33.538140 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.538084 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq" event={"ID":"18fb5fdc-d533-470e-910d-8ff8a0cfbce3","Type":"ContainerStarted","Data":"7fa5ca1a2e9685075716a166cd30c12f0f8d782c86f685d5966bc05f3f205c45"} Apr 16 19:59:33.538439 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.538401 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq" Apr 16 19:59:33.563811 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.563754 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq" podStartSLOduration=2.35230123 podStartE2EDuration="5.563735512s" podCreationTimestamp="2026-04-16 19:59:28 +0000 UTC" firstStartedPulling="2026-04-16 19:59:29.358236372 +0000 UTC m=+333.772811197" lastFinishedPulling="2026-04-16 19:59:32.569670653 +0000 UTC m=+336.984245479" observedRunningTime="2026-04-16 19:59:33.563219768 +0000 UTC m=+337.977794617" watchObservedRunningTime="2026-04-16 19:59:33.563735512 +0000 UTC m=+337.978310360" Apr 16 19:59:33.666610 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.666542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-h6t7v\" (UID: \"de31d68a-b28b-4712-a0e2-fd2146b9d13e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:33.666823 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.666762 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/de31d68a-b28b-4712-a0e2-fd2146b9d13e-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-h6t7v\" (UID: \"de31d68a-b28b-4712-a0e2-fd2146b9d13e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:33.666823 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.666818 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdwmt\" (UniqueName: \"kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-kube-api-access-xdwmt\") pod \"keda-metrics-apiserver-7c9f485588-h6t7v\" (UID: \"de31d68a-b28b-4712-a0e2-fd2146b9d13e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:33.768420 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.768322 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xdwmt\" (UniqueName: \"kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-kube-api-access-xdwmt\") pod \"keda-metrics-apiserver-7c9f485588-h6t7v\" (UID: \"de31d68a-b28b-4712-a0e2-fd2146b9d13e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:33.768420 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.768398 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-h6t7v\" (UID: \"de31d68a-b28b-4712-a0e2-fd2146b9d13e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:33.768625 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.768509 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/de31d68a-b28b-4712-a0e2-fd2146b9d13e-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-h6t7v\" (UID: \"de31d68a-b28b-4712-a0e2-fd2146b9d13e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:33.768625 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:59:33.768535 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 19:59:33.768625 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:59:33.768556 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 19:59:33.768625 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:59:33.768592 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v: references non-existent secret key: tls.crt Apr 16 19:59:33.768759 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:59:33.768656 2569 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-certificates podName:de31d68a-b28b-4712-a0e2-fd2146b9d13e nodeName:}" failed. No retries permitted until 2026-04-16 19:59:34.26864032 +0000 UTC m=+338.683215147 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-certificates") pod "keda-metrics-apiserver-7c9f485588-h6t7v" (UID: "de31d68a-b28b-4712-a0e2-fd2146b9d13e") : references non-existent secret key: tls.crt Apr 16 19:59:33.769018 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.768997 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/de31d68a-b28b-4712-a0e2-fd2146b9d13e-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-h6t7v\" (UID: \"de31d68a-b28b-4712-a0e2-fd2146b9d13e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:33.777561 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.777533 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdwmt\" (UniqueName: \"kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-kube-api-access-xdwmt\") pod \"keda-metrics-apiserver-7c9f485588-h6t7v\" (UID: \"de31d68a-b28b-4712-a0e2-fd2146b9d13e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:33.908900 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.908863 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-bc4gt"] Apr 16 19:59:33.913054 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.913033 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-bc4gt" Apr 16 19:59:33.914851 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.914823 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 19:59:33.919293 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.919269 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-bc4gt"] Apr 16 19:59:33.971092 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.971051 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3fda21da-5ff0-4d79-a08e-7c3b055799f8-certificates\") pod \"keda-admission-cf49989db-bc4gt\" (UID: \"3fda21da-5ff0-4d79-a08e-7c3b055799f8\") " pod="openshift-keda/keda-admission-cf49989db-bc4gt" Apr 16 19:59:33.971270 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:33.971100 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8r9l\" (UniqueName: \"kubernetes.io/projected/3fda21da-5ff0-4d79-a08e-7c3b055799f8-kube-api-access-j8r9l\") pod \"keda-admission-cf49989db-bc4gt\" (UID: \"3fda21da-5ff0-4d79-a08e-7c3b055799f8\") " pod="openshift-keda/keda-admission-cf49989db-bc4gt" Apr 16 19:59:34.072635 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:34.072533 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3fda21da-5ff0-4d79-a08e-7c3b055799f8-certificates\") pod \"keda-admission-cf49989db-bc4gt\" (UID: \"3fda21da-5ff0-4d79-a08e-7c3b055799f8\") " pod="openshift-keda/keda-admission-cf49989db-bc4gt" Apr 16 19:59:34.072635 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:34.072609 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8r9l\" (UniqueName: 
\"kubernetes.io/projected/3fda21da-5ff0-4d79-a08e-7c3b055799f8-kube-api-access-j8r9l\") pod \"keda-admission-cf49989db-bc4gt\" (UID: \"3fda21da-5ff0-4d79-a08e-7c3b055799f8\") " pod="openshift-keda/keda-admission-cf49989db-bc4gt" Apr 16 19:59:34.075184 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:34.075146 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3fda21da-5ff0-4d79-a08e-7c3b055799f8-certificates\") pod \"keda-admission-cf49989db-bc4gt\" (UID: \"3fda21da-5ff0-4d79-a08e-7c3b055799f8\") " pod="openshift-keda/keda-admission-cf49989db-bc4gt" Apr 16 19:59:34.080015 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:34.079987 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8r9l\" (UniqueName: \"kubernetes.io/projected/3fda21da-5ff0-4d79-a08e-7c3b055799f8-kube-api-access-j8r9l\") pod \"keda-admission-cf49989db-bc4gt\" (UID: \"3fda21da-5ff0-4d79-a08e-7c3b055799f8\") " pod="openshift-keda/keda-admission-cf49989db-bc4gt" Apr 16 19:59:34.226663 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:34.226620 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-bc4gt" Apr 16 19:59:34.275280 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:34.275237 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-h6t7v\" (UID: \"de31d68a-b28b-4712-a0e2-fd2146b9d13e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:34.275452 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:59:34.275429 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 19:59:34.275500 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:59:34.275457 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 19:59:34.275500 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:59:34.275482 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v: references non-existent secret key: tls.crt Apr 16 19:59:34.275633 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:59:34.275619 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-certificates podName:de31d68a-b28b-4712-a0e2-fd2146b9d13e nodeName:}" failed. No retries permitted until 2026-04-16 19:59:35.275556593 +0000 UTC m=+339.690131438 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-certificates") pod "keda-metrics-apiserver-7c9f485588-h6t7v" (UID: "de31d68a-b28b-4712-a0e2-fd2146b9d13e") : references non-existent secret key: tls.crt Apr 16 19:59:34.349854 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:34.349770 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-bc4gt"] Apr 16 19:59:34.353419 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:59:34.353381 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fda21da_5ff0_4d79_a08e_7c3b055799f8.slice/crio-0e3b2da71664d73a3c1247abb14e0dcd6d872f3a7628a8a553d335bb2fb4a48b WatchSource:0}: Error finding container 0e3b2da71664d73a3c1247abb14e0dcd6d872f3a7628a8a553d335bb2fb4a48b: Status 404 returned error can't find the container with id 0e3b2da71664d73a3c1247abb14e0dcd6d872f3a7628a8a553d335bb2fb4a48b Apr 16 19:59:34.542637 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:34.542600 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-bc4gt" event={"ID":"3fda21da-5ff0-4d79-a08e-7c3b055799f8","Type":"ContainerStarted","Data":"0e3b2da71664d73a3c1247abb14e0dcd6d872f3a7628a8a553d335bb2fb4a48b"} Apr 16 19:59:35.287443 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:35.287395 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-h6t7v\" (UID: \"de31d68a-b28b-4712-a0e2-fd2146b9d13e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:35.287685 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:59:35.287524 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 19:59:35.287685 ip-10-0-139-205 
kubenswrapper[2569]: E0416 19:59:35.287549 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 19:59:35.287685 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:59:35.287596 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v: references non-existent secret key: tls.crt Apr 16 19:59:35.287685 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:59:35.287659 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-certificates podName:de31d68a-b28b-4712-a0e2-fd2146b9d13e nodeName:}" failed. No retries permitted until 2026-04-16 19:59:37.287639433 +0000 UTC m=+341.702214273 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-certificates") pod "keda-metrics-apiserver-7c9f485588-h6t7v" (UID: "de31d68a-b28b-4712-a0e2-fd2146b9d13e") : references non-existent secret key: tls.crt Apr 16 19:59:36.551492 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:36.551457 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-bc4gt" event={"ID":"3fda21da-5ff0-4d79-a08e-7c3b055799f8","Type":"ContainerStarted","Data":"4ab41cffb27489bc50652a02b0ab0761f6f56b5538d95349c73fb126d4f5ac20"} Apr 16 19:59:36.551905 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:36.551550 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-bc4gt" Apr 16 19:59:36.568976 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:36.568922 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-bc4gt" podStartSLOduration=2.061375526 podStartE2EDuration="3.568908349s" podCreationTimestamp="2026-04-16 19:59:33 +0000 UTC" 
firstStartedPulling="2026-04-16 19:59:34.354768149 +0000 UTC m=+338.769342975" lastFinishedPulling="2026-04-16 19:59:35.862300968 +0000 UTC m=+340.276875798" observedRunningTime="2026-04-16 19:59:36.566669156 +0000 UTC m=+340.981244003" watchObservedRunningTime="2026-04-16 19:59:36.568908349 +0000 UTC m=+340.983483196" Apr 16 19:59:37.306953 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:37.306904 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-h6t7v\" (UID: \"de31d68a-b28b-4712-a0e2-fd2146b9d13e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:37.307161 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:59:37.307049 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 19:59:37.307161 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:59:37.307071 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 19:59:37.307161 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:59:37.307090 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v: references non-existent secret key: tls.crt Apr 16 19:59:37.307161 ip-10-0-139-205 kubenswrapper[2569]: E0416 19:59:37.307145 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-certificates podName:de31d68a-b28b-4712-a0e2-fd2146b9d13e nodeName:}" failed. No retries permitted until 2026-04-16 19:59:41.307130701 +0000 UTC m=+345.721705527 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-certificates") pod "keda-metrics-apiserver-7c9f485588-h6t7v" (UID: "de31d68a-b28b-4712-a0e2-fd2146b9d13e") : references non-existent secret key: tls.crt Apr 16 19:59:41.352875 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:41.352832 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-h6t7v\" (UID: \"de31d68a-b28b-4712-a0e2-fd2146b9d13e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:41.355359 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:41.355335 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/de31d68a-b28b-4712-a0e2-fd2146b9d13e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-h6t7v\" (UID: \"de31d68a-b28b-4712-a0e2-fd2146b9d13e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:41.626862 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:41.626759 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:41.751753 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:41.751717 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v"] Apr 16 19:59:41.754226 ip-10-0-139-205 kubenswrapper[2569]: W0416 19:59:41.754198 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde31d68a_b28b_4712_a0e2_fd2146b9d13e.slice/crio-2f18112d588ae4df29c8679ff91e1a040fe568e1b4009b9d6f6a0b1899c44868 WatchSource:0}: Error finding container 2f18112d588ae4df29c8679ff91e1a040fe568e1b4009b9d6f6a0b1899c44868: Status 404 returned error can't find the container with id 2f18112d588ae4df29c8679ff91e1a040fe568e1b4009b9d6f6a0b1899c44868 Apr 16 19:59:42.572972 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:42.572930 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" event={"ID":"de31d68a-b28b-4712-a0e2-fd2146b9d13e","Type":"ContainerStarted","Data":"2f18112d588ae4df29c8679ff91e1a040fe568e1b4009b9d6f6a0b1899c44868"} Apr 16 19:59:44.581259 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:44.581224 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" event={"ID":"de31d68a-b28b-4712-a0e2-fd2146b9d13e","Type":"ContainerStarted","Data":"85a2e7f2db0b78f43c86eafa5f70bcd2b38d009ac67b72f6d800392d11d9f70c"} Apr 16 19:59:44.581677 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:44.581298 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:44.598251 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:44.598159 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" 
podStartSLOduration=9.296984788 podStartE2EDuration="11.598143709s" podCreationTimestamp="2026-04-16 19:59:33 +0000 UTC" firstStartedPulling="2026-04-16 19:59:41.755970343 +0000 UTC m=+346.170545172" lastFinishedPulling="2026-04-16 19:59:44.057128766 +0000 UTC m=+348.471704093" observedRunningTime="2026-04-16 19:59:44.596422781 +0000 UTC m=+349.010997628" watchObservedRunningTime="2026-04-16 19:59:44.598143709 +0000 UTC m=+349.012718551" Apr 16 19:59:54.545486 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:54.545456 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-sscpq" Apr 16 19:59:55.589002 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:55.588970 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-h6t7v" Apr 16 19:59:57.557130 ip-10-0-139-205 kubenswrapper[2569]: I0416 19:59:57.557093 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-bc4gt" Apr 16 20:00:42.053409 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:42.053366 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q"] Apr 16 20:00:42.057994 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:42.057967 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q" Apr 16 20:00:42.064817 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:42.064786 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 20:00:42.064817 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:42.064782 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 20:00:42.066363 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:42.066336 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 20:00:42.066489 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:42.066395 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-p26pm\"" Apr 16 20:00:42.072169 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:42.072142 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q"] Apr 16 20:00:42.181811 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:42.181779 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2j49\" (UniqueName: \"kubernetes.io/projected/72c21fef-3270-41f9-988e-35b6ea77cbc0-kube-api-access-d2j49\") pod \"llmisvc-controller-manager-68cc5db7c4-vgm8q\" (UID: \"72c21fef-3270-41f9-988e-35b6ea77cbc0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q" Apr 16 20:00:42.182005 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:42.181939 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72c21fef-3270-41f9-988e-35b6ea77cbc0-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vgm8q\" (UID: \"72c21fef-3270-41f9-988e-35b6ea77cbc0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q" 
Apr 16 20:00:42.283396 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:42.283351 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72c21fef-3270-41f9-988e-35b6ea77cbc0-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vgm8q\" (UID: \"72c21fef-3270-41f9-988e-35b6ea77cbc0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q" Apr 16 20:00:42.283648 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:42.283423 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2j49\" (UniqueName: \"kubernetes.io/projected/72c21fef-3270-41f9-988e-35b6ea77cbc0-kube-api-access-d2j49\") pod \"llmisvc-controller-manager-68cc5db7c4-vgm8q\" (UID: \"72c21fef-3270-41f9-988e-35b6ea77cbc0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q" Apr 16 20:00:42.283648 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:00:42.283540 2569 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 16 20:00:42.283777 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:00:42.283657 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72c21fef-3270-41f9-988e-35b6ea77cbc0-cert podName:72c21fef-3270-41f9-988e-35b6ea77cbc0 nodeName:}" failed. No retries permitted until 2026-04-16 20:00:42.783637039 +0000 UTC m=+407.198211878 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/72c21fef-3270-41f9-988e-35b6ea77cbc0-cert") pod "llmisvc-controller-manager-68cc5db7c4-vgm8q" (UID: "72c21fef-3270-41f9-988e-35b6ea77cbc0") : secret "llmisvc-webhook-server-cert" not found Apr 16 20:00:42.297218 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:42.297187 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2j49\" (UniqueName: \"kubernetes.io/projected/72c21fef-3270-41f9-988e-35b6ea77cbc0-kube-api-access-d2j49\") pod \"llmisvc-controller-manager-68cc5db7c4-vgm8q\" (UID: \"72c21fef-3270-41f9-988e-35b6ea77cbc0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q" Apr 16 20:00:42.788261 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:42.788220 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72c21fef-3270-41f9-988e-35b6ea77cbc0-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vgm8q\" (UID: \"72c21fef-3270-41f9-988e-35b6ea77cbc0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q" Apr 16 20:00:42.790792 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:42.790753 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72c21fef-3270-41f9-988e-35b6ea77cbc0-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vgm8q\" (UID: \"72c21fef-3270-41f9-988e-35b6ea77cbc0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q" Apr 16 20:00:42.968603 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:42.968546 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q" Apr 16 20:00:43.096210 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:43.096176 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q"] Apr 16 20:00:43.099726 ip-10-0-139-205 kubenswrapper[2569]: W0416 20:00:43.099686 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod72c21fef_3270_41f9_988e_35b6ea77cbc0.slice/crio-8b6d6a6f2ebefdb7579e0047759618232bafd0e4d7cc86cc54c1427babb93718 WatchSource:0}: Error finding container 8b6d6a6f2ebefdb7579e0047759618232bafd0e4d7cc86cc54c1427babb93718: Status 404 returned error can't find the container with id 8b6d6a6f2ebefdb7579e0047759618232bafd0e4d7cc86cc54c1427babb93718 Apr 16 20:00:43.794539 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:43.794495 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q" event={"ID":"72c21fef-3270-41f9-988e-35b6ea77cbc0","Type":"ContainerStarted","Data":"8b6d6a6f2ebefdb7579e0047759618232bafd0e4d7cc86cc54c1427babb93718"} Apr 16 20:00:48.813981 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:48.813946 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q" event={"ID":"72c21fef-3270-41f9-988e-35b6ea77cbc0","Type":"ContainerStarted","Data":"1b51036519489f0dc2550db4246d59402485c5d0d2faedf1ad89db777a2cd260"} Apr 16 20:00:48.814392 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:48.814063 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q" Apr 16 20:00:48.831592 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:00:48.831533 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q" podStartSLOduration=1.736513038 podStartE2EDuration="6.831517151s" 
podCreationTimestamp="2026-04-16 20:00:42 +0000 UTC" firstStartedPulling="2026-04-16 20:00:43.100900857 +0000 UTC m=+407.515475684" lastFinishedPulling="2026-04-16 20:00:48.195904959 +0000 UTC m=+412.610479797" observedRunningTime="2026-04-16 20:00:48.830186894 +0000 UTC m=+413.244761742" watchObservedRunningTime="2026-04-16 20:00:48.831517151 +0000 UTC m=+413.246092032" Apr 16 20:01:19.819380 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:19.819290 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgm8q" Apr 16 20:01:47.815768 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.815730 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b7f7d9c7d-d4jp2"] Apr 16 20:01:47.818283 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.818263 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.835552 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.835519 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b7f7d9c7d-d4jp2"] Apr 16 20:01:47.851809 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.851773 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-console-serving-cert\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.851809 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.851810 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hvv5\" (UniqueName: \"kubernetes.io/projected/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-kube-api-access-8hvv5\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " 
pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.852006 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.851839 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-console-config\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.852006 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.851896 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-console-oauth-config\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.852077 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.852015 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-service-ca\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.852077 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.852041 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-trusted-ca-bundle\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.852077 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.852063 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-oauth-serving-cert\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.952973 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.952931 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-console-oauth-config\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.953149 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.952994 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-service-ca\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.953149 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.953011 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-trusted-ca-bundle\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.953149 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.953030 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-oauth-serving-cert\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.953149 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.953084 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-console-serving-cert\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.953149 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.953109 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hvv5\" (UniqueName: \"kubernetes.io/projected/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-kube-api-access-8hvv5\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.953388 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.953153 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-console-config\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.953909 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.953880 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-console-config\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.953909 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.953902 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-oauth-serving-cert\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.954102 ip-10-0-139-205 
kubenswrapper[2569]: I0416 20:01:47.954039 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-service-ca\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.954102 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.954083 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-trusted-ca-bundle\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.955656 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.955634 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-console-oauth-config\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.955733 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.955717 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-console-serving-cert\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:47.961067 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:47.961044 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hvv5\" (UniqueName: \"kubernetes.io/projected/0147df20-106b-4a7a-a8ad-ea3ce5d89feb-kube-api-access-8hvv5\") pod \"console-6b7f7d9c7d-d4jp2\" (UID: \"0147df20-106b-4a7a-a8ad-ea3ce5d89feb\") " pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 
20:01:48.128345 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:48.128252 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:48.260168 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:48.260141 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b7f7d9c7d-d4jp2"] Apr 16 20:01:48.262646 ip-10-0-139-205 kubenswrapper[2569]: W0416 20:01:48.262615 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0147df20_106b_4a7a_a8ad_ea3ce5d89feb.slice/crio-5ef7f607ae7deb5a2fba56a775fa0249f9ce08f7e2102b74e3fb8bbf184dbfdb WatchSource:0}: Error finding container 5ef7f607ae7deb5a2fba56a775fa0249f9ce08f7e2102b74e3fb8bbf184dbfdb: Status 404 returned error can't find the container with id 5ef7f607ae7deb5a2fba56a775fa0249f9ce08f7e2102b74e3fb8bbf184dbfdb Apr 16 20:01:49.026850 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:49.026812 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b7f7d9c7d-d4jp2" event={"ID":"0147df20-106b-4a7a-a8ad-ea3ce5d89feb","Type":"ContainerStarted","Data":"e88532a9e1fce18ed19d4cf62829b2cf8db75a7c92afe5b18c9c0f4e819f68a7"} Apr 16 20:01:49.026850 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:49.026854 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b7f7d9c7d-d4jp2" event={"ID":"0147df20-106b-4a7a-a8ad-ea3ce5d89feb","Type":"ContainerStarted","Data":"5ef7f607ae7deb5a2fba56a775fa0249f9ce08f7e2102b74e3fb8bbf184dbfdb"} Apr 16 20:01:49.045172 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:49.045122 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b7f7d9c7d-d4jp2" podStartSLOduration=2.045107708 podStartE2EDuration="2.045107708s" podCreationTimestamp="2026-04-16 20:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:01:49.043831346 +0000 UTC m=+473.458406193" watchObservedRunningTime="2026-04-16 20:01:49.045107708 +0000 UTC m=+473.459682557" Apr 16 20:01:58.128747 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:58.128695 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:58.128747 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:58.128758 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:58.133615 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:58.133561 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:59.067100 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:59.067067 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b7f7d9c7d-d4jp2" Apr 16 20:01:59.125515 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:01:59.125482 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69f599566b-xl9vs"] Apr 16 20:02:24.152617 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.152560 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-69f599566b-xl9vs" podUID="b8954c86-0e88-4680-a5f7-71e0d4810ed6" containerName="console" containerID="cri-o://1b5578adc71fd18079722d3e0e9ef2322114ee3e3aa3350c7a207b0a04fb6932" gracePeriod=15 Apr 16 20:02:24.385208 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.385182 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69f599566b-xl9vs_b8954c86-0e88-4680-a5f7-71e0d4810ed6/console/0.log" Apr 16 20:02:24.385366 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.385246 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69f599566b-xl9vs" Apr 16 20:02:24.476049 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.476010 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-trusted-ca-bundle\") pod \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " Apr 16 20:02:24.476234 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.476105 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-config\") pod \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " Apr 16 20:02:24.476234 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.476158 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ntql\" (UniqueName: \"kubernetes.io/projected/b8954c86-0e88-4680-a5f7-71e0d4810ed6-kube-api-access-2ntql\") pod \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " Apr 16 20:02:24.476234 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.476187 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-oauth-config\") pod \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " Apr 16 20:02:24.476234 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.476216 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-service-ca\") pod \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " Apr 16 20:02:24.476442 ip-10-0-139-205 
kubenswrapper[2569]: I0416 20:02:24.476271 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-serving-cert\") pod \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " Apr 16 20:02:24.476442 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.476333 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-oauth-serving-cert\") pod \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\" (UID: \"b8954c86-0e88-4680-a5f7-71e0d4810ed6\") " Apr 16 20:02:24.476670 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.476636 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-config" (OuterVolumeSpecName: "console-config") pod "b8954c86-0e88-4680-a5f7-71e0d4810ed6" (UID: "b8954c86-0e88-4680-a5f7-71e0d4810ed6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:02:24.476795 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.476643 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b8954c86-0e88-4680-a5f7-71e0d4810ed6" (UID: "b8954c86-0e88-4680-a5f7-71e0d4810ed6"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:02:24.476863 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.476797 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-service-ca" (OuterVolumeSpecName: "service-ca") pod "b8954c86-0e88-4680-a5f7-71e0d4810ed6" (UID: "b8954c86-0e88-4680-a5f7-71e0d4810ed6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:02:24.476955 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.476930 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b8954c86-0e88-4680-a5f7-71e0d4810ed6" (UID: "b8954c86-0e88-4680-a5f7-71e0d4810ed6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:02:24.478572 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.478546 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b8954c86-0e88-4680-a5f7-71e0d4810ed6" (UID: "b8954c86-0e88-4680-a5f7-71e0d4810ed6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:02:24.478667 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.478590 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b8954c86-0e88-4680-a5f7-71e0d4810ed6" (UID: "b8954c86-0e88-4680-a5f7-71e0d4810ed6"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:02:24.478667 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.478647 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8954c86-0e88-4680-a5f7-71e0d4810ed6-kube-api-access-2ntql" (OuterVolumeSpecName: "kube-api-access-2ntql") pod "b8954c86-0e88-4680-a5f7-71e0d4810ed6" (UID: "b8954c86-0e88-4680-a5f7-71e0d4810ed6"). InnerVolumeSpecName "kube-api-access-2ntql". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:02:24.577944 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.577901 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-config\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:02:24.577944 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.577938 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2ntql\" (UniqueName: \"kubernetes.io/projected/b8954c86-0e88-4680-a5f7-71e0d4810ed6-kube-api-access-2ntql\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:02:24.577944 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.577950 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-oauth-config\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:02:24.578191 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.577959 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-service-ca\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:02:24.578191 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.577967 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b8954c86-0e88-4680-a5f7-71e0d4810ed6-console-serving-cert\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:02:24.578191 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.577976 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-oauth-serving-cert\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:02:24.578191 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:24.577985 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8954c86-0e88-4680-a5f7-71e0d4810ed6-trusted-ca-bundle\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:02:25.154250 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:25.154219 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69f599566b-xl9vs_b8954c86-0e88-4680-a5f7-71e0d4810ed6/console/0.log" Apr 16 20:02:25.154770 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:25.154262 2569 generic.go:358] "Generic (PLEG): container finished" podID="b8954c86-0e88-4680-a5f7-71e0d4810ed6" containerID="1b5578adc71fd18079722d3e0e9ef2322114ee3e3aa3350c7a207b0a04fb6932" exitCode=2 Apr 16 20:02:25.154770 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:25.154294 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69f599566b-xl9vs" event={"ID":"b8954c86-0e88-4680-a5f7-71e0d4810ed6","Type":"ContainerDied","Data":"1b5578adc71fd18079722d3e0e9ef2322114ee3e3aa3350c7a207b0a04fb6932"} Apr 16 20:02:25.154770 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:25.154344 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69f599566b-xl9vs" event={"ID":"b8954c86-0e88-4680-a5f7-71e0d4810ed6","Type":"ContainerDied","Data":"452dc19e3987a033062954f5f61d48f7149411410d7291e2a1bb0806df6e0eb7"} Apr 16 20:02:25.154770 
ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:25.154360 2569 scope.go:117] "RemoveContainer" containerID="1b5578adc71fd18079722d3e0e9ef2322114ee3e3aa3350c7a207b0a04fb6932" Apr 16 20:02:25.154770 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:25.154380 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69f599566b-xl9vs" Apr 16 20:02:25.163968 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:25.163946 2569 scope.go:117] "RemoveContainer" containerID="1b5578adc71fd18079722d3e0e9ef2322114ee3e3aa3350c7a207b0a04fb6932" Apr 16 20:02:25.164262 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:02:25.164241 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b5578adc71fd18079722d3e0e9ef2322114ee3e3aa3350c7a207b0a04fb6932\": container with ID starting with 1b5578adc71fd18079722d3e0e9ef2322114ee3e3aa3350c7a207b0a04fb6932 not found: ID does not exist" containerID="1b5578adc71fd18079722d3e0e9ef2322114ee3e3aa3350c7a207b0a04fb6932" Apr 16 20:02:25.164312 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:25.164271 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b5578adc71fd18079722d3e0e9ef2322114ee3e3aa3350c7a207b0a04fb6932"} err="failed to get container status \"1b5578adc71fd18079722d3e0e9ef2322114ee3e3aa3350c7a207b0a04fb6932\": rpc error: code = NotFound desc = could not find container \"1b5578adc71fd18079722d3e0e9ef2322114ee3e3aa3350c7a207b0a04fb6932\": container with ID starting with 1b5578adc71fd18079722d3e0e9ef2322114ee3e3aa3350c7a207b0a04fb6932 not found: ID does not exist" Apr 16 20:02:25.177004 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:25.176972 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69f599566b-xl9vs"] Apr 16 20:02:25.181257 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:25.181229 2569 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-69f599566b-xl9vs"] Apr 16 20:02:26.184565 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:26.184528 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8954c86-0e88-4680-a5f7-71e0d4810ed6" path="/var/lib/kubelet/pods/b8954c86-0e88-4680-a5f7-71e0d4810ed6/volumes" Apr 16 20:02:30.398488 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:30.398453 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4"] Apr 16 20:02:30.398976 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:30.398891 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8954c86-0e88-4680-a5f7-71e0d4810ed6" containerName="console" Apr 16 20:02:30.398976 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:30.398904 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8954c86-0e88-4680-a5f7-71e0d4810ed6" containerName="console" Apr 16 20:02:30.399053 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:30.398986 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8954c86-0e88-4680-a5f7-71e0d4810ed6" containerName="console" Apr 16 20:02:30.402113 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:30.402089 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" Apr 16 20:02:30.403813 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:30.403787 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-g6xwp\"" Apr 16 20:02:30.415474 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:30.415444 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4"] Apr 16 20:02:30.533086 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:30.533048 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/28fd4b7f-3939-4849-9dc5-7f5bf2e6176d-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4\" (UID: \"28fd4b7f-3939-4849-9dc5-7f5bf2e6176d\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" Apr 16 20:02:30.634428 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:30.634385 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/28fd4b7f-3939-4849-9dc5-7f5bf2e6176d-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4\" (UID: \"28fd4b7f-3939-4849-9dc5-7f5bf2e6176d\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" Apr 16 20:02:30.634851 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:30.634827 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/28fd4b7f-3939-4849-9dc5-7f5bf2e6176d-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4\" (UID: \"28fd4b7f-3939-4849-9dc5-7f5bf2e6176d\") " 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" Apr 16 20:02:30.712494 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:30.712442 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" Apr 16 20:02:30.843197 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:30.843048 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4"] Apr 16 20:02:30.846115 ip-10-0-139-205 kubenswrapper[2569]: W0416 20:02:30.846084 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28fd4b7f_3939_4849_9dc5_7f5bf2e6176d.slice/crio-1a82ae9fa616e92b77e959ad631155e76aa60b7f5435708354629415d04a6d2b WatchSource:0}: Error finding container 1a82ae9fa616e92b77e959ad631155e76aa60b7f5435708354629415d04a6d2b: Status 404 returned error can't find the container with id 1a82ae9fa616e92b77e959ad631155e76aa60b7f5435708354629415d04a6d2b Apr 16 20:02:31.178744 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:31.178705 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" event={"ID":"28fd4b7f-3939-4849-9dc5-7f5bf2e6176d","Type":"ContainerStarted","Data":"1a82ae9fa616e92b77e959ad631155e76aa60b7f5435708354629415d04a6d2b"} Apr 16 20:02:35.194741 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:35.194703 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" event={"ID":"28fd4b7f-3939-4849-9dc5-7f5bf2e6176d","Type":"ContainerStarted","Data":"edf55ef864ae3adfab2b14b3615c4bf74383dffe357f6f4c4b6fae34761669b9"} Apr 16 20:02:39.210486 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:39.210449 2569 generic.go:358] "Generic (PLEG): container finished" 
podID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerID="edf55ef864ae3adfab2b14b3615c4bf74383dffe357f6f4c4b6fae34761669b9" exitCode=0 Apr 16 20:02:39.210889 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:39.210505 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" event={"ID":"28fd4b7f-3939-4849-9dc5-7f5bf2e6176d","Type":"ContainerDied","Data":"edf55ef864ae3adfab2b14b3615c4bf74383dffe357f6f4c4b6fae34761669b9"} Apr 16 20:02:52.266087 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:52.266045 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" event={"ID":"28fd4b7f-3939-4849-9dc5-7f5bf2e6176d","Type":"ContainerStarted","Data":"456138bf2558f93fb2620773ce2982a422bcffade3cdef426a1f5d8aaa4269f7"} Apr 16 20:02:54.278623 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:54.278554 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" event={"ID":"28fd4b7f-3939-4849-9dc5-7f5bf2e6176d","Type":"ContainerStarted","Data":"fb93a359797586a43a423e781f8221cc45eb008397e36099cae71d2dc0c13095"} Apr 16 20:02:54.279049 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:54.278807 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" Apr 16 20:02:54.280287 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:54.280254 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:02:54.295100 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:54.295039 2569 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podStartSLOduration=1.5412611200000002 podStartE2EDuration="24.29502259s" podCreationTimestamp="2026-04-16 20:02:30 +0000 UTC" firstStartedPulling="2026-04-16 20:02:30.847960105 +0000 UTC m=+515.262534945" lastFinishedPulling="2026-04-16 20:02:53.601721588 +0000 UTC m=+538.016296415" observedRunningTime="2026-04-16 20:02:54.292779399 +0000 UTC m=+538.707354248" watchObservedRunningTime="2026-04-16 20:02:54.29502259 +0000 UTC m=+538.709597438" Apr 16 20:02:55.282458 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:55.282426 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" Apr 16 20:02:55.282917 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:55.282661 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:02:55.283600 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:55.283547 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:02:56.286267 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:02:56.286218 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:02:56.287180 ip-10-0-139-205 kubenswrapper[2569]: I0416 
20:02:56.287153 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:06.286241 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:03:06.286183 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:03:06.286800 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:03:06.286680 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:16.286358 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:03:16.286315 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:03:16.286839 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:03:16.286818 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:26.286306 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:03:26.286261 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:03:26.286770 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:03:26.286714 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:36.286195 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:03:36.286145 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:03:36.286728 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:03:36.286701 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:46.287127 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:03:46.287075 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:03:46.287709 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:03:46.287556 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" 
podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:56.062000 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:03:56.061967 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/2.log" Apr 16 20:03:56.063089 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:03:56.063070 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/2.log" Apr 16 20:03:56.286387 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:03:56.286337 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:03:56.286728 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:03:56.286690 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:04:04.184521 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:04.184492 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" Apr 16 20:04:04.184911 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:04.184543 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" Apr 16 20:04:15.519487 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.519397 2569 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4"] Apr 16 20:04:15.522019 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.519869 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" containerID="cri-o://456138bf2558f93fb2620773ce2982a422bcffade3cdef426a1f5d8aaa4269f7" gracePeriod=30 Apr 16 20:04:15.522019 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.519948 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="agent" containerID="cri-o://fb93a359797586a43a423e781f8221cc45eb008397e36099cae71d2dc0c13095" gracePeriod=30 Apr 16 20:04:15.601670 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.601634 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf"] Apr 16 20:04:15.605323 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.605294 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" Apr 16 20:04:15.613241 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.613213 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf"] Apr 16 20:04:15.642418 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.642381 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7"] Apr 16 20:04:15.646372 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.646346 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" Apr 16 20:04:15.652123 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.652094 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7"] Apr 16 20:04:15.686792 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.686759 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7d234ad-23e4-4aa7-93d8-78e8d575e7dd-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7\" (UID: \"c7d234ad-23e4-4aa7-93d8-78e8d575e7dd\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" Apr 16 20:04:15.788124 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.788014 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d8886c5-d968-46cb-af5e-3c4675d104f7-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf\" (UID: \"1d8886c5-d968-46cb-af5e-3c4675d104f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" Apr 16 20:04:15.788293 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.788212 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7d234ad-23e4-4aa7-93d8-78e8d575e7dd-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7\" (UID: \"c7d234ad-23e4-4aa7-93d8-78e8d575e7dd\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" Apr 16 20:04:15.788566 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.788548 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7d234ad-23e4-4aa7-93d8-78e8d575e7dd-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7\" (UID: \"c7d234ad-23e4-4aa7-93d8-78e8d575e7dd\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" Apr 16 20:04:15.888964 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.888923 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d8886c5-d968-46cb-af5e-3c4675d104f7-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf\" (UID: \"1d8886c5-d968-46cb-af5e-3c4675d104f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" Apr 16 20:04:15.889333 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.889307 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d8886c5-d968-46cb-af5e-3c4675d104f7-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf\" (UID: \"1d8886c5-d968-46cb-af5e-3c4675d104f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" Apr 16 20:04:15.916881 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.916840 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" Apr 16 20:04:15.959980 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:15.959942 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" Apr 16 20:04:16.053761 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:16.053693 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf"] Apr 16 20:04:16.056366 ip-10-0-139-205 kubenswrapper[2569]: W0416 20:04:16.056260 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d8886c5_d968_46cb_af5e_3c4675d104f7.slice/crio-df320b30632e530fc3183174957a858bd37d004dea8d714ced9b120222801ea7 WatchSource:0}: Error finding container df320b30632e530fc3183174957a858bd37d004dea8d714ced9b120222801ea7: Status 404 returned error can't find the container with id df320b30632e530fc3183174957a858bd37d004dea8d714ced9b120222801ea7 Apr 16 20:04:16.058880 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:16.058858 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:04:16.104844 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:16.104810 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7"] Apr 16 20:04:16.108859 ip-10-0-139-205 kubenswrapper[2569]: W0416 20:04:16.108812 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d234ad_23e4_4aa7_93d8_78e8d575e7dd.slice/crio-6e42ea70e43c082a02e2aa958022bb7ef25613e220c1e4ebdbc3190b2a9af2a0 WatchSource:0}: Error finding container 6e42ea70e43c082a02e2aa958022bb7ef25613e220c1e4ebdbc3190b2a9af2a0: Status 404 returned error can't find the container with id 6e42ea70e43c082a02e2aa958022bb7ef25613e220c1e4ebdbc3190b2a9af2a0 Apr 16 20:04:16.583989 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:16.583951 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" event={"ID":"c7d234ad-23e4-4aa7-93d8-78e8d575e7dd","Type":"ContainerStarted","Data":"2274c59afc42ce78c4e341d2e1fa3589826ccfeb8deb44fbb99e247867bcc017"} Apr 16 20:04:16.583989 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:16.583996 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" event={"ID":"c7d234ad-23e4-4aa7-93d8-78e8d575e7dd","Type":"ContainerStarted","Data":"6e42ea70e43c082a02e2aa958022bb7ef25613e220c1e4ebdbc3190b2a9af2a0"} Apr 16 20:04:16.585563 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:16.585535 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" event={"ID":"1d8886c5-d968-46cb-af5e-3c4675d104f7","Type":"ContainerStarted","Data":"d422b0e830602601e0659e3a768b141c9d4b58e951f51f6552b351e0bc5d5845"} Apr 16 20:04:16.585563 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:16.585566 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" event={"ID":"1d8886c5-d968-46cb-af5e-3c4675d104f7","Type":"ContainerStarted","Data":"df320b30632e530fc3183174957a858bd37d004dea8d714ced9b120222801ea7"} Apr 16 20:04:20.604070 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:20.604031 2569 generic.go:358] "Generic (PLEG): container finished" podID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" containerID="2274c59afc42ce78c4e341d2e1fa3589826ccfeb8deb44fbb99e247867bcc017" exitCode=0 Apr 16 20:04:20.604511 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:20.604110 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" event={"ID":"c7d234ad-23e4-4aa7-93d8-78e8d575e7dd","Type":"ContainerDied","Data":"2274c59afc42ce78c4e341d2e1fa3589826ccfeb8deb44fbb99e247867bcc017"} Apr 16 
20:04:20.606137 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:20.606115 2569 generic.go:358] "Generic (PLEG): container finished" podID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerID="456138bf2558f93fb2620773ce2982a422bcffade3cdef426a1f5d8aaa4269f7" exitCode=0 Apr 16 20:04:20.606207 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:20.606170 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" event={"ID":"28fd4b7f-3939-4849-9dc5-7f5bf2e6176d","Type":"ContainerDied","Data":"456138bf2558f93fb2620773ce2982a422bcffade3cdef426a1f5d8aaa4269f7"} Apr 16 20:04:20.607633 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:20.607610 2569 generic.go:358] "Generic (PLEG): container finished" podID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerID="d422b0e830602601e0659e3a768b141c9d4b58e951f51f6552b351e0bc5d5845" exitCode=0 Apr 16 20:04:20.607758 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:20.607684 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" event={"ID":"1d8886c5-d968-46cb-af5e-3c4675d104f7","Type":"ContainerDied","Data":"d422b0e830602601e0659e3a768b141c9d4b58e951f51f6552b351e0bc5d5845"} Apr 16 20:04:21.621277 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:21.621236 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" event={"ID":"1d8886c5-d968-46cb-af5e-3c4675d104f7","Type":"ContainerStarted","Data":"c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1"} Apr 16 20:04:21.621768 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:21.621678 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" Apr 16 20:04:21.623309 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:21.623274 2569 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" podUID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:04:21.636550 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:21.636476 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" podStartSLOduration=6.636460298 podStartE2EDuration="6.636460298s" podCreationTimestamp="2026-04-16 20:04:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:04:21.634831087 +0000 UTC m=+626.049405957" watchObservedRunningTime="2026-04-16 20:04:21.636460298 +0000 UTC m=+626.051035146" Apr 16 20:04:22.626590 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:22.626532 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" podUID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:04:24.179984 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:24.179933 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:04:24.180389 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:24.180296 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="agent" probeResult="failure" output="HTTP probe 
failed with statuscode: 503" Apr 16 20:04:32.626921 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:32.626875 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" podUID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:04:34.179514 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:34.179463 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:04:34.180192 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:34.180160 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:04:40.699473 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:40.699432 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" event={"ID":"c7d234ad-23e4-4aa7-93d8-78e8d575e7dd","Type":"ContainerStarted","Data":"d3e09b69dbacd775b32b1d6fac5a8cb9ec5389d8512f4e11077186983a70af43"} Apr 16 20:04:40.699917 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:40.699792 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" Apr 16 20:04:40.701172 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:40.701142 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" 
podUID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 20:04:40.714166 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:40.714115 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" podStartSLOduration=6.514618739 podStartE2EDuration="25.7140995s" podCreationTimestamp="2026-04-16 20:04:15 +0000 UTC" firstStartedPulling="2026-04-16 20:04:20.60568797 +0000 UTC m=+625.020262799" lastFinishedPulling="2026-04-16 20:04:39.805168734 +0000 UTC m=+644.219743560" observedRunningTime="2026-04-16 20:04:40.712473102 +0000 UTC m=+645.127047951" watchObservedRunningTime="2026-04-16 20:04:40.7140995 +0000 UTC m=+645.128674348" Apr 16 20:04:41.708995 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:41.708954 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" podUID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 20:04:42.626964 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:42.626911 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" podUID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:04:44.179606 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:44.179532 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 
16 20:04:44.180034 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:44.179934 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:04:44.184018 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:44.183994 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" Apr 16 20:04:44.184170 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:44.184040 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" Apr 16 20:04:46.190110 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.190084 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" Apr 16 20:04:46.265298 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.265252 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/28fd4b7f-3939-4849-9dc5-7f5bf2e6176d-kserve-provision-location\") pod \"28fd4b7f-3939-4849-9dc5-7f5bf2e6176d\" (UID: \"28fd4b7f-3939-4849-9dc5-7f5bf2e6176d\") " Apr 16 20:04:46.265806 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.265646 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28fd4b7f-3939-4849-9dc5-7f5bf2e6176d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" (UID: "28fd4b7f-3939-4849-9dc5-7f5bf2e6176d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:04:46.366022 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.365973 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/28fd4b7f-3939-4849-9dc5-7f5bf2e6176d-kserve-provision-location\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:04:46.727930 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.727889 2569 generic.go:358] "Generic (PLEG): container finished" podID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerID="fb93a359797586a43a423e781f8221cc45eb008397e36099cae71d2dc0c13095" exitCode=0 Apr 16 20:04:46.728100 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.727963 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" event={"ID":"28fd4b7f-3939-4849-9dc5-7f5bf2e6176d","Type":"ContainerDied","Data":"fb93a359797586a43a423e781f8221cc45eb008397e36099cae71d2dc0c13095"} Apr 16 20:04:46.728100 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.727975 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" Apr 16 20:04:46.728100 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.728012 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4" event={"ID":"28fd4b7f-3939-4849-9dc5-7f5bf2e6176d","Type":"ContainerDied","Data":"1a82ae9fa616e92b77e959ad631155e76aa60b7f5435708354629415d04a6d2b"} Apr 16 20:04:46.728100 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.728029 2569 scope.go:117] "RemoveContainer" containerID="fb93a359797586a43a423e781f8221cc45eb008397e36099cae71d2dc0c13095" Apr 16 20:04:46.737921 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.737901 2569 scope.go:117] "RemoveContainer" containerID="456138bf2558f93fb2620773ce2982a422bcffade3cdef426a1f5d8aaa4269f7" Apr 16 20:04:46.746413 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.746394 2569 scope.go:117] "RemoveContainer" containerID="edf55ef864ae3adfab2b14b3615c4bf74383dffe357f6f4c4b6fae34761669b9" Apr 16 20:04:46.750068 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.750037 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4"] Apr 16 20:04:46.755034 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.755008 2569 scope.go:117] "RemoveContainer" containerID="fb93a359797586a43a423e781f8221cc45eb008397e36099cae71d2dc0c13095" Apr 16 20:04:46.755348 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:04:46.755329 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb93a359797586a43a423e781f8221cc45eb008397e36099cae71d2dc0c13095\": container with ID starting with fb93a359797586a43a423e781f8221cc45eb008397e36099cae71d2dc0c13095 not found: ID does not exist" containerID="fb93a359797586a43a423e781f8221cc45eb008397e36099cae71d2dc0c13095" Apr 16 20:04:46.755437 
ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.755360 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb93a359797586a43a423e781f8221cc45eb008397e36099cae71d2dc0c13095"} err="failed to get container status \"fb93a359797586a43a423e781f8221cc45eb008397e36099cae71d2dc0c13095\": rpc error: code = NotFound desc = could not find container \"fb93a359797586a43a423e781f8221cc45eb008397e36099cae71d2dc0c13095\": container with ID starting with fb93a359797586a43a423e781f8221cc45eb008397e36099cae71d2dc0c13095 not found: ID does not exist" Apr 16 20:04:46.755437 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.755386 2569 scope.go:117] "RemoveContainer" containerID="456138bf2558f93fb2620773ce2982a422bcffade3cdef426a1f5d8aaa4269f7" Apr 16 20:04:46.755437 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.755424 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-1ce5e-predictor-8bcd69f4d-xj8t4"] Apr 16 20:04:46.755659 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:04:46.755639 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456138bf2558f93fb2620773ce2982a422bcffade3cdef426a1f5d8aaa4269f7\": container with ID starting with 456138bf2558f93fb2620773ce2982a422bcffade3cdef426a1f5d8aaa4269f7 not found: ID does not exist" containerID="456138bf2558f93fb2620773ce2982a422bcffade3cdef426a1f5d8aaa4269f7" Apr 16 20:04:46.755706 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.755665 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456138bf2558f93fb2620773ce2982a422bcffade3cdef426a1f5d8aaa4269f7"} err="failed to get container status \"456138bf2558f93fb2620773ce2982a422bcffade3cdef426a1f5d8aaa4269f7\": rpc error: code = NotFound desc = could not find container \"456138bf2558f93fb2620773ce2982a422bcffade3cdef426a1f5d8aaa4269f7\": container with ID 
starting with 456138bf2558f93fb2620773ce2982a422bcffade3cdef426a1f5d8aaa4269f7 not found: ID does not exist" Apr 16 20:04:46.755706 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.755682 2569 scope.go:117] "RemoveContainer" containerID="edf55ef864ae3adfab2b14b3615c4bf74383dffe357f6f4c4b6fae34761669b9" Apr 16 20:04:46.755947 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:04:46.755930 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edf55ef864ae3adfab2b14b3615c4bf74383dffe357f6f4c4b6fae34761669b9\": container with ID starting with edf55ef864ae3adfab2b14b3615c4bf74383dffe357f6f4c4b6fae34761669b9 not found: ID does not exist" containerID="edf55ef864ae3adfab2b14b3615c4bf74383dffe357f6f4c4b6fae34761669b9" Apr 16 20:04:46.756001 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:46.755951 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf55ef864ae3adfab2b14b3615c4bf74383dffe357f6f4c4b6fae34761669b9"} err="failed to get container status \"edf55ef864ae3adfab2b14b3615c4bf74383dffe357f6f4c4b6fae34761669b9\": rpc error: code = NotFound desc = could not find container \"edf55ef864ae3adfab2b14b3615c4bf74383dffe357f6f4c4b6fae34761669b9\": container with ID starting with edf55ef864ae3adfab2b14b3615c4bf74383dffe357f6f4c4b6fae34761669b9 not found: ID does not exist" Apr 16 20:04:48.184831 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:48.184795 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" path="/var/lib/kubelet/pods/28fd4b7f-3939-4849-9dc5-7f5bf2e6176d/volumes" Apr 16 20:04:51.709114 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:51.709066 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" podUID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 20:04:52.626770 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:04:52.626724 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" podUID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:05:01.709594 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:01.709541 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" podUID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 20:05:02.627496 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:02.627450 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" podUID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:05:11.708772 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:11.708721 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" podUID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 20:05:12.626879 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:12.626827 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" podUID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 
20:05:21.708877 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:21.708829 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" podUID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 20:05:22.626482 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:22.626438 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" podUID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:05:31.708606 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:31.708536 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" podUID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 20:05:32.627497 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:32.627464 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" Apr 16 20:05:41.710443 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:41.710410 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" Apr 16 20:05:45.678714 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.678628 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82"] Apr 16 20:05:45.679061 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.679012 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" Apr 16 20:05:45.679061 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.679023 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" Apr 16 20:05:45.679061 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.679047 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="storage-initializer" Apr 16 20:05:45.679061 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.679053 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="storage-initializer" Apr 16 20:05:45.679061 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.679063 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="agent" Apr 16 20:05:45.679221 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.679069 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="agent" Apr 16 20:05:45.679221 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.679139 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="kserve-container" Apr 16 20:05:45.679221 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.679151 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="28fd4b7f-3939-4849-9dc5-7f5bf2e6176d" containerName="agent" Apr 16 20:05:45.682369 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.682350 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" Apr 16 20:05:45.684366 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.684343 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-c600f-serving-cert\"" Apr 16 20:05:45.684467 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.684364 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:05:45.684467 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.684375 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-c600f-kube-rbac-proxy-sar-config\"" Apr 16 20:05:45.691110 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.691083 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82"] Apr 16 20:05:45.788886 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.788840 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf2465e4-3b45-45a9-9501-8c066f2aca25-openshift-service-ca-bundle\") pod \"model-chainer-raw-c600f-6f845876f-ptr82\" (UID: \"bf2465e4-3b45-45a9-9501-8c066f2aca25\") " pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" Apr 16 20:05:45.789047 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.788908 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf2465e4-3b45-45a9-9501-8c066f2aca25-proxy-tls\") pod \"model-chainer-raw-c600f-6f845876f-ptr82\" (UID: \"bf2465e4-3b45-45a9-9501-8c066f2aca25\") " pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" Apr 16 20:05:45.890108 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.890068 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf2465e4-3b45-45a9-9501-8c066f2aca25-openshift-service-ca-bundle\") pod \"model-chainer-raw-c600f-6f845876f-ptr82\" (UID: \"bf2465e4-3b45-45a9-9501-8c066f2aca25\") " pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" Apr 16 20:05:45.890286 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.890119 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf2465e4-3b45-45a9-9501-8c066f2aca25-proxy-tls\") pod \"model-chainer-raw-c600f-6f845876f-ptr82\" (UID: \"bf2465e4-3b45-45a9-9501-8c066f2aca25\") " pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" Apr 16 20:05:45.890793 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.890767 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf2465e4-3b45-45a9-9501-8c066f2aca25-openshift-service-ca-bundle\") pod \"model-chainer-raw-c600f-6f845876f-ptr82\" (UID: \"bf2465e4-3b45-45a9-9501-8c066f2aca25\") " pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" Apr 16 20:05:45.892600 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.892566 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf2465e4-3b45-45a9-9501-8c066f2aca25-proxy-tls\") pod \"model-chainer-raw-c600f-6f845876f-ptr82\" (UID: \"bf2465e4-3b45-45a9-9501-8c066f2aca25\") " pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" Apr 16 20:05:45.994549 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:45.994440 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" Apr 16 20:05:46.121877 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:46.121850 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82"] Apr 16 20:05:46.124121 ip-10-0-139-205 kubenswrapper[2569]: W0416 20:05:46.124081 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2465e4_3b45_45a9_9501_8c066f2aca25.slice/crio-f64b0beb2951ebba3feed38554a63851108a44b1becb5079fb6e54e0ecfcb0e6 WatchSource:0}: Error finding container f64b0beb2951ebba3feed38554a63851108a44b1becb5079fb6e54e0ecfcb0e6: Status 404 returned error can't find the container with id f64b0beb2951ebba3feed38554a63851108a44b1becb5079fb6e54e0ecfcb0e6 Apr 16 20:05:46.951210 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:46.951160 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" event={"ID":"bf2465e4-3b45-45a9-9501-8c066f2aca25","Type":"ContainerStarted","Data":"f64b0beb2951ebba3feed38554a63851108a44b1becb5079fb6e54e0ecfcb0e6"} Apr 16 20:05:48.960889 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:48.960778 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" event={"ID":"bf2465e4-3b45-45a9-9501-8c066f2aca25","Type":"ContainerStarted","Data":"a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6"} Apr 16 20:05:48.961271 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:48.960885 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" Apr 16 20:05:48.976192 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:48.976144 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" podStartSLOduration=1.435654716 podStartE2EDuration="3.976130767s" podCreationTimestamp="2026-04-16 20:05:45 +0000 UTC" firstStartedPulling="2026-04-16 20:05:46.126311716 +0000 UTC m=+710.540886542" lastFinishedPulling="2026-04-16 20:05:48.666787768 +0000 UTC m=+713.081362593" observedRunningTime="2026-04-16 20:05:48.97537528 +0000 UTC m=+713.389950129" watchObservedRunningTime="2026-04-16 20:05:48.976130767 +0000 UTC m=+713.390705614" Apr 16 20:05:54.970620 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:54.970559 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" Apr 16 20:05:55.734542 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:55.734496 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82"] Apr 16 20:05:55.734803 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:55.734751 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" podUID="bf2465e4-3b45-45a9-9501-8c066f2aca25" containerName="model-chainer-raw-c600f" containerID="cri-o://a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6" gracePeriod=30 Apr 16 20:05:55.911799 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:55.911756 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf"] Apr 16 20:05:55.912038 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:55.912017 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" podUID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerName="kserve-container" containerID="cri-o://c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1" gracePeriod=30 Apr 16 20:05:55.948404 
ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:55.948370 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4"] Apr 16 20:05:55.952617 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:55.952598 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" Apr 16 20:05:55.961433 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:55.961404 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4"] Apr 16 20:05:56.000334 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.000260 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h"] Apr 16 20:05:56.003788 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.003770 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" Apr 16 20:05:56.010482 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.010451 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h"] Apr 16 20:05:56.086024 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.085987 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5045f7e4-d5de-4bec-af8c-4e46ccdd0a40-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4\" (UID: \"5045f7e4-d5de-4bec-af8c-4e46ccdd0a40\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" Apr 16 20:05:56.086203 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.086115 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60663a17-ce51-43a0-b0f3-2faf1d036f53-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h\" (UID: \"60663a17-ce51-43a0-b0f3-2faf1d036f53\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" Apr 16 20:05:56.112481 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.112438 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7"] Apr 16 20:05:56.112906 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.112860 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" podUID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" containerName="kserve-container" containerID="cri-o://d3e09b69dbacd775b32b1d6fac5a8cb9ec5389d8512f4e11077186983a70af43" 
gracePeriod=30
Apr 16 20:05:56.188274 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.186903 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60663a17-ce51-43a0-b0f3-2faf1d036f53-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h\" (UID: \"60663a17-ce51-43a0-b0f3-2faf1d036f53\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h"
Apr 16 20:05:56.188274 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.186984 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5045f7e4-d5de-4bec-af8c-4e46ccdd0a40-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4\" (UID: \"5045f7e4-d5de-4bec-af8c-4e46ccdd0a40\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4"
Apr 16 20:05:56.188274 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.187394 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5045f7e4-d5de-4bec-af8c-4e46ccdd0a40-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4\" (UID: \"5045f7e4-d5de-4bec-af8c-4e46ccdd0a40\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4"
Apr 16 20:05:56.188274 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.187563 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60663a17-ce51-43a0-b0f3-2faf1d036f53-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h\" (UID: \"60663a17-ce51-43a0-b0f3-2faf1d036f53\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h"
Apr 16 20:05:56.264730 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.264635 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4"
Apr 16 20:05:56.315666 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.315643 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h"
Apr 16 20:05:56.408951 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.408917 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4"]
Apr 16 20:05:56.458739 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.458707 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h"]
Apr 16 20:05:56.990947 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.990899 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" event={"ID":"60663a17-ce51-43a0-b0f3-2faf1d036f53","Type":"ContainerStarted","Data":"8e76e8e18a582416e48d76667ec1ede07018b38629f346030ae916f63c2bd0a3"}
Apr 16 20:05:56.990947 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.990953 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" event={"ID":"60663a17-ce51-43a0-b0f3-2faf1d036f53","Type":"ContainerStarted","Data":"f00c9ee055002276a69c9d202948c9f92115fab4ea950b7c32b203532907aa1f"}
Apr 16 20:05:56.992454 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.992422 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" event={"ID":"5045f7e4-d5de-4bec-af8c-4e46ccdd0a40","Type":"ContainerStarted","Data":"6bcfdd0a3434ddd62f2090c4176c95337b0702c4c8935cd6d25a9a68d678b9cd"}
Apr 16 20:05:56.992610 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:56.992458 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" event={"ID":"5045f7e4-d5de-4bec-af8c-4e46ccdd0a40","Type":"ContainerStarted","Data":"61459d1d38437067153040e86740a5f34ee41bf758c5ba33fc0fe216fe628bea"}
Apr 16 20:05:59.969013 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:05:59.968968 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" podUID="bf2465e4-3b45-45a9-9501-8c066f2aca25" containerName="model-chainer-raw-c600f" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:06:00.010925 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:00.010683 2569 generic.go:358] "Generic (PLEG): container finished" podID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" containerID="d3e09b69dbacd775b32b1d6fac5a8cb9ec5389d8512f4e11077186983a70af43" exitCode=0
Apr 16 20:06:00.010925 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:00.010873 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" event={"ID":"c7d234ad-23e4-4aa7-93d8-78e8d575e7dd","Type":"ContainerDied","Data":"d3e09b69dbacd775b32b1d6fac5a8cb9ec5389d8512f4e11077186983a70af43"}
Apr 16 20:06:00.067919 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:00.067893 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7"
Apr 16 20:06:00.228777 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:00.228742 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7d234ad-23e4-4aa7-93d8-78e8d575e7dd-kserve-provision-location\") pod \"c7d234ad-23e4-4aa7-93d8-78e8d575e7dd\" (UID: \"c7d234ad-23e4-4aa7-93d8-78e8d575e7dd\") "
Apr 16 20:06:00.229071 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:00.229047 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d234ad-23e4-4aa7-93d8-78e8d575e7dd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" (UID: "c7d234ad-23e4-4aa7-93d8-78e8d575e7dd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:06:00.329604 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:00.329543 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7d234ad-23e4-4aa7-93d8-78e8d575e7dd-kserve-provision-location\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 20:06:00.867711 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:00.867684 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf"
Apr 16 20:06:01.015976 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.015889 2569 generic.go:358] "Generic (PLEG): container finished" podID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerID="6bcfdd0a3434ddd62f2090c4176c95337b0702c4c8935cd6d25a9a68d678b9cd" exitCode=0
Apr 16 20:06:01.016375 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.015968 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" event={"ID":"5045f7e4-d5de-4bec-af8c-4e46ccdd0a40","Type":"ContainerDied","Data":"6bcfdd0a3434ddd62f2090c4176c95337b0702c4c8935cd6d25a9a68d678b9cd"}
Apr 16 20:06:01.017417 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.017389 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7"
Apr 16 20:06:01.017525 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.017384 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7" event={"ID":"c7d234ad-23e4-4aa7-93d8-78e8d575e7dd","Type":"ContainerDied","Data":"6e42ea70e43c082a02e2aa958022bb7ef25613e220c1e4ebdbc3190b2a9af2a0"}
Apr 16 20:06:01.017525 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.017507 2569 scope.go:117] "RemoveContainer" containerID="d3e09b69dbacd775b32b1d6fac5a8cb9ec5389d8512f4e11077186983a70af43"
Apr 16 20:06:01.018778 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.018752 2569 generic.go:358] "Generic (PLEG): container finished" podID="60663a17-ce51-43a0-b0f3-2faf1d036f53" containerID="8e76e8e18a582416e48d76667ec1ede07018b38629f346030ae916f63c2bd0a3" exitCode=0
Apr 16 20:06:01.018895 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.018850 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" event={"ID":"60663a17-ce51-43a0-b0f3-2faf1d036f53","Type":"ContainerDied","Data":"8e76e8e18a582416e48d76667ec1ede07018b38629f346030ae916f63c2bd0a3"}
Apr 16 20:06:01.020953 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.020932 2569 generic.go:358] "Generic (PLEG): container finished" podID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerID="c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1" exitCode=0
Apr 16 20:06:01.021055 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.020996 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf"
Apr 16 20:06:01.021055 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.021006 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" event={"ID":"1d8886c5-d968-46cb-af5e-3c4675d104f7","Type":"ContainerDied","Data":"c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1"}
Apr 16 20:06:01.021055 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.021036 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf" event={"ID":"1d8886c5-d968-46cb-af5e-3c4675d104f7","Type":"ContainerDied","Data":"df320b30632e530fc3183174957a858bd37d004dea8d714ced9b120222801ea7"}
Apr 16 20:06:01.026701 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.026563 2569 scope.go:117] "RemoveContainer" containerID="2274c59afc42ce78c4e341d2e1fa3589826ccfeb8deb44fbb99e247867bcc017"
Apr 16 20:06:01.034900 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.034875 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d8886c5-d968-46cb-af5e-3c4675d104f7-kserve-provision-location\") pod \"1d8886c5-d968-46cb-af5e-3c4675d104f7\" (UID: \"1d8886c5-d968-46cb-af5e-3c4675d104f7\") "
Apr 16 20:06:01.035193 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.035166 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d8886c5-d968-46cb-af5e-3c4675d104f7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1d8886c5-d968-46cb-af5e-3c4675d104f7" (UID: "1d8886c5-d968-46cb-af5e-3c4675d104f7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:06:01.039481 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.039458 2569 scope.go:117] "RemoveContainer" containerID="c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1"
Apr 16 20:06:01.060830 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.060695 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7"]
Apr 16 20:06:01.060885 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.060871 2569 scope.go:117] "RemoveContainer" containerID="d422b0e830602601e0659e3a768b141c9d4b58e951f51f6552b351e0bc5d5845"
Apr 16 20:06:01.067176 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.067144 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-c600f-predictor-695c695d99-nbrm7"]
Apr 16 20:06:01.080049 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.080018 2569 scope.go:117] "RemoveContainer" containerID="c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1"
Apr 16 20:06:01.080393 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:06:01.080369 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1\": container with ID starting with c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1 not found: ID does not exist" containerID="c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1"
Apr 16 20:06:01.080490 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.080408 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1"} err="failed to get container status \"c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1\": rpc error: code = NotFound desc = could not find container \"c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1\": container with ID starting with c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1 not found: ID does not exist"
Apr 16 20:06:01.080490 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.080433 2569 scope.go:117] "RemoveContainer" containerID="d422b0e830602601e0659e3a768b141c9d4b58e951f51f6552b351e0bc5d5845"
Apr 16 20:06:01.080750 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:06:01.080727 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d422b0e830602601e0659e3a768b141c9d4b58e951f51f6552b351e0bc5d5845\": container with ID starting with d422b0e830602601e0659e3a768b141c9d4b58e951f51f6552b351e0bc5d5845 not found: ID does not exist" containerID="d422b0e830602601e0659e3a768b141c9d4b58e951f51f6552b351e0bc5d5845"
Apr 16 20:06:01.080832 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.080755 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d422b0e830602601e0659e3a768b141c9d4b58e951f51f6552b351e0bc5d5845"} err="failed to get container status \"d422b0e830602601e0659e3a768b141c9d4b58e951f51f6552b351e0bc5d5845\": rpc error: code = NotFound desc = could not find container \"d422b0e830602601e0659e3a768b141c9d4b58e951f51f6552b351e0bc5d5845\": container with ID starting with d422b0e830602601e0659e3a768b141c9d4b58e951f51f6552b351e0bc5d5845 not found: ID does not exist"
Apr 16 20:06:01.136132 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.136080 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d8886c5-d968-46cb-af5e-3c4675d104f7-kserve-provision-location\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 20:06:01.345357 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.345319 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf"]
Apr 16 20:06:01.348946 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:01.348916 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf"]
Apr 16 20:06:02.025873 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:02.025838 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" event={"ID":"5045f7e4-d5de-4bec-af8c-4e46ccdd0a40","Type":"ContainerStarted","Data":"1260ae8f953f4aed31c10a08cfc17dfae45e3173f4dc043f42ff83008bb0503e"}
Apr 16 20:06:02.026352 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:02.026175 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4"
Apr 16 20:06:02.027979 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:02.027949 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" podUID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 16 20:06:02.028260 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:02.028239 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" event={"ID":"60663a17-ce51-43a0-b0f3-2faf1d036f53","Type":"ContainerStarted","Data":"06da7890259af10e6e301a7c8015788d67bffe28efb0e2f45b82ff2520ea5142"}
Apr 16 20:06:02.028571 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:02.028552 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h"
Apr 16 20:06:02.029774 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:02.029750 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" podUID="60663a17-ce51-43a0-b0f3-2faf1d036f53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 16 20:06:02.043764 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:02.043712 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" podStartSLOduration=7.043696775 podStartE2EDuration="7.043696775s" podCreationTimestamp="2026-04-16 20:05:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:06:02.041602801 +0000 UTC m=+726.456177645" watchObservedRunningTime="2026-04-16 20:06:02.043696775 +0000 UTC m=+726.458271623"
Apr 16 20:06:02.059476 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:02.059424 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" podStartSLOduration=7.05940876 podStartE2EDuration="7.05940876s" podCreationTimestamp="2026-04-16 20:05:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:06:02.057375454 +0000 UTC m=+726.471950298" watchObservedRunningTime="2026-04-16 20:06:02.05940876 +0000 UTC m=+726.473983607"
Apr 16 20:06:02.183862 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:02.183831 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d8886c5-d968-46cb-af5e-3c4675d104f7" path="/var/lib/kubelet/pods/1d8886c5-d968-46cb-af5e-3c4675d104f7/volumes"
Apr 16 20:06:02.184210 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:02.184197 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" path="/var/lib/kubelet/pods/c7d234ad-23e4-4aa7-93d8-78e8d575e7dd/volumes"
Apr 16 20:06:03.033346 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:03.033303 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" podUID="60663a17-ce51-43a0-b0f3-2faf1d036f53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 16 20:06:03.033763 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:03.033302 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" podUID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 16 20:06:04.969036 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:04.968997 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" podUID="bf2465e4-3b45-45a9-9501-8c066f2aca25" containerName="model-chainer-raw-c600f" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:06:09.968018 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:09.967980 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" podUID="bf2465e4-3b45-45a9-9501-8c066f2aca25" containerName="model-chainer-raw-c600f" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:06:09.968517 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:09.968092 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82"
Apr 16 20:06:13.034275 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:13.034231 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" podUID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 16 20:06:13.034779 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:13.034240 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" podUID="60663a17-ce51-43a0-b0f3-2faf1d036f53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 16 20:06:14.968652 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:14.968602 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" podUID="bf2465e4-3b45-45a9-9501-8c066f2aca25" containerName="model-chainer-raw-c600f" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:06:19.968257 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:19.968215 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" podUID="bf2465e4-3b45-45a9-9501-8c066f2aca25" containerName="model-chainer-raw-c600f" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:06:20.652324 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:06:20.652284 2569 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/af7774bda1eafb7b6d065c8055163a077595841c68bca411157cdd5632ae59e6/diff" to get inode usage: stat /var/lib/containers/storage/overlay/af7774bda1eafb7b6d065c8055163a077595841c68bca411157cdd5632ae59e6/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf_1d8886c5-d968-46cb-af5e-3c4675d104f7/kserve-container/0.log" to get inode usage: stat /var/log/pods/kserve-ci-e2e-test_isvc-sklearn-graph-raw-c600f-predictor-7f8b8cc8fd-4m4wf_1d8886c5-d968-46cb-af5e-3c4675d104f7/kserve-container/0.log: no such file or directory
Apr 16 20:06:23.033597 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:23.033542 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" podUID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 16 20:06:23.033978 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:23.033541 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" podUID="60663a17-ce51-43a0-b0f3-2faf1d036f53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 16 20:06:24.968271 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:24.968228 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" podUID="bf2465e4-3b45-45a9-9501-8c066f2aca25" containerName="model-chainer-raw-c600f" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:06:25.763723 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:06:25.763665 2569 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d8886c5_d968_46cb_af5e_3c4675d104f7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d8886c5_d968_46cb_af5e_3c4675d104f7.slice/crio-df320b30632e530fc3183174957a858bd37d004dea8d714ced9b120222801ea7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2465e4_3b45_45a9_9501_8c066f2aca25.slice/crio-a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2465e4_3b45_45a9_9501_8c066f2aca25.slice/crio-conmon-a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 20:06:25.763723 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:06:25.763669 2569 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d8886c5_d968_46cb_af5e_3c4675d104f7.slice/crio-df320b30632e530fc3183174957a858bd37d004dea8d714ced9b120222801ea7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d8886c5_d968_46cb_af5e_3c4675d104f7.slice/crio-conmon-c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d234ad_23e4_4aa7_93d8_78e8d575e7dd.slice/crio-conmon-d3e09b69dbacd775b32b1d6fac5a8cb9ec5389d8512f4e11077186983a70af43.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d234ad_23e4_4aa7_93d8_78e8d575e7dd.slice/crio-6e42ea70e43c082a02e2aa958022bb7ef25613e220c1e4ebdbc3190b2a9af2a0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d234ad_23e4_4aa7_93d8_78e8d575e7dd.slice/crio-d3e09b69dbacd775b32b1d6fac5a8cb9ec5389d8512f4e11077186983a70af43.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d8886c5_d968_46cb_af5e_3c4675d104f7.slice/crio-c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2465e4_3b45_45a9_9501_8c066f2aca25.slice/crio-conmon-a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d234ad_23e4_4aa7_93d8_78e8d575e7dd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d8886c5_d968_46cb_af5e_3c4675d104f7.slice\": RecentStats: unable to find data in memory cache]"
Apr 16 20:06:25.763984 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:06:25.763765 2569 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d8886c5_d968_46cb_af5e_3c4675d104f7.slice/crio-conmon-c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d234ad_23e4_4aa7_93d8_78e8d575e7dd.slice/crio-6e42ea70e43c082a02e2aa958022bb7ef25613e220c1e4ebdbc3190b2a9af2a0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d234ad_23e4_4aa7_93d8_78e8d575e7dd.slice/crio-conmon-d3e09b69dbacd775b32b1d6fac5a8cb9ec5389d8512f4e11077186983a70af43.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d234ad_23e4_4aa7_93d8_78e8d575e7dd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d8886c5_d968_46cb_af5e_3c4675d104f7.slice/crio-df320b30632e530fc3183174957a858bd37d004dea8d714ced9b120222801ea7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2465e4_3b45_45a9_9501_8c066f2aca25.slice/crio-conmon-a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d234ad_23e4_4aa7_93d8_78e8d575e7dd.slice/crio-d3e09b69dbacd775b32b1d6fac5a8cb9ec5389d8512f4e11077186983a70af43.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d8886c5_d968_46cb_af5e_3c4675d104f7.slice/crio-c68a04fc8d352043baa5aebe9d1a54e8b322f406d5d5b214d7a1fdccf5b038f1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2465e4_3b45_45a9_9501_8c066f2aca25.slice/crio-a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 20:06:25.911129 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:25.911101 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82"
Apr 16 20:06:25.963565 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:25.963520 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf2465e4-3b45-45a9-9501-8c066f2aca25-proxy-tls\") pod \"bf2465e4-3b45-45a9-9501-8c066f2aca25\" (UID: \"bf2465e4-3b45-45a9-9501-8c066f2aca25\") "
Apr 16 20:06:25.963789 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:25.963745 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf2465e4-3b45-45a9-9501-8c066f2aca25-openshift-service-ca-bundle\") pod \"bf2465e4-3b45-45a9-9501-8c066f2aca25\" (UID: \"bf2465e4-3b45-45a9-9501-8c066f2aca25\") "
Apr 16 20:06:25.964093 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:25.964061 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf2465e4-3b45-45a9-9501-8c066f2aca25-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "bf2465e4-3b45-45a9-9501-8c066f2aca25" (UID: "bf2465e4-3b45-45a9-9501-8c066f2aca25"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:06:25.965975 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:25.965944 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf2465e4-3b45-45a9-9501-8c066f2aca25-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bf2465e4-3b45-45a9-9501-8c066f2aca25" (UID: "bf2465e4-3b45-45a9-9501-8c066f2aca25"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:06:26.065164 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:26.065072 2569 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf2465e4-3b45-45a9-9501-8c066f2aca25-openshift-service-ca-bundle\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 20:06:26.065164 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:26.065108 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf2465e4-3b45-45a9-9501-8c066f2aca25-proxy-tls\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 20:06:26.121294 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:26.121259 2569 generic.go:358] "Generic (PLEG): container finished" podID="bf2465e4-3b45-45a9-9501-8c066f2aca25" containerID="a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6" exitCode=0
Apr 16 20:06:26.121459 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:26.121343 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82"
Apr 16 20:06:26.121459 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:26.121351 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" event={"ID":"bf2465e4-3b45-45a9-9501-8c066f2aca25","Type":"ContainerDied","Data":"a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6"}
Apr 16 20:06:26.121459 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:26.121390 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82" event={"ID":"bf2465e4-3b45-45a9-9501-8c066f2aca25","Type":"ContainerDied","Data":"f64b0beb2951ebba3feed38554a63851108a44b1becb5079fb6e54e0ecfcb0e6"}
Apr 16 20:06:26.121459 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:26.121406 2569 scope.go:117] "RemoveContainer" containerID="a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6"
Apr 16 20:06:26.130683 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:26.130656 2569 scope.go:117] "RemoveContainer" containerID="a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6"
Apr 16 20:06:26.131048 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:06:26.131023 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6\": container with ID starting with a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6 not found: ID does not exist" containerID="a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6"
Apr 16 20:06:26.131137 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:26.131065 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6"} err="failed to get container status \"a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6\": rpc error: code = NotFound desc = could not find container \"a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6\": container with ID starting with a67fe4d1b6559e3d70f97ff4564c9c830ecbcc5913876b0f49f99d0159ed7ef6 not found: ID does not exist"
Apr 16 20:06:26.143210 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:26.143176 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82"]
Apr 16 20:06:26.146519 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:26.146487 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-c600f-6f845876f-ptr82"]
Apr 16 20:06:26.184129 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:26.184096 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf2465e4-3b45-45a9-9501-8c066f2aca25" path="/var/lib/kubelet/pods/bf2465e4-3b45-45a9-9501-8c066f2aca25/volumes"
Apr 16 20:06:33.034043 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:33.033985 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" podUID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 16 20:06:33.034541 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:33.033985 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" podUID="60663a17-ce51-43a0-b0f3-2faf1d036f53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 16 20:06:43.033468 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:43.033413 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" podUID="60663a17-ce51-43a0-b0f3-2faf1d036f53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 16 20:06:43.034087 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:43.033413 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" podUID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 16 20:06:53.033963 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:53.033896 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" podUID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 16 20:06:53.034494 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:06:53.033896 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" podUID="60663a17-ce51-43a0-b0f3-2faf1d036f53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 16 20:07:03.033738 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:03.033674 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" podUID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 16 20:07:03.034212 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:03.034141 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h"
Apr 16 20:07:13.034266 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:13.034233 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4"
Apr 16 20:07:36.205565 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:36.205524 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4"]
Apr 16 20:07:36.206084 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:36.205906 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" podUID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerName="kserve-container" containerID="cri-o://1260ae8f953f4aed31c10a08cfc17dfae45e3173f4dc043f42ff83008bb0503e" gracePeriod=30
Apr 16 20:07:36.319294 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:36.319259 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h"]
Apr 16 20:07:36.319557 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:36.319526 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" podUID="60663a17-ce51-43a0-b0f3-2faf1d036f53" containerName="kserve-container" containerID="cri-o://06da7890259af10e6e301a7c8015788d67bffe28efb0e2f45b82ff2520ea5142" gracePeriod=30
Apr 16 20:07:40.263414 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:40.263387 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" Apr 16 20:07:40.335459 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:40.335351 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60663a17-ce51-43a0-b0f3-2faf1d036f53-kserve-provision-location\") pod \"60663a17-ce51-43a0-b0f3-2faf1d036f53\" (UID: \"60663a17-ce51-43a0-b0f3-2faf1d036f53\") " Apr 16 20:07:40.335713 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:40.335687 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60663a17-ce51-43a0-b0f3-2faf1d036f53-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "60663a17-ce51-43a0-b0f3-2faf1d036f53" (UID: "60663a17-ce51-43a0-b0f3-2faf1d036f53"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:07:40.392812 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:40.392778 2569 generic.go:358] "Generic (PLEG): container finished" podID="60663a17-ce51-43a0-b0f3-2faf1d036f53" containerID="06da7890259af10e6e301a7c8015788d67bffe28efb0e2f45b82ff2520ea5142" exitCode=0 Apr 16 20:07:40.393043 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:40.392848 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" Apr 16 20:07:40.393043 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:40.392865 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" event={"ID":"60663a17-ce51-43a0-b0f3-2faf1d036f53","Type":"ContainerDied","Data":"06da7890259af10e6e301a7c8015788d67bffe28efb0e2f45b82ff2520ea5142"} Apr 16 20:07:40.393043 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:40.392905 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h" event={"ID":"60663a17-ce51-43a0-b0f3-2faf1d036f53","Type":"ContainerDied","Data":"f00c9ee055002276a69c9d202948c9f92115fab4ea950b7c32b203532907aa1f"} Apr 16 20:07:40.393043 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:40.392921 2569 scope.go:117] "RemoveContainer" containerID="06da7890259af10e6e301a7c8015788d67bffe28efb0e2f45b82ff2520ea5142" Apr 16 20:07:40.401392 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:40.401375 2569 scope.go:117] "RemoveContainer" containerID="8e76e8e18a582416e48d76667ec1ede07018b38629f346030ae916f63c2bd0a3" Apr 16 20:07:40.409091 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:40.409070 2569 scope.go:117] "RemoveContainer" containerID="06da7890259af10e6e301a7c8015788d67bffe28efb0e2f45b82ff2520ea5142" Apr 16 20:07:40.409370 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:07:40.409351 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06da7890259af10e6e301a7c8015788d67bffe28efb0e2f45b82ff2520ea5142\": container with ID starting with 06da7890259af10e6e301a7c8015788d67bffe28efb0e2f45b82ff2520ea5142 not found: ID does not exist" containerID="06da7890259af10e6e301a7c8015788d67bffe28efb0e2f45b82ff2520ea5142" Apr 16 20:07:40.409443 ip-10-0-139-205 kubenswrapper[2569]: I0416 
20:07:40.409385 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06da7890259af10e6e301a7c8015788d67bffe28efb0e2f45b82ff2520ea5142"} err="failed to get container status \"06da7890259af10e6e301a7c8015788d67bffe28efb0e2f45b82ff2520ea5142\": rpc error: code = NotFound desc = could not find container \"06da7890259af10e6e301a7c8015788d67bffe28efb0e2f45b82ff2520ea5142\": container with ID starting with 06da7890259af10e6e301a7c8015788d67bffe28efb0e2f45b82ff2520ea5142 not found: ID does not exist" Apr 16 20:07:40.409443 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:40.409411 2569 scope.go:117] "RemoveContainer" containerID="8e76e8e18a582416e48d76667ec1ede07018b38629f346030ae916f63c2bd0a3" Apr 16 20:07:40.409864 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:07:40.409834 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e76e8e18a582416e48d76667ec1ede07018b38629f346030ae916f63c2bd0a3\": container with ID starting with 8e76e8e18a582416e48d76667ec1ede07018b38629f346030ae916f63c2bd0a3 not found: ID does not exist" containerID="8e76e8e18a582416e48d76667ec1ede07018b38629f346030ae916f63c2bd0a3" Apr 16 20:07:40.409957 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:40.409873 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e76e8e18a582416e48d76667ec1ede07018b38629f346030ae916f63c2bd0a3"} err="failed to get container status \"8e76e8e18a582416e48d76667ec1ede07018b38629f346030ae916f63c2bd0a3\": rpc error: code = NotFound desc = could not find container \"8e76e8e18a582416e48d76667ec1ede07018b38629f346030ae916f63c2bd0a3\": container with ID starting with 8e76e8e18a582416e48d76667ec1ede07018b38629f346030ae916f63c2bd0a3 not found: ID does not exist" Apr 16 20:07:40.412044 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:40.412021 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h"] Apr 16 20:07:40.415384 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:40.415358 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-78192-predictor-5dd8cfd66c-l7h2h"] Apr 16 20:07:40.435935 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:40.435897 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60663a17-ce51-43a0-b0f3-2faf1d036f53-kserve-provision-location\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:07:41.062074 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.062050 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" Apr 16 20:07:41.142511 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.142413 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5045f7e4-d5de-4bec-af8c-4e46ccdd0a40-kserve-provision-location\") pod \"5045f7e4-d5de-4bec-af8c-4e46ccdd0a40\" (UID: \"5045f7e4-d5de-4bec-af8c-4e46ccdd0a40\") " Apr 16 20:07:41.142815 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.142791 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5045f7e4-d5de-4bec-af8c-4e46ccdd0a40-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" (UID: "5045f7e4-d5de-4bec-af8c-4e46ccdd0a40"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:07:41.243978 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.243936 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5045f7e4-d5de-4bec-af8c-4e46ccdd0a40-kserve-provision-location\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:07:41.398925 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.398839 2569 generic.go:358] "Generic (PLEG): container finished" podID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerID="1260ae8f953f4aed31c10a08cfc17dfae45e3173f4dc043f42ff83008bb0503e" exitCode=0 Apr 16 20:07:41.398925 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.398909 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" Apr 16 20:07:41.399494 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.398918 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" event={"ID":"5045f7e4-d5de-4bec-af8c-4e46ccdd0a40","Type":"ContainerDied","Data":"1260ae8f953f4aed31c10a08cfc17dfae45e3173f4dc043f42ff83008bb0503e"} Apr 16 20:07:41.399494 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.398952 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4" event={"ID":"5045f7e4-d5de-4bec-af8c-4e46ccdd0a40","Type":"ContainerDied","Data":"61459d1d38437067153040e86740a5f34ee41bf758c5ba33fc0fe216fe628bea"} Apr 16 20:07:41.399494 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.398968 2569 scope.go:117] "RemoveContainer" containerID="1260ae8f953f4aed31c10a08cfc17dfae45e3173f4dc043f42ff83008bb0503e" Apr 16 20:07:41.407687 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.407664 2569 scope.go:117] "RemoveContainer" 
containerID="6bcfdd0a3434ddd62f2090c4176c95337b0702c4c8935cd6d25a9a68d678b9cd" Apr 16 20:07:41.415447 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.415426 2569 scope.go:117] "RemoveContainer" containerID="1260ae8f953f4aed31c10a08cfc17dfae45e3173f4dc043f42ff83008bb0503e" Apr 16 20:07:41.415768 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:07:41.415737 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1260ae8f953f4aed31c10a08cfc17dfae45e3173f4dc043f42ff83008bb0503e\": container with ID starting with 1260ae8f953f4aed31c10a08cfc17dfae45e3173f4dc043f42ff83008bb0503e not found: ID does not exist" containerID="1260ae8f953f4aed31c10a08cfc17dfae45e3173f4dc043f42ff83008bb0503e" Apr 16 20:07:41.415872 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.415780 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1260ae8f953f4aed31c10a08cfc17dfae45e3173f4dc043f42ff83008bb0503e"} err="failed to get container status \"1260ae8f953f4aed31c10a08cfc17dfae45e3173f4dc043f42ff83008bb0503e\": rpc error: code = NotFound desc = could not find container \"1260ae8f953f4aed31c10a08cfc17dfae45e3173f4dc043f42ff83008bb0503e\": container with ID starting with 1260ae8f953f4aed31c10a08cfc17dfae45e3173f4dc043f42ff83008bb0503e not found: ID does not exist" Apr 16 20:07:41.415872 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.415805 2569 scope.go:117] "RemoveContainer" containerID="6bcfdd0a3434ddd62f2090c4176c95337b0702c4c8935cd6d25a9a68d678b9cd" Apr 16 20:07:41.416072 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:07:41.416056 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bcfdd0a3434ddd62f2090c4176c95337b0702c4c8935cd6d25a9a68d678b9cd\": container with ID starting with 6bcfdd0a3434ddd62f2090c4176c95337b0702c4c8935cd6d25a9a68d678b9cd not found: ID does not exist" 
containerID="6bcfdd0a3434ddd62f2090c4176c95337b0702c4c8935cd6d25a9a68d678b9cd" Apr 16 20:07:41.416127 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.416076 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bcfdd0a3434ddd62f2090c4176c95337b0702c4c8935cd6d25a9a68d678b9cd"} err="failed to get container status \"6bcfdd0a3434ddd62f2090c4176c95337b0702c4c8935cd6d25a9a68d678b9cd\": rpc error: code = NotFound desc = could not find container \"6bcfdd0a3434ddd62f2090c4176c95337b0702c4c8935cd6d25a9a68d678b9cd\": container with ID starting with 6bcfdd0a3434ddd62f2090c4176c95337b0702c4c8935cd6d25a9a68d678b9cd not found: ID does not exist" Apr 16 20:07:41.419255 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.419229 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4"] Apr 16 20:07:41.423226 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:41.423200 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-78192-predictor-774df4866-th9q4"] Apr 16 20:07:42.183802 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:42.183763 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" path="/var/lib/kubelet/pods/5045f7e4-d5de-4bec-af8c-4e46ccdd0a40/volumes" Apr 16 20:07:42.184160 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:42.184145 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60663a17-ce51-43a0-b0f3-2faf1d036f53" path="/var/lib/kubelet/pods/60663a17-ce51-43a0-b0f3-2faf1d036f53/volumes" Apr 16 20:07:46.265692 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.265656 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm"] Apr 16 20:07:46.266064 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266023 2569 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" containerName="kserve-container" Apr 16 20:07:46.266064 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266035 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" containerName="kserve-container" Apr 16 20:07:46.266064 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266046 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerName="storage-initializer" Apr 16 20:07:46.266064 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266052 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerName="storage-initializer" Apr 16 20:07:46.266064 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266058 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60663a17-ce51-43a0-b0f3-2faf1d036f53" containerName="storage-initializer" Apr 16 20:07:46.266064 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266064 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="60663a17-ce51-43a0-b0f3-2faf1d036f53" containerName="storage-initializer" Apr 16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266073 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerName="kserve-container" Apr 16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266078 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerName="kserve-container" Apr 16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266086 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf2465e4-3b45-45a9-9501-8c066f2aca25" containerName="model-chainer-raw-c600f" Apr 16 20:07:46.266253 ip-10-0-139-205 
kubenswrapper[2569]: I0416 20:07:46.266091 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2465e4-3b45-45a9-9501-8c066f2aca25" containerName="model-chainer-raw-c600f" Apr 16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266100 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" containerName="storage-initializer" Apr 16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266105 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" containerName="storage-initializer" Apr 16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266117 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerName="kserve-container" Apr 16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266122 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerName="kserve-container" Apr 16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266128 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60663a17-ce51-43a0-b0f3-2faf1d036f53" containerName="kserve-container" Apr 16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266133 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="60663a17-ce51-43a0-b0f3-2faf1d036f53" containerName="kserve-container" Apr 16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266141 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerName="storage-initializer" Apr 16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266146 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerName="storage-initializer" Apr 
16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266208 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf2465e4-3b45-45a9-9501-8c066f2aca25" containerName="model-chainer-raw-c600f" Apr 16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266217 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="60663a17-ce51-43a0-b0f3-2faf1d036f53" containerName="kserve-container" Apr 16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266223 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5045f7e4-d5de-4bec-af8c-4e46ccdd0a40" containerName="kserve-container" Apr 16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266232 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d8886c5-d968-46cb-af5e-3c4675d104f7" containerName="kserve-container" Apr 16 20:07:46.266253 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.266238 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7d234ad-23e4-4aa7-93d8-78e8d575e7dd" containerName="kserve-container" Apr 16 20:07:46.269240 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.269214 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" Apr 16 20:07:46.270975 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.270950 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-g6xwp\"" Apr 16 20:07:46.275860 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.275836 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm"] Apr 16 20:07:46.287761 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.287727 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a51e5e49-b020-4bca-98aa-7f0c02d441ca-kserve-provision-location\") pod \"isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm\" (UID: \"a51e5e49-b020-4bca-98aa-7f0c02d441ca\") " pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" Apr 16 20:07:46.388921 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.388879 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a51e5e49-b020-4bca-98aa-7f0c02d441ca-kserve-provision-location\") pod \"isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm\" (UID: \"a51e5e49-b020-4bca-98aa-7f0c02d441ca\") " pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" Apr 16 20:07:46.389315 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.389288 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a51e5e49-b020-4bca-98aa-7f0c02d441ca-kserve-provision-location\") pod \"isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm\" (UID: \"a51e5e49-b020-4bca-98aa-7f0c02d441ca\") " pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" Apr 16 20:07:46.581444 
ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.581346 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" Apr 16 20:07:46.714680 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:46.714654 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm"] Apr 16 20:07:46.716719 ip-10-0-139-205 kubenswrapper[2569]: W0416 20:07:46.716691 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda51e5e49_b020_4bca_98aa_7f0c02d441ca.slice/crio-8d480e009133011c53a79048406dac02d7f811ce3139c95b3758caaf4ebca0c9 WatchSource:0}: Error finding container 8d480e009133011c53a79048406dac02d7f811ce3139c95b3758caaf4ebca0c9: Status 404 returned error can't find the container with id 8d480e009133011c53a79048406dac02d7f811ce3139c95b3758caaf4ebca0c9 Apr 16 20:07:47.423372 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:47.423329 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" event={"ID":"a51e5e49-b020-4bca-98aa-7f0c02d441ca","Type":"ContainerStarted","Data":"a1278b2df2e123856bcfc01d6374f8fdd4b1a6b56fefa3bc7ae7661b17ebd726"} Apr 16 20:07:47.423372 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:47.423377 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" event={"ID":"a51e5e49-b020-4bca-98aa-7f0c02d441ca","Type":"ContainerStarted","Data":"8d480e009133011c53a79048406dac02d7f811ce3139c95b3758caaf4ebca0c9"} Apr 16 20:07:51.437793 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:51.437757 2569 generic.go:358] "Generic (PLEG): container finished" podID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerID="a1278b2df2e123856bcfc01d6374f8fdd4b1a6b56fefa3bc7ae7661b17ebd726" exitCode=0 Apr 16 
20:07:51.438209 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:51.437829 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" event={"ID":"a51e5e49-b020-4bca-98aa-7f0c02d441ca","Type":"ContainerDied","Data":"a1278b2df2e123856bcfc01d6374f8fdd4b1a6b56fefa3bc7ae7661b17ebd726"} Apr 16 20:07:52.443647 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:52.443609 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" event={"ID":"a51e5e49-b020-4bca-98aa-7f0c02d441ca","Type":"ContainerStarted","Data":"a35f3c0c0c2beb6b5dc3d35e81bb641852635c0cb52709bb0dc317c649866a9a"} Apr 16 20:07:52.443647 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:52.443653 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" event={"ID":"a51e5e49-b020-4bca-98aa-7f0c02d441ca","Type":"ContainerStarted","Data":"2a6bc439d19282677cc600d9c066292636667bffa77418b9e4f369d30e394799"} Apr 16 20:07:52.444268 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:52.443932 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" Apr 16 20:07:52.445391 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:52.445361 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:07:52.458499 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:52.458443 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podStartSLOduration=6.458431045 podStartE2EDuration="6.458431045s" 
podCreationTimestamp="2026-04-16 20:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:07:52.458171436 +0000 UTC m=+836.872746284" watchObservedRunningTime="2026-04-16 20:07:52.458431045 +0000 UTC m=+836.873005892" Apr 16 20:07:53.448399 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:53.448365 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" Apr 16 20:07:53.448889 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:53.448535 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:07:53.449601 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:53.449561 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:07:54.452361 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:54.452322 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:07:54.452774 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:07:54.452750 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="agent" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Apr 16 20:08:04.452387 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:08:04.452340 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:08:04.452905 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:08:04.452878 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:08:14.452604 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:08:14.452520 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:08:14.453030 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:08:14.452986 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:08:24.452988 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:08:24.452932 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:08:24.453485 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:08:24.453414 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:08:34.452953 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:08:34.452888 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:08:34.453389 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:08:34.453363 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:08:44.453072 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:08:44.453021 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:08:44.453626 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:08:44.453552 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:08:54.452797 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:08:54.452744 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:08:54.453280 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:08:54.453237 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:08:56.089565 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:08:56.089531 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/2.log" Apr 16 20:08:56.092155 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:08:56.092132 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/2.log" Apr 16 20:09:04.452770 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:04.452741 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" Apr 16 20:09:04.453165 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:04.453028 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" Apr 16 20:09:11.568321 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:11.568280 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm"] Apr 16 20:09:11.568807 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:11.568626 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" 
containerID="cri-o://2a6bc439d19282677cc600d9c066292636667bffa77418b9e4f369d30e394799" gracePeriod=30 Apr 16 20:09:11.568807 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:11.568722 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="agent" containerID="cri-o://a35f3c0c0c2beb6b5dc3d35e81bb641852635c0cb52709bb0dc317c649866a9a" gracePeriod=30 Apr 16 20:09:11.582951 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:11.582919 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz"] Apr 16 20:09:11.586772 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:11.586751 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" Apr 16 20:09:11.593344 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:11.593310 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz"] Apr 16 20:09:11.740879 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:11.740834 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ee81949-ac46-4117-803f-7bddd653884c-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz\" (UID: \"5ee81949-ac46-4117-803f-7bddd653884c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" Apr 16 20:09:11.841926 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:11.841826 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ee81949-ac46-4117-803f-7bddd653884c-kserve-provision-location\") pod 
\"isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz\" (UID: \"5ee81949-ac46-4117-803f-7bddd653884c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" Apr 16 20:09:11.842269 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:11.842242 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ee81949-ac46-4117-803f-7bddd653884c-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz\" (UID: \"5ee81949-ac46-4117-803f-7bddd653884c\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" Apr 16 20:09:11.900375 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:11.900333 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" Apr 16 20:09:12.027670 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:12.027625 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz"] Apr 16 20:09:12.030731 ip-10-0-139-205 kubenswrapper[2569]: W0416 20:09:12.030702 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ee81949_ac46_4117_803f_7bddd653884c.slice/crio-97ebde78245bb58b9aeb1d81b49d0b55a5e42bca01c67c86165c7dd584ec0d25 WatchSource:0}: Error finding container 97ebde78245bb58b9aeb1d81b49d0b55a5e42bca01c67c86165c7dd584ec0d25: Status 404 returned error can't find the container with id 97ebde78245bb58b9aeb1d81b49d0b55a5e42bca01c67c86165c7dd584ec0d25 Apr 16 20:09:12.740921 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:12.740874 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" 
event={"ID":"5ee81949-ac46-4117-803f-7bddd653884c","Type":"ContainerStarted","Data":"3968dda923c14e6b5397585f0cd96ba77d49e86953ac99eb5d6b798a779a01a9"} Apr 16 20:09:12.740921 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:12.740923 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" event={"ID":"5ee81949-ac46-4117-803f-7bddd653884c","Type":"ContainerStarted","Data":"97ebde78245bb58b9aeb1d81b49d0b55a5e42bca01c67c86165c7dd584ec0d25"} Apr 16 20:09:14.453237 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:14.453187 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:09:14.453716 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:14.453490 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:09:16.756882 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:16.756845 2569 generic.go:358] "Generic (PLEG): container finished" podID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerID="2a6bc439d19282677cc600d9c066292636667bffa77418b9e4f369d30e394799" exitCode=0 Apr 16 20:09:16.757326 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:16.756917 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" event={"ID":"a51e5e49-b020-4bca-98aa-7f0c02d441ca","Type":"ContainerDied","Data":"2a6bc439d19282677cc600d9c066292636667bffa77418b9e4f369d30e394799"} Apr 16 20:09:16.758196 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:16.758174 2569 generic.go:358] "Generic 
(PLEG): container finished" podID="5ee81949-ac46-4117-803f-7bddd653884c" containerID="3968dda923c14e6b5397585f0cd96ba77d49e86953ac99eb5d6b798a779a01a9" exitCode=0 Apr 16 20:09:16.758283 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:16.758219 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" event={"ID":"5ee81949-ac46-4117-803f-7bddd653884c","Type":"ContainerDied","Data":"3968dda923c14e6b5397585f0cd96ba77d49e86953ac99eb5d6b798a779a01a9"} Apr 16 20:09:16.759278 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:16.759262 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:09:17.763162 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:17.763123 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" event={"ID":"5ee81949-ac46-4117-803f-7bddd653884c","Type":"ContainerStarted","Data":"7b3b74f3ff1b1a58cc76c563e2b723983d232393963c61c979e68589d635b183"} Apr 16 20:09:17.763633 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:17.763405 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" Apr 16 20:09:17.764829 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:17.764802 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:09:17.778092 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:17.778039 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podStartSLOduration=6.778023041 
podStartE2EDuration="6.778023041s" podCreationTimestamp="2026-04-16 20:09:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:09:17.776840475 +0000 UTC m=+922.191415326" watchObservedRunningTime="2026-04-16 20:09:17.778023041 +0000 UTC m=+922.192597889" Apr 16 20:09:18.767111 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:18.767063 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:09:24.452384 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:24.452331 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:09:24.452825 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:24.452698 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:09:28.767294 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:28.767244 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:09:34.452357 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:34.452300 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:09:34.452836 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:34.452451 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" Apr 16 20:09:34.452836 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:34.452620 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:09:34.452836 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:34.452775 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" Apr 16 20:09:38.768007 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:38.767961 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:09:41.721680 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.721654 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" Apr 16 20:09:41.808394 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.808352 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a51e5e49-b020-4bca-98aa-7f0c02d441ca-kserve-provision-location\") pod \"a51e5e49-b020-4bca-98aa-7f0c02d441ca\" (UID: \"a51e5e49-b020-4bca-98aa-7f0c02d441ca\") " Apr 16 20:09:41.808774 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.808752 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a51e5e49-b020-4bca-98aa-7f0c02d441ca-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a51e5e49-b020-4bca-98aa-7f0c02d441ca" (UID: "a51e5e49-b020-4bca-98aa-7f0c02d441ca"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:09:41.852115 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.852030 2569 generic.go:358] "Generic (PLEG): container finished" podID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerID="a35f3c0c0c2beb6b5dc3d35e81bb641852635c0cb52709bb0dc317c649866a9a" exitCode=0 Apr 16 20:09:41.852115 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.852089 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" event={"ID":"a51e5e49-b020-4bca-98aa-7f0c02d441ca","Type":"ContainerDied","Data":"a35f3c0c0c2beb6b5dc3d35e81bb641852635c0cb52709bb0dc317c649866a9a"} Apr 16 20:09:41.852115 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.852108 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" Apr 16 20:09:41.852115 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.852117 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm" event={"ID":"a51e5e49-b020-4bca-98aa-7f0c02d441ca","Type":"ContainerDied","Data":"8d480e009133011c53a79048406dac02d7f811ce3139c95b3758caaf4ebca0c9"} Apr 16 20:09:41.852365 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.852133 2569 scope.go:117] "RemoveContainer" containerID="a35f3c0c0c2beb6b5dc3d35e81bb641852635c0cb52709bb0dc317c649866a9a" Apr 16 20:09:41.860862 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.860838 2569 scope.go:117] "RemoveContainer" containerID="2a6bc439d19282677cc600d9c066292636667bffa77418b9e4f369d30e394799" Apr 16 20:09:41.868698 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.868674 2569 scope.go:117] "RemoveContainer" containerID="a1278b2df2e123856bcfc01d6374f8fdd4b1a6b56fefa3bc7ae7661b17ebd726" Apr 16 20:09:41.872617 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.872571 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm"] Apr 16 20:09:41.876019 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.875991 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-428b1-predictor-754c8b778b-8vjpm"] Apr 16 20:09:41.877586 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.877559 2569 scope.go:117] "RemoveContainer" containerID="a35f3c0c0c2beb6b5dc3d35e81bb641852635c0cb52709bb0dc317c649866a9a" Apr 16 20:09:41.877891 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:09:41.877869 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a35f3c0c0c2beb6b5dc3d35e81bb641852635c0cb52709bb0dc317c649866a9a\": container with ID starting with 
a35f3c0c0c2beb6b5dc3d35e81bb641852635c0cb52709bb0dc317c649866a9a not found: ID does not exist" containerID="a35f3c0c0c2beb6b5dc3d35e81bb641852635c0cb52709bb0dc317c649866a9a" Apr 16 20:09:41.877980 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.877906 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a35f3c0c0c2beb6b5dc3d35e81bb641852635c0cb52709bb0dc317c649866a9a"} err="failed to get container status \"a35f3c0c0c2beb6b5dc3d35e81bb641852635c0cb52709bb0dc317c649866a9a\": rpc error: code = NotFound desc = could not find container \"a35f3c0c0c2beb6b5dc3d35e81bb641852635c0cb52709bb0dc317c649866a9a\": container with ID starting with a35f3c0c0c2beb6b5dc3d35e81bb641852635c0cb52709bb0dc317c649866a9a not found: ID does not exist" Apr 16 20:09:41.877980 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.877934 2569 scope.go:117] "RemoveContainer" containerID="2a6bc439d19282677cc600d9c066292636667bffa77418b9e4f369d30e394799" Apr 16 20:09:41.878178 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:09:41.878157 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a6bc439d19282677cc600d9c066292636667bffa77418b9e4f369d30e394799\": container with ID starting with 2a6bc439d19282677cc600d9c066292636667bffa77418b9e4f369d30e394799 not found: ID does not exist" containerID="2a6bc439d19282677cc600d9c066292636667bffa77418b9e4f369d30e394799" Apr 16 20:09:41.878227 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.878188 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a6bc439d19282677cc600d9c066292636667bffa77418b9e4f369d30e394799"} err="failed to get container status \"2a6bc439d19282677cc600d9c066292636667bffa77418b9e4f369d30e394799\": rpc error: code = NotFound desc = could not find container \"2a6bc439d19282677cc600d9c066292636667bffa77418b9e4f369d30e394799\": container with ID starting with 
2a6bc439d19282677cc600d9c066292636667bffa77418b9e4f369d30e394799 not found: ID does not exist" Apr 16 20:09:41.878227 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.878219 2569 scope.go:117] "RemoveContainer" containerID="a1278b2df2e123856bcfc01d6374f8fdd4b1a6b56fefa3bc7ae7661b17ebd726" Apr 16 20:09:41.878409 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:09:41.878392 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1278b2df2e123856bcfc01d6374f8fdd4b1a6b56fefa3bc7ae7661b17ebd726\": container with ID starting with a1278b2df2e123856bcfc01d6374f8fdd4b1a6b56fefa3bc7ae7661b17ebd726 not found: ID does not exist" containerID="a1278b2df2e123856bcfc01d6374f8fdd4b1a6b56fefa3bc7ae7661b17ebd726" Apr 16 20:09:41.878461 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.878417 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1278b2df2e123856bcfc01d6374f8fdd4b1a6b56fefa3bc7ae7661b17ebd726"} err="failed to get container status \"a1278b2df2e123856bcfc01d6374f8fdd4b1a6b56fefa3bc7ae7661b17ebd726\": rpc error: code = NotFound desc = could not find container \"a1278b2df2e123856bcfc01d6374f8fdd4b1a6b56fefa3bc7ae7661b17ebd726\": container with ID starting with a1278b2df2e123856bcfc01d6374f8fdd4b1a6b56fefa3bc7ae7661b17ebd726 not found: ID does not exist" Apr 16 20:09:41.909967 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:41.909926 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a51e5e49-b020-4bca-98aa-7f0c02d441ca-kserve-provision-location\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:09:42.184122 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:42.184088 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" path="/var/lib/kubelet/pods/a51e5e49-b020-4bca-98aa-7f0c02d441ca/volumes" Apr 16 
20:09:48.767104 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:48.767057 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:09:58.767803 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:09:58.767752 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:10:08.768172 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:10:08.768117 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:10:18.767907 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:10:18.767849 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:10:24.179408 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:10:24.179361 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:10:34.179775 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:10:34.179728 
2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:10:44.179472 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:10:44.179429 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:10:54.179355 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:10:54.179314 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:11:04.179570 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:04.179527 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:11:14.179400 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:14.179357 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:11:24.179495 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:24.179451 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:11:25.179240 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:25.179195 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:11:35.180377 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:35.180338 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" Apr 16 20:11:41.762319 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:41.762282 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz"] Apr 16 20:11:41.762800 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:41.762566 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" containerID="cri-o://7b3b74f3ff1b1a58cc76c563e2b723983d232393963c61c979e68589d635b183" gracePeriod=30 Apr 16 20:11:41.842470 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:41.842431 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn"] Apr 16 20:11:41.842848 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:41.842835 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="storage-initializer" Apr 16 20:11:41.842900 ip-10-0-139-205 
kubenswrapper[2569]: I0416 20:11:41.842850 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="storage-initializer" Apr 16 20:11:41.842900 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:41.842860 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="agent" Apr 16 20:11:41.842900 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:41.842866 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="agent" Apr 16 20:11:41.842900 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:41.842877 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" Apr 16 20:11:41.842900 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:41.842883 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" Apr 16 20:11:41.843056 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:41.842944 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="agent" Apr 16 20:11:41.843056 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:41.842955 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a51e5e49-b020-4bca-98aa-7f0c02d441ca" containerName="kserve-container" Apr 16 20:11:41.846040 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:41.846021 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn"
Apr 16 20:11:41.852218 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:41.852184 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn"]
Apr 16 20:11:41.957750 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:41.957713 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/adb6f5cf-cb62-4983-9402-45b83d9ce69b-kserve-provision-location\") pod \"isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn\" (UID: \"adb6f5cf-cb62-4983-9402-45b83d9ce69b\") " pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn"
Apr 16 20:11:42.058824 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:42.058744 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/adb6f5cf-cb62-4983-9402-45b83d9ce69b-kserve-provision-location\") pod \"isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn\" (UID: \"adb6f5cf-cb62-4983-9402-45b83d9ce69b\") " pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn"
Apr 16 20:11:42.059150 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:42.059129 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/adb6f5cf-cb62-4983-9402-45b83d9ce69b-kserve-provision-location\") pod \"isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn\" (UID: \"adb6f5cf-cb62-4983-9402-45b83d9ce69b\") " pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn"
Apr 16 20:11:42.157705 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:42.157661 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn"
Apr 16 20:11:42.286114 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:42.286085 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn"]
Apr 16 20:11:42.288726 ip-10-0-139-205 kubenswrapper[2569]: W0416 20:11:42.288694 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadb6f5cf_cb62_4983_9402_45b83d9ce69b.slice/crio-b5db92be927114a8afaccba7d64600ca2ab3ba8d2103230904694c9614a7f6e8 WatchSource:0}: Error finding container b5db92be927114a8afaccba7d64600ca2ab3ba8d2103230904694c9614a7f6e8: Status 404 returned error can't find the container with id b5db92be927114a8afaccba7d64600ca2ab3ba8d2103230904694c9614a7f6e8
Apr 16 20:11:42.293291 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:42.293259 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" event={"ID":"adb6f5cf-cb62-4983-9402-45b83d9ce69b","Type":"ContainerStarted","Data":"b5db92be927114a8afaccba7d64600ca2ab3ba8d2103230904694c9614a7f6e8"}
Apr 16 20:11:43.298098 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:43.298058 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" event={"ID":"adb6f5cf-cb62-4983-9402-45b83d9ce69b","Type":"ContainerStarted","Data":"15bb9d8806acf5f15ebaae86dd2b040d261e7f53addbda40f4076525b8a61fc3"}
Apr 16 20:11:45.179348 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:45.179258 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 20:11:47.313703 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:47.313664 2569 generic.go:358] "Generic (PLEG): container finished" podID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerID="15bb9d8806acf5f15ebaae86dd2b040d261e7f53addbda40f4076525b8a61fc3" exitCode=0
Apr 16 20:11:47.314170 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:47.313743 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" event={"ID":"adb6f5cf-cb62-4983-9402-45b83d9ce69b","Type":"ContainerDied","Data":"15bb9d8806acf5f15ebaae86dd2b040d261e7f53addbda40f4076525b8a61fc3"}
Apr 16 20:11:48.319109 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:48.319073 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" event={"ID":"adb6f5cf-cb62-4983-9402-45b83d9ce69b","Type":"ContainerStarted","Data":"ff4ed90794dde3c2516480b969654ea4495c2f8d9e8bb0d45ea1f9fb9e1e09c8"}
Apr 16 20:11:48.319539 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:48.319356 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn"
Apr 16 20:11:48.320688 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:48.320659 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" podUID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 20:11:48.336018 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:48.335950 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" podStartSLOduration=7.335929734 podStartE2EDuration="7.335929734s" podCreationTimestamp="2026-04-16 20:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:48.335054882 +0000 UTC m=+1072.749629733" watchObservedRunningTime="2026-04-16 20:11:48.335929734 +0000 UTC m=+1072.750504585"
Apr 16 20:11:49.323130 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:49.323084 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" podUID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 20:11:51.315097 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.315073 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz"
Apr 16 20:11:51.331410 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.331368 2569 generic.go:358] "Generic (PLEG): container finished" podID="5ee81949-ac46-4117-803f-7bddd653884c" containerID="7b3b74f3ff1b1a58cc76c563e2b723983d232393963c61c979e68589d635b183" exitCode=0
Apr 16 20:11:51.331643 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.331484 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz"
Apr 16 20:11:51.331643 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.331538 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" event={"ID":"5ee81949-ac46-4117-803f-7bddd653884c","Type":"ContainerDied","Data":"7b3b74f3ff1b1a58cc76c563e2b723983d232393963c61c979e68589d635b183"}
Apr 16 20:11:51.331643 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.331569 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz" event={"ID":"5ee81949-ac46-4117-803f-7bddd653884c","Type":"ContainerDied","Data":"97ebde78245bb58b9aeb1d81b49d0b55a5e42bca01c67c86165c7dd584ec0d25"}
Apr 16 20:11:51.331643 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.331613 2569 scope.go:117] "RemoveContainer" containerID="7b3b74f3ff1b1a58cc76c563e2b723983d232393963c61c979e68589d635b183"
Apr 16 20:11:51.339935 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.339905 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ee81949-ac46-4117-803f-7bddd653884c-kserve-provision-location\") pod \"5ee81949-ac46-4117-803f-7bddd653884c\" (UID: \"5ee81949-ac46-4117-803f-7bddd653884c\") "
Apr 16 20:11:51.340276 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.340254 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ee81949-ac46-4117-803f-7bddd653884c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5ee81949-ac46-4117-803f-7bddd653884c" (UID: "5ee81949-ac46-4117-803f-7bddd653884c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:11:51.343067 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.343046 2569 scope.go:117] "RemoveContainer" containerID="3968dda923c14e6b5397585f0cd96ba77d49e86953ac99eb5d6b798a779a01a9"
Apr 16 20:11:51.351358 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.351337 2569 scope.go:117] "RemoveContainer" containerID="7b3b74f3ff1b1a58cc76c563e2b723983d232393963c61c979e68589d635b183"
Apr 16 20:11:51.351707 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:11:51.351677 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b3b74f3ff1b1a58cc76c563e2b723983d232393963c61c979e68589d635b183\": container with ID starting with 7b3b74f3ff1b1a58cc76c563e2b723983d232393963c61c979e68589d635b183 not found: ID does not exist" containerID="7b3b74f3ff1b1a58cc76c563e2b723983d232393963c61c979e68589d635b183"
Apr 16 20:11:51.351828 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.351716 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b3b74f3ff1b1a58cc76c563e2b723983d232393963c61c979e68589d635b183"} err="failed to get container status \"7b3b74f3ff1b1a58cc76c563e2b723983d232393963c61c979e68589d635b183\": rpc error: code = NotFound desc = could not find container \"7b3b74f3ff1b1a58cc76c563e2b723983d232393963c61c979e68589d635b183\": container with ID starting with 7b3b74f3ff1b1a58cc76c563e2b723983d232393963c61c979e68589d635b183 not found: ID does not exist"
Apr 16 20:11:51.351828 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.351739 2569 scope.go:117] "RemoveContainer" containerID="3968dda923c14e6b5397585f0cd96ba77d49e86953ac99eb5d6b798a779a01a9"
Apr 16 20:11:51.352004 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:11:51.351984 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3968dda923c14e6b5397585f0cd96ba77d49e86953ac99eb5d6b798a779a01a9\": container with ID starting with 3968dda923c14e6b5397585f0cd96ba77d49e86953ac99eb5d6b798a779a01a9 not found: ID does not exist" containerID="3968dda923c14e6b5397585f0cd96ba77d49e86953ac99eb5d6b798a779a01a9"
Apr 16 20:11:51.352071 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.352009 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3968dda923c14e6b5397585f0cd96ba77d49e86953ac99eb5d6b798a779a01a9"} err="failed to get container status \"3968dda923c14e6b5397585f0cd96ba77d49e86953ac99eb5d6b798a779a01a9\": rpc error: code = NotFound desc = could not find container \"3968dda923c14e6b5397585f0cd96ba77d49e86953ac99eb5d6b798a779a01a9\": container with ID starting with 3968dda923c14e6b5397585f0cd96ba77d49e86953ac99eb5d6b798a779a01a9 not found: ID does not exist"
Apr 16 20:11:51.440809 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.440769 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ee81949-ac46-4117-803f-7bddd653884c-kserve-provision-location\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 20:11:51.662093 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.662040 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz"]
Apr 16 20:11:51.664254 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:51.664214 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9f1b7-predictor-854f955cf6-r49gz"]
Apr 16 20:11:52.184490 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:52.184446 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ee81949-ac46-4117-803f-7bddd653884c" path="/var/lib/kubelet/pods/5ee81949-ac46-4117-803f-7bddd653884c/volumes"
Apr 16 20:11:59.323565 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:11:59.323509 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" podUID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 20:12:09.324069 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:12:09.324023 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" podUID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 20:12:19.323082 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:12:19.323037 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" podUID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 20:12:29.323995 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:12:29.323950 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" podUID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 20:12:39.323863 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:12:39.323816 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" podUID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 20:12:49.323953 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:12:49.323902 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" podUID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 20:12:59.324712 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:12:59.324678 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn"
Apr 16 20:13:02.025549 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.025507 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"]
Apr 16 20:13:02.026028 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.025910 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container"
Apr 16 20:13:02.026028 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.025922 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container"
Apr 16 20:13:02.026028 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.025937 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="storage-initializer"
Apr 16 20:13:02.026028 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.025944 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="storage-initializer"
Apr 16 20:13:02.026028 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.026005 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ee81949-ac46-4117-803f-7bddd653884c" containerName="kserve-container"
Apr 16 20:13:02.029178 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.029156 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"
Apr 16 20:13:02.031052 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.031023 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-ea9dbb\""
Apr 16 20:13:02.031185 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.031023 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 16 20:13:02.031185 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.031059 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-ea9dbb-dockercfg-hp2gp\""
Apr 16 20:13:02.037533 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.037512 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"]
Apr 16 20:13:02.146786 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.146743 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6a3ef60-e456-4952-b71d-0e6c04dc39b4-kserve-provision-location\") pod \"isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7\" (UID: \"d6a3ef60-e456-4952-b71d-0e6c04dc39b4\") " pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"
Apr 16 20:13:02.146963 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.146844 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d6a3ef60-e456-4952-b71d-0e6c04dc39b4-cabundle-cert\") pod \"isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7\" (UID: \"d6a3ef60-e456-4952-b71d-0e6c04dc39b4\") " pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"
Apr 16 20:13:02.248362 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.248323 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6a3ef60-e456-4952-b71d-0e6c04dc39b4-kserve-provision-location\") pod \"isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7\" (UID: \"d6a3ef60-e456-4952-b71d-0e6c04dc39b4\") " pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"
Apr 16 20:13:02.248552 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.248408 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d6a3ef60-e456-4952-b71d-0e6c04dc39b4-cabundle-cert\") pod \"isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7\" (UID: \"d6a3ef60-e456-4952-b71d-0e6c04dc39b4\") " pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"
Apr 16 20:13:02.248782 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.248760 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6a3ef60-e456-4952-b71d-0e6c04dc39b4-kserve-provision-location\") pod \"isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7\" (UID: \"d6a3ef60-e456-4952-b71d-0e6c04dc39b4\") " pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"
Apr 16 20:13:02.249077 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.249056 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d6a3ef60-e456-4952-b71d-0e6c04dc39b4-cabundle-cert\") pod \"isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7\" (UID: \"d6a3ef60-e456-4952-b71d-0e6c04dc39b4\") " pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"
Apr 16 20:13:02.341468 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.341393 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"
Apr 16 20:13:02.474886 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.474855 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"]
Apr 16 20:13:02.477179 ip-10-0-139-205 kubenswrapper[2569]: W0416 20:13:02.477146 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6a3ef60_e456_4952_b71d_0e6c04dc39b4.slice/crio-85a3d676e8ef438db35861123b57cfe52b65138a4599383db84e3c40e1fb0a4c WatchSource:0}: Error finding container 85a3d676e8ef438db35861123b57cfe52b65138a4599383db84e3c40e1fb0a4c: Status 404 returned error can't find the container with id 85a3d676e8ef438db35861123b57cfe52b65138a4599383db84e3c40e1fb0a4c
Apr 16 20:13:02.585972 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.585926 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7" event={"ID":"d6a3ef60-e456-4952-b71d-0e6c04dc39b4","Type":"ContainerStarted","Data":"8479770202e45dde0d438c6e4bf960e464369d4ea06761e3a453d4cf49155a5b"}
Apr 16 20:13:02.585972 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:02.585971 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7" event={"ID":"d6a3ef60-e456-4952-b71d-0e6c04dc39b4","Type":"ContainerStarted","Data":"85a3d676e8ef438db35861123b57cfe52b65138a4599383db84e3c40e1fb0a4c"}
Apr 16 20:13:08.607893 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:08.607867 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7_d6a3ef60-e456-4952-b71d-0e6c04dc39b4/storage-initializer/0.log"
Apr 16 20:13:08.608295 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:08.607908 2569 generic.go:358] "Generic (PLEG): container finished" podID="d6a3ef60-e456-4952-b71d-0e6c04dc39b4" containerID="8479770202e45dde0d438c6e4bf960e464369d4ea06761e3a453d4cf49155a5b" exitCode=1
Apr 16 20:13:08.608295 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:08.607991 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7" event={"ID":"d6a3ef60-e456-4952-b71d-0e6c04dc39b4","Type":"ContainerDied","Data":"8479770202e45dde0d438c6e4bf960e464369d4ea06761e3a453d4cf49155a5b"}
Apr 16 20:13:09.613083 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:09.613053 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7_d6a3ef60-e456-4952-b71d-0e6c04dc39b4/storage-initializer/0.log"
Apr 16 20:13:09.613560 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:09.613179 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7" event={"ID":"d6a3ef60-e456-4952-b71d-0e6c04dc39b4","Type":"ContainerStarted","Data":"8dce586aa8b102fdaba74dcb442af1d2cbb1420826160083e25eeb202bb76a34"}
Apr 16 20:13:13.628006 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:13.627977 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7_d6a3ef60-e456-4952-b71d-0e6c04dc39b4/storage-initializer/1.log"
Apr 16 20:13:13.628402 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:13.628364 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7_d6a3ef60-e456-4952-b71d-0e6c04dc39b4/storage-initializer/0.log"
Apr 16 20:13:13.628447 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:13.628400 2569 generic.go:358] "Generic (PLEG): container finished" podID="d6a3ef60-e456-4952-b71d-0e6c04dc39b4" containerID="8dce586aa8b102fdaba74dcb442af1d2cbb1420826160083e25eeb202bb76a34" exitCode=1
Apr 16 20:13:13.628502 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:13.628476 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7" event={"ID":"d6a3ef60-e456-4952-b71d-0e6c04dc39b4","Type":"ContainerDied","Data":"8dce586aa8b102fdaba74dcb442af1d2cbb1420826160083e25eeb202bb76a34"}
Apr 16 20:13:13.628538 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:13.628526 2569 scope.go:117] "RemoveContainer" containerID="8479770202e45dde0d438c6e4bf960e464369d4ea06761e3a453d4cf49155a5b"
Apr 16 20:13:13.628975 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:13.628960 2569 scope.go:117] "RemoveContainer" containerID="8479770202e45dde0d438c6e4bf960e464369d4ea06761e3a453d4cf49155a5b"
Apr 16 20:13:13.639802 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:13:13.639764 2569 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7_kserve-ci-e2e-test_d6a3ef60-e456-4952-b71d-0e6c04dc39b4_0 in pod sandbox 85a3d676e8ef438db35861123b57cfe52b65138a4599383db84e3c40e1fb0a4c from index: no such id: '8479770202e45dde0d438c6e4bf960e464369d4ea06761e3a453d4cf49155a5b'" containerID="8479770202e45dde0d438c6e4bf960e464369d4ea06761e3a453d4cf49155a5b"
Apr 16 20:13:13.639872 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:13.639818 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8479770202e45dde0d438c6e4bf960e464369d4ea06761e3a453d4cf49155a5b"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7_kserve-ci-e2e-test_d6a3ef60-e456-4952-b71d-0e6c04dc39b4_0 in pod sandbox 85a3d676e8ef438db35861123b57cfe52b65138a4599383db84e3c40e1fb0a4c from index: no such id: '8479770202e45dde0d438c6e4bf960e464369d4ea06761e3a453d4cf49155a5b'"
Apr 16 20:13:13.640080 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:13:13.640060 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7_kserve-ci-e2e-test(d6a3ef60-e456-4952-b71d-0e6c04dc39b4)\"" pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7" podUID="d6a3ef60-e456-4952-b71d-0e6c04dc39b4"
Apr 16 20:13:14.633946 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:14.633874 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7_d6a3ef60-e456-4952-b71d-0e6c04dc39b4/storage-initializer/1.log"
Apr 16 20:13:20.122562 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.122524 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"]
Apr 16 20:13:20.177596 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.177547 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn"]
Apr 16 20:13:20.177869 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.177844 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" podUID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerName="kserve-container" containerID="cri-o://ff4ed90794dde3c2516480b969654ea4495c2f8d9e8bb0d45ea1f9fb9e1e09c8" gracePeriod=30
Apr 16 20:13:20.264530 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.264495 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67"]
Apr 16 20:13:20.268719 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.268691 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7_d6a3ef60-e456-4952-b71d-0e6c04dc39b4/storage-initializer/1.log"
Apr 16 20:13:20.268871 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.268764 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"
Apr 16 20:13:20.270137 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.270118 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67"
Apr 16 20:13:20.271947 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.271921 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-2519ca-dockercfg-5ckwt\""
Apr 16 20:13:20.272741 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.272715 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-2519ca\""
Apr 16 20:13:20.273622 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.273567 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67"]
Apr 16 20:13:20.413864 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.413823 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d6a3ef60-e456-4952-b71d-0e6c04dc39b4-cabundle-cert\") pod \"d6a3ef60-e456-4952-b71d-0e6c04dc39b4\" (UID: \"d6a3ef60-e456-4952-b71d-0e6c04dc39b4\") "
Apr 16 20:13:20.414062 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.413888 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6a3ef60-e456-4952-b71d-0e6c04dc39b4-kserve-provision-location\") pod \"d6a3ef60-e456-4952-b71d-0e6c04dc39b4\" (UID: \"d6a3ef60-e456-4952-b71d-0e6c04dc39b4\") "
Apr 16 20:13:20.414062 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.414006 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2408651-840a-4a52-a4f1-ab7e049dfa2f-kserve-provision-location\") pod \"isvc-init-fail-2519ca-predictor-bc7f75588-pdl67\" (UID: \"b2408651-840a-4a52-a4f1-ab7e049dfa2f\") " pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67"
Apr 16 20:13:20.414062 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.414051 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b2408651-840a-4a52-a4f1-ab7e049dfa2f-cabundle-cert\") pod \"isvc-init-fail-2519ca-predictor-bc7f75588-pdl67\" (UID: \"b2408651-840a-4a52-a4f1-ab7e049dfa2f\") " pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67"
Apr 16 20:13:20.414288 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.414200 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6a3ef60-e456-4952-b71d-0e6c04dc39b4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d6a3ef60-e456-4952-b71d-0e6c04dc39b4" (UID: "d6a3ef60-e456-4952-b71d-0e6c04dc39b4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:13:20.414288 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.414225 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6a3ef60-e456-4952-b71d-0e6c04dc39b4-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "d6a3ef60-e456-4952-b71d-0e6c04dc39b4" (UID: "d6a3ef60-e456-4952-b71d-0e6c04dc39b4"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:13:20.515466 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.515415 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2408651-840a-4a52-a4f1-ab7e049dfa2f-kserve-provision-location\") pod \"isvc-init-fail-2519ca-predictor-bc7f75588-pdl67\" (UID: \"b2408651-840a-4a52-a4f1-ab7e049dfa2f\") " pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67"
Apr 16 20:13:20.515705 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.515509 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b2408651-840a-4a52-a4f1-ab7e049dfa2f-cabundle-cert\") pod \"isvc-init-fail-2519ca-predictor-bc7f75588-pdl67\" (UID: \"b2408651-840a-4a52-a4f1-ab7e049dfa2f\") " pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67"
Apr 16 20:13:20.515705 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.515616 2569 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d6a3ef60-e456-4952-b71d-0e6c04dc39b4-cabundle-cert\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 20:13:20.515705 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.515633 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d6a3ef60-e456-4952-b71d-0e6c04dc39b4-kserve-provision-location\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\""
Apr 16 20:13:20.515872 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.515848 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2408651-840a-4a52-a4f1-ab7e049dfa2f-kserve-provision-location\") pod \"isvc-init-fail-2519ca-predictor-bc7f75588-pdl67\" (UID: \"b2408651-840a-4a52-a4f1-ab7e049dfa2f\") " pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67"
Apr 16 20:13:20.516161 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.516142 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b2408651-840a-4a52-a4f1-ab7e049dfa2f-cabundle-cert\") pod \"isvc-init-fail-2519ca-predictor-bc7f75588-pdl67\" (UID: \"b2408651-840a-4a52-a4f1-ab7e049dfa2f\") " pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67"
Apr 16 20:13:20.582321 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.582287 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67"
Apr 16 20:13:20.657719 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.657690 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7_d6a3ef60-e456-4952-b71d-0e6c04dc39b4/storage-initializer/1.log"
Apr 16 20:13:20.657900 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.657788 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7" event={"ID":"d6a3ef60-e456-4952-b71d-0e6c04dc39b4","Type":"ContainerDied","Data":"85a3d676e8ef438db35861123b57cfe52b65138a4599383db84e3c40e1fb0a4c"}
Apr 16 20:13:20.657900 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.657817 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"
Apr 16 20:13:20.657900 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.657831 2569 scope.go:117] "RemoveContainer" containerID="8dce586aa8b102fdaba74dcb442af1d2cbb1420826160083e25eeb202bb76a34"
Apr 16 20:13:20.693731 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.693694 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"]
Apr 16 20:13:20.698310 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.698273 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-ea9dbb-predictor-647dbdd946-trtm7"]
Apr 16 20:13:20.717338 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:20.717305 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67"]
Apr 16 20:13:20.720679 ip-10-0-139-205 kubenswrapper[2569]: W0416 20:13:20.720648 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2408651_840a_4a52_a4f1_ab7e049dfa2f.slice/crio-daab3d16219ff6eb650cc05606af93a83265efe2530132cce30aea53001c4f24 WatchSource:0}: Error finding container daab3d16219ff6eb650cc05606af93a83265efe2530132cce30aea53001c4f24: Status 404 returned error can't find the container with id daab3d16219ff6eb650cc05606af93a83265efe2530132cce30aea53001c4f24
Apr 16 20:13:21.663587 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:21.663539 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67" event={"ID":"b2408651-840a-4a52-a4f1-ab7e049dfa2f","Type":"ContainerStarted","Data":"197b8749d98f6d22fb9b9ff1a48a6e53eea62209588b9a59bd50df0640393ec9"}
Apr 16 20:13:21.663587 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:21.663587 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67" event={"ID":"b2408651-840a-4a52-a4f1-ab7e049dfa2f","Type":"ContainerStarted","Data":"daab3d16219ff6eb650cc05606af93a83265efe2530132cce30aea53001c4f24"}
Apr 16 20:13:22.184422 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:22.184384 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a3ef60-e456-4952-b71d-0e6c04dc39b4" path="/var/lib/kubelet/pods/d6a3ef60-e456-4952-b71d-0e6c04dc39b4/volumes"
Apr 16 20:13:24.825643 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:24.825615 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn"
Apr 16 20:13:24.954461 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:24.954424 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/adb6f5cf-cb62-4983-9402-45b83d9ce69b-kserve-provision-location\") pod \"adb6f5cf-cb62-4983-9402-45b83d9ce69b\" (UID: \"adb6f5cf-cb62-4983-9402-45b83d9ce69b\") "
Apr 16 20:13:24.954803 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:24.954803 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adb6f5cf-cb62-4983-9402-45b83d9ce69b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "adb6f5cf-cb62-4983-9402-45b83d9ce69b" (UID: "adb6f5cf-cb62-4983-9402-45b83d9ce69b"). InnerVolumeSpecName "kserve-provision-location".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:13:25.056025 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:25.055982 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/adb6f5cf-cb62-4983-9402-45b83d9ce69b-kserve-provision-location\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:13:25.678182 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:25.678145 2569 generic.go:358] "Generic (PLEG): container finished" podID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerID="ff4ed90794dde3c2516480b969654ea4495c2f8d9e8bb0d45ea1f9fb9e1e09c8" exitCode=0 Apr 16 20:13:25.678417 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:25.678222 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" Apr 16 20:13:25.678417 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:25.678234 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" event={"ID":"adb6f5cf-cb62-4983-9402-45b83d9ce69b","Type":"ContainerDied","Data":"ff4ed90794dde3c2516480b969654ea4495c2f8d9e8bb0d45ea1f9fb9e1e09c8"} Apr 16 20:13:25.678417 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:25.678281 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn" event={"ID":"adb6f5cf-cb62-4983-9402-45b83d9ce69b","Type":"ContainerDied","Data":"b5db92be927114a8afaccba7d64600ca2ab3ba8d2103230904694c9614a7f6e8"} Apr 16 20:13:25.678417 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:25.678305 2569 scope.go:117] "RemoveContainer" containerID="ff4ed90794dde3c2516480b969654ea4495c2f8d9e8bb0d45ea1f9fb9e1e09c8" Apr 16 20:13:25.686672 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:25.686650 2569 scope.go:117] "RemoveContainer" 
containerID="15bb9d8806acf5f15ebaae86dd2b040d261e7f53addbda40f4076525b8a61fc3" Apr 16 20:13:25.694610 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:25.694559 2569 scope.go:117] "RemoveContainer" containerID="ff4ed90794dde3c2516480b969654ea4495c2f8d9e8bb0d45ea1f9fb9e1e09c8" Apr 16 20:13:25.694940 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:13:25.694919 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff4ed90794dde3c2516480b969654ea4495c2f8d9e8bb0d45ea1f9fb9e1e09c8\": container with ID starting with ff4ed90794dde3c2516480b969654ea4495c2f8d9e8bb0d45ea1f9fb9e1e09c8 not found: ID does not exist" containerID="ff4ed90794dde3c2516480b969654ea4495c2f8d9e8bb0d45ea1f9fb9e1e09c8" Apr 16 20:13:25.695022 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:25.694953 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4ed90794dde3c2516480b969654ea4495c2f8d9e8bb0d45ea1f9fb9e1e09c8"} err="failed to get container status \"ff4ed90794dde3c2516480b969654ea4495c2f8d9e8bb0d45ea1f9fb9e1e09c8\": rpc error: code = NotFound desc = could not find container \"ff4ed90794dde3c2516480b969654ea4495c2f8d9e8bb0d45ea1f9fb9e1e09c8\": container with ID starting with ff4ed90794dde3c2516480b969654ea4495c2f8d9e8bb0d45ea1f9fb9e1e09c8 not found: ID does not exist" Apr 16 20:13:25.695022 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:25.694985 2569 scope.go:117] "RemoveContainer" containerID="15bb9d8806acf5f15ebaae86dd2b040d261e7f53addbda40f4076525b8a61fc3" Apr 16 20:13:25.695259 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:13:25.695242 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15bb9d8806acf5f15ebaae86dd2b040d261e7f53addbda40f4076525b8a61fc3\": container with ID starting with 15bb9d8806acf5f15ebaae86dd2b040d261e7f53addbda40f4076525b8a61fc3 not found: ID does not exist" 
containerID="15bb9d8806acf5f15ebaae86dd2b040d261e7f53addbda40f4076525b8a61fc3" Apr 16 20:13:25.695316 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:25.695266 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15bb9d8806acf5f15ebaae86dd2b040d261e7f53addbda40f4076525b8a61fc3"} err="failed to get container status \"15bb9d8806acf5f15ebaae86dd2b040d261e7f53addbda40f4076525b8a61fc3\": rpc error: code = NotFound desc = could not find container \"15bb9d8806acf5f15ebaae86dd2b040d261e7f53addbda40f4076525b8a61fc3\": container with ID starting with 15bb9d8806acf5f15ebaae86dd2b040d261e7f53addbda40f4076525b8a61fc3 not found: ID does not exist" Apr 16 20:13:25.701684 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:25.701641 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn"] Apr 16 20:13:25.707360 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:25.707329 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-ea9dbb-predictor-5b757bfb59-xw9mn"] Apr 16 20:13:26.184645 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:26.184613 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" path="/var/lib/kubelet/pods/adb6f5cf-cb62-4983-9402-45b83d9ce69b/volumes" Apr 16 20:13:26.684312 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:26.684278 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-2519ca-predictor-bc7f75588-pdl67_b2408651-840a-4a52-a4f1-ab7e049dfa2f/storage-initializer/0.log" Apr 16 20:13:26.684475 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:26.684318 2569 generic.go:358] "Generic (PLEG): container finished" podID="b2408651-840a-4a52-a4f1-ab7e049dfa2f" containerID="197b8749d98f6d22fb9b9ff1a48a6e53eea62209588b9a59bd50df0640393ec9" exitCode=1 Apr 16 20:13:26.684475 ip-10-0-139-205 kubenswrapper[2569]: I0416 
20:13:26.684374 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67" event={"ID":"b2408651-840a-4a52-a4f1-ab7e049dfa2f","Type":"ContainerDied","Data":"197b8749d98f6d22fb9b9ff1a48a6e53eea62209588b9a59bd50df0640393ec9"} Apr 16 20:13:27.690023 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:27.689995 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-2519ca-predictor-bc7f75588-pdl67_b2408651-840a-4a52-a4f1-ab7e049dfa2f/storage-initializer/0.log" Apr 16 20:13:27.690407 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:27.690064 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67" event={"ID":"b2408651-840a-4a52-a4f1-ab7e049dfa2f","Type":"ContainerStarted","Data":"9100b30ab1fc2ea632c817f05aba79e40914090c0f2c563857b3913e37f3529b"} Apr 16 20:13:30.288875 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.288830 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67"] Apr 16 20:13:30.289387 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.289204 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67" podUID="b2408651-840a-4a52-a4f1-ab7e049dfa2f" containerName="storage-initializer" containerID="cri-o://9100b30ab1fc2ea632c817f05aba79e40914090c0f2c563857b3913e37f3529b" gracePeriod=30 Apr 16 20:13:30.392055 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.392021 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw"] Apr 16 20:13:30.392449 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.392435 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6a3ef60-e456-4952-b71d-0e6c04dc39b4" 
containerName="storage-initializer" Apr 16 20:13:30.392494 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.392451 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a3ef60-e456-4952-b71d-0e6c04dc39b4" containerName="storage-initializer" Apr 16 20:13:30.392494 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.392459 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerName="storage-initializer" Apr 16 20:13:30.392494 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.392466 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerName="storage-initializer" Apr 16 20:13:30.392494 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.392473 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerName="kserve-container" Apr 16 20:13:30.392494 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.392478 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerName="kserve-container" Apr 16 20:13:30.392494 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.392489 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6a3ef60-e456-4952-b71d-0e6c04dc39b4" containerName="storage-initializer" Apr 16 20:13:30.392494 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.392494 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a3ef60-e456-4952-b71d-0e6c04dc39b4" containerName="storage-initializer" Apr 16 20:13:30.392730 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.392548 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="adb6f5cf-cb62-4983-9402-45b83d9ce69b" containerName="kserve-container" Apr 16 20:13:30.392730 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.392558 2569 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="d6a3ef60-e456-4952-b71d-0e6c04dc39b4" containerName="storage-initializer" Apr 16 20:13:30.392730 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.392566 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6a3ef60-e456-4952-b71d-0e6c04dc39b4" containerName="storage-initializer" Apr 16 20:13:30.396012 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.395982 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" Apr 16 20:13:30.397921 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.397888 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-g6xwp\"" Apr 16 20:13:30.403710 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.403675 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw"] Apr 16 20:13:30.502365 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.502319 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab312f2b-0b40-4fbe-871b-7124abc8a68d-kserve-provision-location\") pod \"raw-sklearn-05c63-predictor-695786cf5d-6qhgw\" (UID: \"ab312f2b-0b40-4fbe-871b-7124abc8a68d\") " pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" Apr 16 20:13:30.565607 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.565554 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-2519ca-predictor-bc7f75588-pdl67_b2408651-840a-4a52-a4f1-ab7e049dfa2f/storage-initializer/1.log" Apr 16 20:13:30.565985 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.565968 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-2519ca-predictor-bc7f75588-pdl67_b2408651-840a-4a52-a4f1-ab7e049dfa2f/storage-initializer/0.log" 
Apr 16 20:13:30.566053 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.566038 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67" Apr 16 20:13:30.603913 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.603864 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab312f2b-0b40-4fbe-871b-7124abc8a68d-kserve-provision-location\") pod \"raw-sklearn-05c63-predictor-695786cf5d-6qhgw\" (UID: \"ab312f2b-0b40-4fbe-871b-7124abc8a68d\") " pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" Apr 16 20:13:30.604297 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.604270 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab312f2b-0b40-4fbe-871b-7124abc8a68d-kserve-provision-location\") pod \"raw-sklearn-05c63-predictor-695786cf5d-6qhgw\" (UID: \"ab312f2b-0b40-4fbe-871b-7124abc8a68d\") " pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" Apr 16 20:13:30.702137 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.702107 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-2519ca-predictor-bc7f75588-pdl67_b2408651-840a-4a52-a4f1-ab7e049dfa2f/storage-initializer/1.log" Apr 16 20:13:30.702484 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.702467 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-2519ca-predictor-bc7f75588-pdl67_b2408651-840a-4a52-a4f1-ab7e049dfa2f/storage-initializer/0.log" Apr 16 20:13:30.702543 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.702504 2569 generic.go:358] "Generic (PLEG): container finished" podID="b2408651-840a-4a52-a4f1-ab7e049dfa2f" containerID="9100b30ab1fc2ea632c817f05aba79e40914090c0f2c563857b3913e37f3529b" exitCode=1 
Apr 16 20:13:30.702615 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.702548 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67" event={"ID":"b2408651-840a-4a52-a4f1-ab7e049dfa2f","Type":"ContainerDied","Data":"9100b30ab1fc2ea632c817f05aba79e40914090c0f2c563857b3913e37f3529b"} Apr 16 20:13:30.702615 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.702606 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67" Apr 16 20:13:30.702698 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.702620 2569 scope.go:117] "RemoveContainer" containerID="9100b30ab1fc2ea632c817f05aba79e40914090c0f2c563857b3913e37f3529b" Apr 16 20:13:30.702743 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.702572 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67" event={"ID":"b2408651-840a-4a52-a4f1-ab7e049dfa2f","Type":"ContainerDied","Data":"daab3d16219ff6eb650cc05606af93a83265efe2530132cce30aea53001c4f24"} Apr 16 20:13:30.704774 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.704755 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2408651-840a-4a52-a4f1-ab7e049dfa2f-kserve-provision-location\") pod \"b2408651-840a-4a52-a4f1-ab7e049dfa2f\" (UID: \"b2408651-840a-4a52-a4f1-ab7e049dfa2f\") " Apr 16 20:13:30.704909 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.704867 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b2408651-840a-4a52-a4f1-ab7e049dfa2f-cabundle-cert\") pod \"b2408651-840a-4a52-a4f1-ab7e049dfa2f\" (UID: \"b2408651-840a-4a52-a4f1-ab7e049dfa2f\") " Apr 16 20:13:30.705026 ip-10-0-139-205 kubenswrapper[2569]: I0416 
20:13:30.705000 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2408651-840a-4a52-a4f1-ab7e049dfa2f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b2408651-840a-4a52-a4f1-ab7e049dfa2f" (UID: "b2408651-840a-4a52-a4f1-ab7e049dfa2f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:13:30.705265 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.705242 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2408651-840a-4a52-a4f1-ab7e049dfa2f-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "b2408651-840a-4a52-a4f1-ab7e049dfa2f" (UID: "b2408651-840a-4a52-a4f1-ab7e049dfa2f"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:13:30.713705 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.713670 2569 scope.go:117] "RemoveContainer" containerID="197b8749d98f6d22fb9b9ff1a48a6e53eea62209588b9a59bd50df0640393ec9" Apr 16 20:13:30.722157 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.722130 2569 scope.go:117] "RemoveContainer" containerID="9100b30ab1fc2ea632c817f05aba79e40914090c0f2c563857b3913e37f3529b" Apr 16 20:13:30.722495 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:13:30.722474 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9100b30ab1fc2ea632c817f05aba79e40914090c0f2c563857b3913e37f3529b\": container with ID starting with 9100b30ab1fc2ea632c817f05aba79e40914090c0f2c563857b3913e37f3529b not found: ID does not exist" containerID="9100b30ab1fc2ea632c817f05aba79e40914090c0f2c563857b3913e37f3529b" Apr 16 20:13:30.722558 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.722508 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9100b30ab1fc2ea632c817f05aba79e40914090c0f2c563857b3913e37f3529b"} err="failed to get container status \"9100b30ab1fc2ea632c817f05aba79e40914090c0f2c563857b3913e37f3529b\": rpc error: code = NotFound desc = could not find container \"9100b30ab1fc2ea632c817f05aba79e40914090c0f2c563857b3913e37f3529b\": container with ID starting with 9100b30ab1fc2ea632c817f05aba79e40914090c0f2c563857b3913e37f3529b not found: ID does not exist" Apr 16 20:13:30.722558 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.722530 2569 scope.go:117] "RemoveContainer" containerID="197b8749d98f6d22fb9b9ff1a48a6e53eea62209588b9a59bd50df0640393ec9" Apr 16 20:13:30.722846 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:13:30.722827 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"197b8749d98f6d22fb9b9ff1a48a6e53eea62209588b9a59bd50df0640393ec9\": container with ID starting with 197b8749d98f6d22fb9b9ff1a48a6e53eea62209588b9a59bd50df0640393ec9 not found: ID does not exist" containerID="197b8749d98f6d22fb9b9ff1a48a6e53eea62209588b9a59bd50df0640393ec9" Apr 16 20:13:30.722915 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.722850 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"197b8749d98f6d22fb9b9ff1a48a6e53eea62209588b9a59bd50df0640393ec9"} err="failed to get container status \"197b8749d98f6d22fb9b9ff1a48a6e53eea62209588b9a59bd50df0640393ec9\": rpc error: code = NotFound desc = could not find container \"197b8749d98f6d22fb9b9ff1a48a6e53eea62209588b9a59bd50df0640393ec9\": container with ID starting with 197b8749d98f6d22fb9b9ff1a48a6e53eea62209588b9a59bd50df0640393ec9 not found: ID does not exist" Apr 16 20:13:30.737034 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.737000 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" Apr 16 20:13:30.806008 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.805925 2569 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b2408651-840a-4a52-a4f1-ab7e049dfa2f-cabundle-cert\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:13:30.806008 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.805958 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2408651-840a-4a52-a4f1-ab7e049dfa2f-kserve-provision-location\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:13:30.868727 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:30.868668 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw"] Apr 16 20:13:30.870950 ip-10-0-139-205 kubenswrapper[2569]: W0416 20:13:30.870916 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab312f2b_0b40_4fbe_871b_7124abc8a68d.slice/crio-a69c7cba378d445aa474b23a2f19421ce202e927b4e0063db113df1c15b8c6bb WatchSource:0}: Error finding container a69c7cba378d445aa474b23a2f19421ce202e927b4e0063db113df1c15b8c6bb: Status 404 returned error can't find the container with id a69c7cba378d445aa474b23a2f19421ce202e927b4e0063db113df1c15b8c6bb Apr 16 20:13:31.039501 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:31.039463 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67"] Apr 16 20:13:31.044096 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:31.044053 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-2519ca-predictor-bc7f75588-pdl67"] Apr 16 20:13:31.707899 ip-10-0-139-205 kubenswrapper[2569]: I0416 
20:13:31.707858 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" event={"ID":"ab312f2b-0b40-4fbe-871b-7124abc8a68d","Type":"ContainerStarted","Data":"bfba32604afeb65d3f031dde69f69d2144a5f2593b9a61e344dc7a61ce21566b"} Apr 16 20:13:31.707899 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:31.707900 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" event={"ID":"ab312f2b-0b40-4fbe-871b-7124abc8a68d","Type":"ContainerStarted","Data":"a69c7cba378d445aa474b23a2f19421ce202e927b4e0063db113df1c15b8c6bb"} Apr 16 20:13:32.184684 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:32.184652 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2408651-840a-4a52-a4f1-ab7e049dfa2f" path="/var/lib/kubelet/pods/b2408651-840a-4a52-a4f1-ab7e049dfa2f/volumes" Apr 16 20:13:34.720455 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:34.720421 2569 generic.go:358] "Generic (PLEG): container finished" podID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerID="bfba32604afeb65d3f031dde69f69d2144a5f2593b9a61e344dc7a61ce21566b" exitCode=0 Apr 16 20:13:34.720830 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:34.720499 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" event={"ID":"ab312f2b-0b40-4fbe-871b-7124abc8a68d","Type":"ContainerDied","Data":"bfba32604afeb65d3f031dde69f69d2144a5f2593b9a61e344dc7a61ce21566b"} Apr 16 20:13:35.726673 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:35.726638 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" event={"ID":"ab312f2b-0b40-4fbe-871b-7124abc8a68d","Type":"ContainerStarted","Data":"264e48c40da1cc765ae6526501f0f6643fc6e0e130cdcd8c10557358f25687c0"} Apr 16 20:13:35.727099 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:35.726931 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" Apr 16 20:13:35.728462 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:35.728434 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:13:35.745548 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:35.745494 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" podStartSLOduration=5.745479181 podStartE2EDuration="5.745479181s" podCreationTimestamp="2026-04-16 20:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:13:35.744318765 +0000 UTC m=+1180.158893614" watchObservedRunningTime="2026-04-16 20:13:35.745479181 +0000 UTC m=+1180.160054028" Apr 16 20:13:36.731597 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:36.731532 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:13:46.732297 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:46.732248 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:13:56.115383 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:56.115354 2569 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/2.log" Apr 16 20:13:56.120112 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:56.120088 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/2.log" Apr 16 20:13:56.731664 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:13:56.731619 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:14:06.732093 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:06.732045 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:14:16.731847 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:16.731797 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:14:26.731927 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:26.731876 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:14:36.732333 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:36.732285 2569 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:14:37.179061 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:37.179016 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:14:47.179832 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:47.179747 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" Apr 16 20:14:50.507301 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:50.507265 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw"] Apr 16 20:14:50.507808 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:50.507500 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerName="kserve-container" containerID="cri-o://264e48c40da1cc765ae6526501f0f6643fc6e0e130cdcd8c10557358f25687c0" gracePeriod=30 Apr 16 20:14:50.581962 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:50.581928 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7"] Apr 16 20:14:50.582319 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:50.582306 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2408651-840a-4a52-a4f1-ab7e049dfa2f" containerName="storage-initializer" Apr 16 20:14:50.582371 ip-10-0-139-205 
kubenswrapper[2569]: I0416 20:14:50.582321 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2408651-840a-4a52-a4f1-ab7e049dfa2f" containerName="storage-initializer" Apr 16 20:14:50.582410 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:50.582399 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2408651-840a-4a52-a4f1-ab7e049dfa2f" containerName="storage-initializer" Apr 16 20:14:50.582447 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:50.582409 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2408651-840a-4a52-a4f1-ab7e049dfa2f" containerName="storage-initializer" Apr 16 20:14:50.582481 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:50.582469 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2408651-840a-4a52-a4f1-ab7e049dfa2f" containerName="storage-initializer" Apr 16 20:14:50.582481 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:50.582481 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2408651-840a-4a52-a4f1-ab7e049dfa2f" containerName="storage-initializer" Apr 16 20:14:50.585608 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:50.585562 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" Apr 16 20:14:50.591674 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:50.591590 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7"] Apr 16 20:14:50.695438 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:50.695398 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87d01517-0f0c-4405-9c3f-7d59fc59b1ab-kserve-provision-location\") pod \"raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7\" (UID: \"87d01517-0f0c-4405-9c3f-7d59fc59b1ab\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" Apr 16 20:14:50.796877 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:50.796783 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87d01517-0f0c-4405-9c3f-7d59fc59b1ab-kserve-provision-location\") pod \"raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7\" (UID: \"87d01517-0f0c-4405-9c3f-7d59fc59b1ab\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" Apr 16 20:14:50.797187 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:50.797166 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87d01517-0f0c-4405-9c3f-7d59fc59b1ab-kserve-provision-location\") pod \"raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7\" (UID: \"87d01517-0f0c-4405-9c3f-7d59fc59b1ab\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" Apr 16 20:14:50.897765 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:50.897712 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" Apr 16 20:14:51.027186 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:51.027149 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7"] Apr 16 20:14:51.030005 ip-10-0-139-205 kubenswrapper[2569]: W0416 20:14:51.029970 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87d01517_0f0c_4405_9c3f_7d59fc59b1ab.slice/crio-2c2f1ecfc5bc868b66caa7b066d67a20ba1de110cde07e4d5aac5ee3569247e9 WatchSource:0}: Error finding container 2c2f1ecfc5bc868b66caa7b066d67a20ba1de110cde07e4d5aac5ee3569247e9: Status 404 returned error can't find the container with id 2c2f1ecfc5bc868b66caa7b066d67a20ba1de110cde07e4d5aac5ee3569247e9 Apr 16 20:14:51.032010 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:51.031991 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:14:52.002945 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:52.002911 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" event={"ID":"87d01517-0f0c-4405-9c3f-7d59fc59b1ab","Type":"ContainerStarted","Data":"24f3123d9e43c4c99e43d8bc162f25918af020201e826316b25f96b58c27bee6"} Apr 16 20:14:52.002945 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:52.002947 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" event={"ID":"87d01517-0f0c-4405-9c3f-7d59fc59b1ab","Type":"ContainerStarted","Data":"2c2f1ecfc5bc868b66caa7b066d67a20ba1de110cde07e4d5aac5ee3569247e9"} Apr 16 20:14:55.014621 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:55.014566 2569 generic.go:358] "Generic (PLEG): container finished" podID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" 
containerID="24f3123d9e43c4c99e43d8bc162f25918af020201e826316b25f96b58c27bee6" exitCode=0 Apr 16 20:14:55.015027 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:55.014642 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" event={"ID":"87d01517-0f0c-4405-9c3f-7d59fc59b1ab","Type":"ContainerDied","Data":"24f3123d9e43c4c99e43d8bc162f25918af020201e826316b25f96b58c27bee6"} Apr 16 20:14:55.353460 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:55.353430 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" Apr 16 20:14:55.442424 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:55.442395 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab312f2b-0b40-4fbe-871b-7124abc8a68d-kserve-provision-location\") pod \"ab312f2b-0b40-4fbe-871b-7124abc8a68d\" (UID: \"ab312f2b-0b40-4fbe-871b-7124abc8a68d\") " Apr 16 20:14:55.442808 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:55.442781 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab312f2b-0b40-4fbe-871b-7124abc8a68d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ab312f2b-0b40-4fbe-871b-7124abc8a68d" (UID: "ab312f2b-0b40-4fbe-871b-7124abc8a68d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:14:55.543254 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:55.543214 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab312f2b-0b40-4fbe-871b-7124abc8a68d-kserve-provision-location\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:14:56.019573 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.019510 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" event={"ID":"87d01517-0f0c-4405-9c3f-7d59fc59b1ab","Type":"ContainerStarted","Data":"e7a795a8f2c18615d5c6bdc9c447b3f5cdc4792157c57277c878052729af6912"} Apr 16 20:14:56.020030 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.019902 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" Apr 16 20:14:56.021059 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.021029 2569 generic.go:358] "Generic (PLEG): container finished" podID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerID="264e48c40da1cc765ae6526501f0f6643fc6e0e130cdcd8c10557358f25687c0" exitCode=0 Apr 16 20:14:56.021198 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.021072 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" event={"ID":"ab312f2b-0b40-4fbe-871b-7124abc8a68d","Type":"ContainerDied","Data":"264e48c40da1cc765ae6526501f0f6643fc6e0e130cdcd8c10557358f25687c0"} Apr 16 20:14:56.021198 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.021095 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" event={"ID":"ab312f2b-0b40-4fbe-871b-7124abc8a68d","Type":"ContainerDied","Data":"a69c7cba378d445aa474b23a2f19421ce202e927b4e0063db113df1c15b8c6bb"} Apr 16 
20:14:56.021198 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.021106 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw" Apr 16 20:14:56.021198 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.021115 2569 scope.go:117] "RemoveContainer" containerID="264e48c40da1cc765ae6526501f0f6643fc6e0e130cdcd8c10557358f25687c0" Apr 16 20:14:56.021631 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.021601 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" podUID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:14:56.030041 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.030000 2569 scope.go:117] "RemoveContainer" containerID="bfba32604afeb65d3f031dde69f69d2144a5f2593b9a61e344dc7a61ce21566b" Apr 16 20:14:56.045030 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.044962 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" podStartSLOduration=6.044942192 podStartE2EDuration="6.044942192s" podCreationTimestamp="2026-04-16 20:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:56.043316239 +0000 UTC m=+1260.457891102" watchObservedRunningTime="2026-04-16 20:14:56.044942192 +0000 UTC m=+1260.459517042" Apr 16 20:14:56.048281 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.048252 2569 scope.go:117] "RemoveContainer" containerID="264e48c40da1cc765ae6526501f0f6643fc6e0e130cdcd8c10557358f25687c0" Apr 16 20:14:56.048613 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:14:56.048571 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"264e48c40da1cc765ae6526501f0f6643fc6e0e130cdcd8c10557358f25687c0\": container with ID starting with 264e48c40da1cc765ae6526501f0f6643fc6e0e130cdcd8c10557358f25687c0 not found: ID does not exist" containerID="264e48c40da1cc765ae6526501f0f6643fc6e0e130cdcd8c10557358f25687c0" Apr 16 20:14:56.048699 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.048619 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"264e48c40da1cc765ae6526501f0f6643fc6e0e130cdcd8c10557358f25687c0"} err="failed to get container status \"264e48c40da1cc765ae6526501f0f6643fc6e0e130cdcd8c10557358f25687c0\": rpc error: code = NotFound desc = could not find container \"264e48c40da1cc765ae6526501f0f6643fc6e0e130cdcd8c10557358f25687c0\": container with ID starting with 264e48c40da1cc765ae6526501f0f6643fc6e0e130cdcd8c10557358f25687c0 not found: ID does not exist" Apr 16 20:14:56.048699 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.048638 2569 scope.go:117] "RemoveContainer" containerID="bfba32604afeb65d3f031dde69f69d2144a5f2593b9a61e344dc7a61ce21566b" Apr 16 20:14:56.048903 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:14:56.048882 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfba32604afeb65d3f031dde69f69d2144a5f2593b9a61e344dc7a61ce21566b\": container with ID starting with bfba32604afeb65d3f031dde69f69d2144a5f2593b9a61e344dc7a61ce21566b not found: ID does not exist" containerID="bfba32604afeb65d3f031dde69f69d2144a5f2593b9a61e344dc7a61ce21566b" Apr 16 20:14:56.048965 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.048915 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfba32604afeb65d3f031dde69f69d2144a5f2593b9a61e344dc7a61ce21566b"} err="failed to get container status \"bfba32604afeb65d3f031dde69f69d2144a5f2593b9a61e344dc7a61ce21566b\": rpc error: code = NotFound desc 
= could not find container \"bfba32604afeb65d3f031dde69f69d2144a5f2593b9a61e344dc7a61ce21566b\": container with ID starting with bfba32604afeb65d3f031dde69f69d2144a5f2593b9a61e344dc7a61ce21566b not found: ID does not exist" Apr 16 20:14:56.055838 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.055808 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw"] Apr 16 20:14:56.059357 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.059327 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-05c63-predictor-695786cf5d-6qhgw"] Apr 16 20:14:56.184725 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:56.184690 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" path="/var/lib/kubelet/pods/ab312f2b-0b40-4fbe-871b-7124abc8a68d/volumes" Apr 16 20:14:57.024945 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:14:57.024905 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" podUID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:15:07.025745 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:15:07.025695 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" podUID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:15:17.025296 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:15:17.025249 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" podUID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:15:27.025827 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:15:27.025773 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" podUID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:15:37.025661 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:15:37.025607 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" podUID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:15:47.025216 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:15:47.025170 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" podUID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:15:57.025299 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:15:57.025249 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" podUID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:16:07.026881 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:07.026849 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" Apr 16 20:16:10.724426 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:10.724388 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7"] Apr 16 20:16:10.724848 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:10.724680 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" podUID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" containerName="kserve-container" containerID="cri-o://e7a795a8f2c18615d5c6bdc9c447b3f5cdc4792157c57277c878052729af6912" gracePeriod=30 Apr 16 20:16:15.581955 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:15.581929 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" Apr 16 20:16:15.754184 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:15.754145 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87d01517-0f0c-4405-9c3f-7d59fc59b1ab-kserve-provision-location\") pod \"87d01517-0f0c-4405-9c3f-7d59fc59b1ab\" (UID: \"87d01517-0f0c-4405-9c3f-7d59fc59b1ab\") " Apr 16 20:16:15.754526 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:15.754499 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87d01517-0f0c-4405-9c3f-7d59fc59b1ab-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "87d01517-0f0c-4405-9c3f-7d59fc59b1ab" (UID: "87d01517-0f0c-4405-9c3f-7d59fc59b1ab"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:16:15.855655 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:15.855611 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87d01517-0f0c-4405-9c3f-7d59fc59b1ab-kserve-provision-location\") on node \"ip-10-0-139-205.ec2.internal\" DevicePath \"\"" Apr 16 20:16:16.305485 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:16.305446 2569 generic.go:358] "Generic (PLEG): container finished" podID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" containerID="e7a795a8f2c18615d5c6bdc9c447b3f5cdc4792157c57277c878052729af6912" exitCode=0 Apr 16 20:16:16.305705 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:16.305524 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" Apr 16 20:16:16.305705 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:16.305526 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" event={"ID":"87d01517-0f0c-4405-9c3f-7d59fc59b1ab","Type":"ContainerDied","Data":"e7a795a8f2c18615d5c6bdc9c447b3f5cdc4792157c57277c878052729af6912"} Apr 16 20:16:16.305705 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:16.305627 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7" event={"ID":"87d01517-0f0c-4405-9c3f-7d59fc59b1ab","Type":"ContainerDied","Data":"2c2f1ecfc5bc868b66caa7b066d67a20ba1de110cde07e4d5aac5ee3569247e9"} Apr 16 20:16:16.305705 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:16.305646 2569 scope.go:117] "RemoveContainer" containerID="e7a795a8f2c18615d5c6bdc9c447b3f5cdc4792157c57277c878052729af6912" Apr 16 20:16:16.314420 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:16.314399 2569 scope.go:117] "RemoveContainer" 
containerID="24f3123d9e43c4c99e43d8bc162f25918af020201e826316b25f96b58c27bee6" Apr 16 20:16:16.320936 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:16.320903 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7"] Apr 16 20:16:16.324108 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:16.324087 2569 scope.go:117] "RemoveContainer" containerID="e7a795a8f2c18615d5c6bdc9c447b3f5cdc4792157c57277c878052729af6912" Apr 16 20:16:16.324488 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:16:16.324462 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7a795a8f2c18615d5c6bdc9c447b3f5cdc4792157c57277c878052729af6912\": container with ID starting with e7a795a8f2c18615d5c6bdc9c447b3f5cdc4792157c57277c878052729af6912 not found: ID does not exist" containerID="e7a795a8f2c18615d5c6bdc9c447b3f5cdc4792157c57277c878052729af6912" Apr 16 20:16:16.324620 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:16.324499 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7a795a8f2c18615d5c6bdc9c447b3f5cdc4792157c57277c878052729af6912"} err="failed to get container status \"e7a795a8f2c18615d5c6bdc9c447b3f5cdc4792157c57277c878052729af6912\": rpc error: code = NotFound desc = could not find container \"e7a795a8f2c18615d5c6bdc9c447b3f5cdc4792157c57277c878052729af6912\": container with ID starting with e7a795a8f2c18615d5c6bdc9c447b3f5cdc4792157c57277c878052729af6912 not found: ID does not exist" Apr 16 20:16:16.324620 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:16.324526 2569 scope.go:117] "RemoveContainer" containerID="24f3123d9e43c4c99e43d8bc162f25918af020201e826316b25f96b58c27bee6" Apr 16 20:16:16.324620 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:16.324569 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/raw-sklearn-runtime-a6d09-predictor-7dfcbcc5bb-5hvz7"] Apr 16 20:16:16.324880 ip-10-0-139-205 kubenswrapper[2569]: E0416 20:16:16.324861 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24f3123d9e43c4c99e43d8bc162f25918af020201e826316b25f96b58c27bee6\": container with ID starting with 24f3123d9e43c4c99e43d8bc162f25918af020201e826316b25f96b58c27bee6 not found: ID does not exist" containerID="24f3123d9e43c4c99e43d8bc162f25918af020201e826316b25f96b58c27bee6" Apr 16 20:16:16.324925 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:16.324888 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24f3123d9e43c4c99e43d8bc162f25918af020201e826316b25f96b58c27bee6"} err="failed to get container status \"24f3123d9e43c4c99e43d8bc162f25918af020201e826316b25f96b58c27bee6\": rpc error: code = NotFound desc = could not find container \"24f3123d9e43c4c99e43d8bc162f25918af020201e826316b25f96b58c27bee6\": container with ID starting with 24f3123d9e43c4c99e43d8bc162f25918af020201e826316b25f96b58c27bee6 not found: ID does not exist" Apr 16 20:16:18.183956 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:18.183922 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" path="/var/lib/kubelet/pods/87d01517-0f0c-4405-9c3f-7d59fc59b1ab/volumes" Apr 16 20:16:40.333571 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:40.333520 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-zkxnw_b7a86d58-955a-4af8-9ae0-c6e786f43b28/global-pull-secret-syncer/0.log" Apr 16 20:16:40.436125 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:40.436095 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-qcjrs_89ed11af-8c8e-432e-9b8a-3696d6697184/konnectivity-agent/0.log" Apr 16 20:16:40.505628 ip-10-0-139-205 
kubenswrapper[2569]: I0416 20:16:40.505600 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-205.ec2.internal_05778b47a5ef098ec7584b65b41dff7a/haproxy/0.log" Apr 16 20:16:43.939089 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:43.939061 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bf94de88-5e28-47c2-b79a-0d38938b1c5c/alertmanager/0.log" Apr 16 20:16:43.965373 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:43.965339 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bf94de88-5e28-47c2-b79a-0d38938b1c5c/config-reloader/0.log" Apr 16 20:16:43.994789 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:43.994745 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bf94de88-5e28-47c2-b79a-0d38938b1c5c/kube-rbac-proxy-web/0.log" Apr 16 20:16:44.019928 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:44.019901 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bf94de88-5e28-47c2-b79a-0d38938b1c5c/kube-rbac-proxy/0.log" Apr 16 20:16:44.046165 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:44.046138 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bf94de88-5e28-47c2-b79a-0d38938b1c5c/kube-rbac-proxy-metric/0.log" Apr 16 20:16:44.087101 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:44.087075 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bf94de88-5e28-47c2-b79a-0d38938b1c5c/prom-label-proxy/0.log" Apr 16 20:16:44.112394 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:44.112369 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bf94de88-5e28-47c2-b79a-0d38938b1c5c/init-config-reloader/0.log" Apr 16 20:16:44.150314 ip-10-0-139-205 
kubenswrapper[2569]: I0416 20:16:44.150278 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-49lhf_e85eca51-eae1-4ef0-b008-1a1e1b796b4c/cluster-monitoring-operator/0.log" Apr 16 20:16:44.319568 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:44.319489 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2mssn_52860252-cf2f-4da1-9834-49ba663cc555/node-exporter/0.log" Apr 16 20:16:44.340565 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:44.340537 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2mssn_52860252-cf2f-4da1-9834-49ba663cc555/kube-rbac-proxy/0.log" Apr 16 20:16:44.363312 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:44.363285 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2mssn_52860252-cf2f-4da1-9834-49ba663cc555/init-textfile/0.log" Apr 16 20:16:44.549318 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:44.549289 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-4h95l_e9dd998f-62f0-406e-bf31-cca545dc9b5d/kube-rbac-proxy-main/0.log" Apr 16 20:16:44.574057 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:44.573983 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-4h95l_e9dd998f-62f0-406e-bf31-cca545dc9b5d/kube-rbac-proxy-self/0.log" Apr 16 20:16:44.598039 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:44.598007 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-4h95l_e9dd998f-62f0-406e-bf31-cca545dc9b5d/openshift-state-metrics/0.log" Apr 16 20:16:44.873304 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:44.873225 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-7b9785d66f-s2bwr_440343c1-f829-47fe-9627-0e58df180985/telemeter-client/0.log" Apr 16 20:16:44.895614 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:44.895565 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7b9785d66f-s2bwr_440343c1-f829-47fe-9627-0e58df180985/reload/0.log" Apr 16 20:16:44.923141 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:44.923114 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7b9785d66f-s2bwr_440343c1-f829-47fe-9627-0e58df180985/kube-rbac-proxy/0.log" Apr 16 20:16:44.954633 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:44.954605 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54f68c57f4-2vg75_897a6728-45f7-4fd3-9046-d545dc2704e6/thanos-query/0.log" Apr 16 20:16:44.978745 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:44.978716 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54f68c57f4-2vg75_897a6728-45f7-4fd3-9046-d545dc2704e6/kube-rbac-proxy-web/0.log" Apr 16 20:16:45.000556 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:45.000528 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54f68c57f4-2vg75_897a6728-45f7-4fd3-9046-d545dc2704e6/kube-rbac-proxy/0.log" Apr 16 20:16:45.022423 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:45.022397 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54f68c57f4-2vg75_897a6728-45f7-4fd3-9046-d545dc2704e6/prom-label-proxy/0.log" Apr 16 20:16:45.045182 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:45.045156 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54f68c57f4-2vg75_897a6728-45f7-4fd3-9046-d545dc2704e6/kube-rbac-proxy-rules/0.log" Apr 16 20:16:45.070958 ip-10-0-139-205 
kubenswrapper[2569]: I0416 20:16:45.070929 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-54f68c57f4-2vg75_897a6728-45f7-4fd3-9046-d545dc2704e6/kube-rbac-proxy-metrics/0.log"
Apr 16 20:16:46.294899 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:46.294866 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-rk7vq_8ebca5c0-30bf-46c6-a6e8-4cc5860aa2d6/networking-console-plugin/0.log"
Apr 16 20:16:46.720782 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:46.720753 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/2.log"
Apr 16 20:16:46.724556 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:46.724531 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjrs6_ce138de6-668e-4e27-b7d0-579a176ea2f2/console-operator/3.log"
Apr 16 20:16:47.128058 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.127983 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b7f7d9c7d-d4jp2_0147df20-106b-4a7a-a8ad-ea3ce5d89feb/console/0.log"
Apr 16 20:16:47.197813 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.197783 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-4dmst_a6209680-e76b-4b41-a4df-5d5f476a1df3/download-server/0.log"
Apr 16 20:16:47.569137 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.569106 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"]
Apr 16 20:16:47.569510 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.569502 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerName="kserve-container"
Apr 16 20:16:47.569564 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.569516 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerName="kserve-container"
Apr 16 20:16:47.569564 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.569529 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" containerName="kserve-container"
Apr 16 20:16:47.569564 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.569535 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" containerName="kserve-container"
Apr 16 20:16:47.569564 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.569544 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" containerName="storage-initializer"
Apr 16 20:16:47.569759 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.569550 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" containerName="storage-initializer"
Apr 16 20:16:47.569759 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.569591 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerName="storage-initializer"
Apr 16 20:16:47.569759 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.569600 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerName="storage-initializer"
Apr 16 20:16:47.569759 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.569664 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="87d01517-0f0c-4405-9c3f-7d59fc59b1ab" containerName="kserve-container"
Apr 16 20:16:47.569759 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.569675 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab312f2b-0b40-4fbe-871b-7124abc8a68d" containerName="kserve-container"
Apr 16 20:16:47.572995 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.572972 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.574847 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.574824 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h9spp\"/\"openshift-service-ca.crt\""
Apr 16 20:16:47.575142 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.575127 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-h9spp\"/\"default-dockercfg-lq8qc\""
Apr 16 20:16:47.575202 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.575137 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h9spp\"/\"kube-root-ca.crt\""
Apr 16 20:16:47.578504 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.578478 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"]
Apr 16 20:16:47.631201 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.631161 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/27e3fd52-d388-4ff8-8ae9-2d85f96a7591-lib-modules\") pod \"perf-node-gather-daemonset-trwlw\" (UID: \"27e3fd52-d388-4ff8-8ae9-2d85f96a7591\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.631201 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.631201 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/27e3fd52-d388-4ff8-8ae9-2d85f96a7591-proc\") pod \"perf-node-gather-daemonset-trwlw\" (UID: \"27e3fd52-d388-4ff8-8ae9-2d85f96a7591\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.631449 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.631220 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/27e3fd52-d388-4ff8-8ae9-2d85f96a7591-sys\") pod \"perf-node-gather-daemonset-trwlw\" (UID: \"27e3fd52-d388-4ff8-8ae9-2d85f96a7591\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.631449 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.631358 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhz9c\" (UniqueName: \"kubernetes.io/projected/27e3fd52-d388-4ff8-8ae9-2d85f96a7591-kube-api-access-jhz9c\") pod \"perf-node-gather-daemonset-trwlw\" (UID: \"27e3fd52-d388-4ff8-8ae9-2d85f96a7591\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.631449 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.631400 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/27e3fd52-d388-4ff8-8ae9-2d85f96a7591-podres\") pod \"perf-node-gather-daemonset-trwlw\" (UID: \"27e3fd52-d388-4ff8-8ae9-2d85f96a7591\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.662988 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.662961 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-hj2q8_aafa3559-b7eb-410f-91aa-abf590bd5c4a/volume-data-source-validator/0.log"
Apr 16 20:16:47.732428 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.732386 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhz9c\" (UniqueName: \"kubernetes.io/projected/27e3fd52-d388-4ff8-8ae9-2d85f96a7591-kube-api-access-jhz9c\") pod \"perf-node-gather-daemonset-trwlw\" (UID: \"27e3fd52-d388-4ff8-8ae9-2d85f96a7591\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.732663 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.732446 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/27e3fd52-d388-4ff8-8ae9-2d85f96a7591-podres\") pod \"perf-node-gather-daemonset-trwlw\" (UID: \"27e3fd52-d388-4ff8-8ae9-2d85f96a7591\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.732663 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.732512 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/27e3fd52-d388-4ff8-8ae9-2d85f96a7591-lib-modules\") pod \"perf-node-gather-daemonset-trwlw\" (UID: \"27e3fd52-d388-4ff8-8ae9-2d85f96a7591\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.732663 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.732535 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/27e3fd52-d388-4ff8-8ae9-2d85f96a7591-proc\") pod \"perf-node-gather-daemonset-trwlw\" (UID: \"27e3fd52-d388-4ff8-8ae9-2d85f96a7591\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.732663 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.732558 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/27e3fd52-d388-4ff8-8ae9-2d85f96a7591-sys\") pod \"perf-node-gather-daemonset-trwlw\" (UID: \"27e3fd52-d388-4ff8-8ae9-2d85f96a7591\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.732663 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.732628 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/27e3fd52-d388-4ff8-8ae9-2d85f96a7591-podres\") pod \"perf-node-gather-daemonset-trwlw\" (UID: \"27e3fd52-d388-4ff8-8ae9-2d85f96a7591\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.732663 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.732654 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/27e3fd52-d388-4ff8-8ae9-2d85f96a7591-proc\") pod \"perf-node-gather-daemonset-trwlw\" (UID: \"27e3fd52-d388-4ff8-8ae9-2d85f96a7591\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.732916 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.732675 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/27e3fd52-d388-4ff8-8ae9-2d85f96a7591-sys\") pod \"perf-node-gather-daemonset-trwlw\" (UID: \"27e3fd52-d388-4ff8-8ae9-2d85f96a7591\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.732916 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.732701 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/27e3fd52-d388-4ff8-8ae9-2d85f96a7591-lib-modules\") pod \"perf-node-gather-daemonset-trwlw\" (UID: \"27e3fd52-d388-4ff8-8ae9-2d85f96a7591\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.741421 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.741396 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhz9c\" (UniqueName: \"kubernetes.io/projected/27e3fd52-d388-4ff8-8ae9-2d85f96a7591-kube-api-access-jhz9c\") pod \"perf-node-gather-daemonset-trwlw\" (UID: \"27e3fd52-d388-4ff8-8ae9-2d85f96a7591\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:47.883752 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:47.883648 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:48.016784 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:48.016703 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"]
Apr 16 20:16:48.019324 ip-10-0-139-205 kubenswrapper[2569]: W0416 20:16:48.019295 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod27e3fd52_d388_4ff8_8ae9_2d85f96a7591.slice/crio-fc95f60b788206f2e93679ca293f1b28f2f004e078f0edb30c2ce1ffa783af67 WatchSource:0}: Error finding container fc95f60b788206f2e93679ca293f1b28f2f004e078f0edb30c2ce1ffa783af67: Status 404 returned error can't find the container with id fc95f60b788206f2e93679ca293f1b28f2f004e078f0edb30c2ce1ffa783af67
Apr 16 20:16:48.421314 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:48.421281 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw" event={"ID":"27e3fd52-d388-4ff8-8ae9-2d85f96a7591","Type":"ContainerStarted","Data":"a608d127691f8b44f20c38544e5e69d5271bb9c547581844b558c8e7510b0aa2"}
Apr 16 20:16:48.421314 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:48.421317 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw" event={"ID":"27e3fd52-d388-4ff8-8ae9-2d85f96a7591","Type":"ContainerStarted","Data":"fc95f60b788206f2e93679ca293f1b28f2f004e078f0edb30c2ce1ffa783af67"}
Apr 16 20:16:48.421524 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:48.421342 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:48.434919 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:48.434856 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw" podStartSLOduration=1.434841569 podStartE2EDuration="1.434841569s" podCreationTimestamp="2026-04-16 20:16:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:16:48.43461176 +0000 UTC m=+1372.849186608" watchObservedRunningTime="2026-04-16 20:16:48.434841569 +0000 UTC m=+1372.849416421"
Apr 16 20:16:48.473635 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:48.473604 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w77xr_28edf32a-262a-4c91-89da-2c452e8d1152/dns/0.log"
Apr 16 20:16:48.494687 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:48.494660 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w77xr_28edf32a-262a-4c91-89da-2c452e8d1152/kube-rbac-proxy/0.log"
Apr 16 20:16:48.518157 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:48.518125 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tg5c9_448cba83-b62d-4e46-b69b-e948817d0ec5/dns-node-resolver/0.log"
Apr 16 20:16:49.009233 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:49.009206 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fphk8_1b5d2585-0759-49e0-8726-9b1f8902ebcf/node-ca/0.log"
Apr 16 20:16:49.735810 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:49.735773 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-77c5ff64cd-kkjnk_a0b4741a-f02e-48e4-a491-d2ae897b44dd/router/0.log"
Apr 16 20:16:50.061960 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:50.061880 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-k262c_994f79b6-31dc-4b4f-8c42-e6e60bee90cf/serve-healthcheck-canary/0.log"
Apr 16 20:16:50.442448 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:50.442407 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-k4s7x_c7b46a8f-9a8f-42e0-971b-334f467cc56f/insights-operator/0.log"
Apr 16 20:16:50.442682 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:50.442520 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-k4s7x_c7b46a8f-9a8f-42e0-971b-334f467cc56f/insights-operator/1.log"
Apr 16 20:16:50.589449 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:50.589415 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zdtb2_dc451bbf-6087-4118-8c46-7dd3dde99f7a/kube-rbac-proxy/0.log"
Apr 16 20:16:50.610531 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:50.610497 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zdtb2_dc451bbf-6087-4118-8c46-7dd3dde99f7a/exporter/0.log"
Apr 16 20:16:50.633207 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:50.633177 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zdtb2_dc451bbf-6087-4118-8c46-7dd3dde99f7a/extractor/0.log"
Apr 16 20:16:52.574880 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:52.574848 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-vgm8q_72c21fef-3270-41f9-988e-35b6ea77cbc0/manager/0.log"
Apr 16 20:16:54.435197 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:54.435167 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-trwlw"
Apr 16 20:16:57.052506 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:57.052465 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-jxvqq_9a0704cd-b28c-4d5b-9e72-79fcd84527b4/kube-storage-version-migrator-operator/1.log"
Apr 16 20:16:57.053303 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:57.053283 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-jxvqq_9a0704cd-b28c-4d5b-9e72-79fcd84527b4/kube-storage-version-migrator-operator/0.log"
Apr 16 20:16:58.000333 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:58.000300 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2987n_f9d926bb-1dbb-44e0-981e-4bc43df8b1e0/kube-multus-additional-cni-plugins/0.log"
Apr 16 20:16:58.021905 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:58.021868 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2987n_f9d926bb-1dbb-44e0-981e-4bc43df8b1e0/egress-router-binary-copy/0.log"
Apr 16 20:16:58.043407 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:58.043375 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2987n_f9d926bb-1dbb-44e0-981e-4bc43df8b1e0/cni-plugins/0.log"
Apr 16 20:16:58.065098 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:58.065064 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2987n_f9d926bb-1dbb-44e0-981e-4bc43df8b1e0/bond-cni-plugin/0.log"
Apr 16 20:16:58.089353 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:58.089321 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2987n_f9d926bb-1dbb-44e0-981e-4bc43df8b1e0/routeoverride-cni/0.log"
Apr 16 20:16:58.110907 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:58.110867 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2987n_f9d926bb-1dbb-44e0-981e-4bc43df8b1e0/whereabouts-cni-bincopy/0.log"
Apr 16 20:16:58.132692 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:58.132651 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2987n_f9d926bb-1dbb-44e0-981e-4bc43df8b1e0/whereabouts-cni/0.log"
Apr 16 20:16:58.492910 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:58.492880 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-km4f6_1c49a39d-084e-4d78-9c37-a08591619477/kube-multus/0.log"
Apr 16 20:16:58.543490 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:58.543465 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8jmq5_10356841-c032-4d12-8328-dc3aeb909c86/network-metrics-daemon/0.log"
Apr 16 20:16:58.566031 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:16:58.566004 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8jmq5_10356841-c032-4d12-8328-dc3aeb909c86/kube-rbac-proxy/0.log"
Apr 16 20:17:00.050813 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:17:00.050780 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pp78x_c61ab98e-2fe3-48e1-b144-0d44e1856354/ovn-controller/0.log"
Apr 16 20:17:00.079221 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:17:00.079190 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pp78x_c61ab98e-2fe3-48e1-b144-0d44e1856354/ovn-acl-logging/0.log"
Apr 16 20:17:00.100427 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:17:00.100398 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pp78x_c61ab98e-2fe3-48e1-b144-0d44e1856354/kube-rbac-proxy-node/0.log"
Apr 16 20:17:00.124387 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:17:00.124350 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pp78x_c61ab98e-2fe3-48e1-b144-0d44e1856354/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 20:17:00.151070 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:17:00.151037 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pp78x_c61ab98e-2fe3-48e1-b144-0d44e1856354/northd/0.log"
Apr 16 20:17:00.175293 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:17:00.175265 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pp78x_c61ab98e-2fe3-48e1-b144-0d44e1856354/nbdb/0.log"
Apr 16 20:17:00.200283 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:17:00.200258 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pp78x_c61ab98e-2fe3-48e1-b144-0d44e1856354/sbdb/0.log"
Apr 16 20:17:00.311724 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:17:00.311639 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pp78x_c61ab98e-2fe3-48e1-b144-0d44e1856354/ovnkube-controller/0.log"
Apr 16 20:17:01.297053 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:17:01.297021 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-wjmcc_8bf6ef5e-33ca-46d3-84d9-c703a6a9dea4/check-endpoints/0.log"
Apr 16 20:17:01.321531 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:17:01.321504 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-d724r_a0b5f3c5-7848-4283-b7e8-31a5e5f79888/network-check-target-container/0.log"
Apr 16 20:17:02.241247 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:17:02.241215 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-b4wgn_5bed9951-49d1-4612-b89d-05332f7e56e2/iptables-alerter/0.log"
Apr 16 20:17:02.885075 ip-10-0-139-205 kubenswrapper[2569]: I0416 20:17:02.885043 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-pxv56_60894bbb-9d97-4deb-b1de-d69609701101/tuned/0.log"