Apr 20 20:02:28.178004 ip-10-0-131-234 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 20:02:28.178019 ip-10-0-131-234 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 20:02:28.178028 ip-10-0-131-234 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 20:02:28.178328 ip-10-0-131-234 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 20:02:38.221518 ip-10-0-131-234 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 20:02:38.221534 ip-10-0-131-234 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot a9886dcc928f4bb8b1e3ee0c6d1890a7 --
Apr 20 20:05:17.742327 ip-10-0-131-234 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 20:05:18.198396 ip-10-0-131-234 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:18.198396 ip-10-0-131-234 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 20:05:18.198396 ip-10-0-131-234 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:18.198396 ip-10-0-131-234 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 20:05:18.198396 ip-10-0-131-234 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:18.201182 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.201092 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 20:05:18.205528 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205500 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:18.205528 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205521 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:18.205528 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205528 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:18.205528 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205532 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:18.205528 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205536 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205540 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205545 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205548 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205552 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205556 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205560 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205564 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205568 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205572 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205576 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205580 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205584 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205587 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205591 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205595 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205598 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205602 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205605 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:18.205824 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205614 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205618 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205622 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205626 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205630 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205634 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205638 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205642 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205646 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205650 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205654 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205658 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205662 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205666 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205670 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205674 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205679 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205688 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205693 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:18.206657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205699 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205703 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205708 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205713 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205717 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205721 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205725 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205729 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205734 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205738 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205745 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205752 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205757 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205761 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205766 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205770 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205774 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205778 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205782 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:18.207332 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205786 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205791 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205795 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205800 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205804 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205808 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205812 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205819 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205823 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205827 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205832 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205836 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205841 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205845 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205849 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205853 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205857 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205863 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205867 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205872 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:18.207853 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205876 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205880 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205884 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205887 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.205892 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206544 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206555 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206563 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206569 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206573 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206578 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206583 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206588 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206593 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206597 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206601 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206605 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206609 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206613 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206618 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:18.208712 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206622 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206627 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206631 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206635 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206639 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206643 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206647 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206651 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206655 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206660 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206665 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206676 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206681 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206687 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206693 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206697 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206702 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206706 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206711 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:18.209451 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206715 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206719 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206723 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206727 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206731 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206735 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206740 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206744 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206748 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206752 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206757 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206761 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206765 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206769 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206774 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206778 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206782 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206787 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206791 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206795 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:18.209925 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206799 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206803 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206807 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206812 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206816 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206820 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206824 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206828 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206833 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206838 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206842 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206846 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206850 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206854 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206859 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206864 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206868 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206872 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206876 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206881 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:18.210529 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206884 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206888 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206892 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206896 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206900 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206904 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206909 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206913 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206918 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206922 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206926 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.206930 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207747 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207763 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207774 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207781 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207788 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207794 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207801 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207807 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207812 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 20:05:18.211288 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207817 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207823 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207828 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207833 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207838 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207843 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207848 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207853 2574 flags.go:64] FLAG: --cloud-config=""
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207857 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207862 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207868 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207873 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207878 2574 flags.go:64] FLAG: --config-dir=""
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207883 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207888 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207894 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207899 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207926 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207934 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207940 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207945 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207950 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207955 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207960 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207966 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 20:05:18.211843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207972 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207977 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207981 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207987 2574 flags.go:64] FLAG: --enable-server="true"
Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.207992 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208000 2574 flags.go:64] FLAG: --event-burst="100"
Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208005 2574 flags.go:64] FLAG: --event-qps="50"
Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208011 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208016 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208021 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208027 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208052 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208059 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208063 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208068 2574 flags.go:64] FLAG:
--eviction-soft-grace-period="" Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208073 2574 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208078 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208082 2574 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208087 2574 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208092 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208096 2574 flags.go:64] FLAG: --feature-gates="" Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208103 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208107 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208113 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208118 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208123 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 20 20:05:18.212555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208128 2574 flags.go:64] FLAG: --help="false" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208133 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-131-234.ec2.internal" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208138 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 20:05:18.213244 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:05:18.208142 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208147 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208153 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208160 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208164 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208169 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208174 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208179 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208184 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208189 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208210 2574 flags.go:64] FLAG: --kube-reserved="" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208215 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208220 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208225 2574 flags.go:64] FLAG: 
--kubelet-cgroups="" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208230 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208235 2574 flags.go:64] FLAG: --lock-file="" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208239 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208244 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208249 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208265 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 20:05:18.213244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208270 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208275 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208280 2574 flags.go:64] FLAG: --logging-format="text" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208285 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208290 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208295 2574 flags.go:64] FLAG: --manifest-url="" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208299 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208306 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 20:05:18.213816 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:05:18.208311 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208318 2574 flags.go:64] FLAG: --max-pods="110" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208323 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208328 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208332 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208337 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208341 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208347 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208351 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208364 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208369 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208374 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208379 2574 flags.go:64] FLAG: --pod-cidr="" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208384 2574 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208394 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208399 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 20:05:18.213816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208404 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208409 2574 flags.go:64] FLAG: --port="10250" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208414 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208418 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0cd4dc574daa6c905" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208423 2574 flags.go:64] FLAG: --qos-reserved="" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208428 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208433 2574 flags.go:64] FLAG: --register-node="true" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208437 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208442 2574 flags.go:64] FLAG: --register-with-taints="" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208448 2574 flags.go:64] FLAG: --registry-burst="10" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208452 2574 flags.go:64] FLAG: --registry-qps="5" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208457 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 20 
20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208461 2574 flags.go:64] FLAG: --reserved-memory="" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208467 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208472 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208476 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208481 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208485 2574 flags.go:64] FLAG: --runonce="false" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208490 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208495 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208499 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208504 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208508 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208513 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208518 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208524 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 20:05:18.214425 ip-10-0-131-234 kubenswrapper[2574]: I0420 
20:05:18.208528 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208533 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208539 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208544 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208551 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208556 2574 flags.go:64] FLAG: --system-cgroups="" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208561 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208570 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208574 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208579 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208585 2574 flags.go:64] FLAG: --tls-min-version="" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208589 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208594 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208598 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208603 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 
20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208608 2574 flags.go:64] FLAG: --v="2" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208615 2574 flags.go:64] FLAG: --version="false" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208621 2574 flags.go:64] FLAG: --vmodule="" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208627 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.208633 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208790 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208798 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208803 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208808 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 20:05:18.215125 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208812 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208816 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208821 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208825 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208829 2574 
feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208834 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208838 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208843 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208847 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208851 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208857 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208861 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208867 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208871 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208875 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208879 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208884 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:05:18.215785 ip-10-0-131-234 
kubenswrapper[2574]: W0420 20:05:18.208888 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208892 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208896 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:05:18.215785 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208900 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208904 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208908 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208912 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208916 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208920 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208925 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208929 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208933 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208937 2574 
feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208941 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208945 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208949 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208953 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208957 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208961 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208965 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208970 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208974 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:05:18.216358 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208980 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208986 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208991 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.208998 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209002 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209007 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209013 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209017 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209022 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209028 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209053 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209058 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209062 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209066 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209070 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209074 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209078 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209082 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209086 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209090 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:05:18.216828 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209094 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209099 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209103 2574 feature_gate.go:328] 
unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209106 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209110 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209115 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209119 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209123 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209127 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209131 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209135 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209139 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209144 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209148 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209152 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209158 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209162 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209166 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209170 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209176 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:18.217335 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209181 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209185 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.209189 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.210001 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.217349 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.217366 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217419 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217425 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217429 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217432 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217435 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217438 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217441 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217444 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217447 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217449 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:18.217823 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217452 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217454 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217457 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217460 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217462 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217465 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217469 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217473 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217476 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217483 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217486 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217488 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217491 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217494 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217496 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217499 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217502 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217504 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217507 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:18.218279 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217510 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217512 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217522 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217525 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217528 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217531 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217533 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217536 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217538 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217541 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217543 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217545 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217548 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217550 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217553 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217555 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217558 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217560 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217562 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217565 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:18.218745 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217567 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217570 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217572 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217575 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217577 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217579 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217582 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217584 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217587 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217589 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217592 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217594 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217596 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217599 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217603 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217607 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217610 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217613 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217615 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:18.219275 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217617 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217620 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217622 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217625 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217627 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217630 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217632 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217635 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217637 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217639 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217642 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217644 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217647 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217650 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217652 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217654 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217657 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:18.219746 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217659 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.217664 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217763 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217768 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217771 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217774 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217777 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217780 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217782 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217785 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217788 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217790 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217793 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217797 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217799 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217802 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:18.220199 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217804 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217808 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217811 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217814 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217816 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217819 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217822 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217824 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217827 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217830 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217833 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217835 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217838 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217840 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217842 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217845 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217848 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217850 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217853 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217855 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:18.220604 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217858 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217860 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217863 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217865 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217868 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217870 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217873 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217875 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217878 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217883 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217886 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217889 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217892 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217894 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217897 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217900 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217902 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217905 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217907 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:18.221164 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217910 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217913 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217915 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217917 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217920 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217922 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217925 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217927 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217930 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217932 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217935 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217937 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217940 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217942 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217945 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217947 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217950 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217952 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217955 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217958 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:18.221655 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217961 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:18.222171 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217963 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:18.222171 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217965 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:18.222171 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217968 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:18.222171 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217971 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:18.222171 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217973 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:18.222171 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217976 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:18.222171 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217978 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:18.222171 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217981 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:18.222171 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217983 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:18.222171 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217986 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:18.222171 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217988 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:18.222171 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:18.217991 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:18.222171 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.217996 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:05:18.222171 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.218704 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 20:05:18.222520 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.222327 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 20:05:18.223345 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.223333 2574 server.go:1019] "Starting client certificate rotation"
Apr 20 20:05:18.223450 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.223432 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:05:18.223510 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.223476 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:05:18.247787 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.247765 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:05:18.250252 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.250233 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:05:18.264583 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.264563 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 20 20:05:18.270076 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.270061 2574 log.go:25] "Validated CRI v1 image API"
Apr 20 20:05:18.272126 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.272110 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 20:05:18.276728 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.276702 2574 fs.go:135] Filesystem UUIDs: map[4a87e79a-9713-42a1-8b5d-b95eec524a5b:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 844bc6d3-6a13-408e-8742-30e77d377063:/dev/nvme0n1p3]
Apr 20 20:05:18.276796 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.276723 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 20:05:18.282539 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.282438 2574 manager.go:217] Machine: {Timestamp:2026-04-20 20:05:18.280299199 +0000 UTC m=+0.403992278 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3108429 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec288c15bd9854911390701047785907 SystemUUID:ec288c15-bd98-5491-1390-701047785907 BootID:a9886dcc-928f-4bb8-b1e3-ee0c6d1890a7 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:77:83:cf:38:89 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:77:83:cf:38:89 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0e:e9:33:1b:99:13 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 20:05:18.283211 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.283201 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 20:05:18.283296 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.283284 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 20:05:18.284172 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.284151 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 20:05:18.284330 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.284175 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-234.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 20:05:18.284379 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.284340 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 20:05:18.284379 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.284348 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 20:05:18.284379 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.284361 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:05:18.285241 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.285231 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:05:18.286660 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.286649 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 20 20:05:18.286791 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.286782 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 20:05:18.288905 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.288895 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 20 20:05:18.288946 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.288909 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 20:05:18.288946 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.288927 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 20:05:18.288946 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.288936 2574 kubelet.go:397] "Adding apiserver pod source" Apr 20 20:05:18.288946 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.288945 2574 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 20 20:05:18.290150 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.290128 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:05:18.290150 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.290153 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:05:18.290367 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.290351 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 20:05:18.293655 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.293608 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 20:05:18.297112 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.297090 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 20:05:18.298890 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.298875 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 20:05:18.298957 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.298898 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 20:05:18.298957 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.298909 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 20:05:18.298957 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.298917 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 20:05:18.298957 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.298925 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 20:05:18.298957 
ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.298931 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 20:05:18.298957 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.298937 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 20:05:18.298957 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.298942 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 20:05:18.298957 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.298949 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 20:05:18.298957 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.298956 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 20:05:18.299238 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.298965 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 20:05:18.299238 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.298974 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 20:05:18.299772 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.299762 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 20:05:18.299805 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.299773 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 20:05:18.303508 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.303490 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 20:05:18.303583 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.303514 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-234.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 20:05:18.303583 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.303525 
2574 server.go:1295] "Started kubelet" Apr 20 20:05:18.303671 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.303604 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-234.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 20:05:18.303708 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.303683 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 20:05:18.303742 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.303697 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 20:05:18.303789 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.303738 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 20:05:18.303831 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.303802 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 20:05:18.304788 ip-10-0-131-234 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 20:05:18.304907 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.304796 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 20:05:18.304971 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.304951 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 20 20:05:18.309329 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.309307 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 20:05:18.310157 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.309870 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 20:05:18.310758 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.310737 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 20:05:18.310851 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.310774 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 20:05:18.311497 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.311477 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 20:05:18.311582 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.311542 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 20 20:05:18.311582 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.311551 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 20 20:05:18.312944 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.312924 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-234.ec2.internal\" not found" Apr 20 20:05:18.314370 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.314335 2574 factory.go:55] Registering systemd factory Apr 20 20:05:18.314370 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.314351 2574 factory.go:223] Registration of the systemd container factory successfully Apr 20 20:05:18.314687 ip-10-0-131-234 kubenswrapper[2574]: I0420 
20:05:18.314671 2574 factory.go:153] Registering CRI-O factory Apr 20 20:05:18.314741 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.314690 2574 factory.go:223] Registration of the crio container factory successfully Apr 20 20:05:18.314813 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.314802 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 20:05:18.314851 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.314829 2574 factory.go:103] Registering Raw factory Apr 20 20:05:18.314879 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.314852 2574 manager.go:1196] Started watching for new ooms in manager Apr 20 20:05:18.315193 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.315155 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 20:05:18.315292 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.315230 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-234.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 20:05:18.315364 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.315344 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 20:05:18.315861 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.315845 2574 manager.go:319] Starting recovery of all containers Apr 20 20:05:18.315943 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.314872 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-234.ec2.internal.18a82959b7678bd8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-234.ec2.internal,UID:ip-10-0-131-234.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-234.ec2.internal,},FirstTimestamp:2026-04-20 20:05:18.303505368 +0000 UTC m=+0.427198449,LastTimestamp:2026-04-20 20:05:18.303505368 +0000 UTC m=+0.427198449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-234.ec2.internal,}" Apr 20 20:05:18.326778 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.326741 2574 manager.go:324] Recovery completed Apr 20 20:05:18.328624 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.328606 2574 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 20 20:05:18.331514 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.331501 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:18.334332 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.334310 2574 kubelet_node_status.go:736] "Recording event 
message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:18.334408 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.334350 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:18.334408 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.334361 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:18.334873 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.334854 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 20:05:18.334873 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.334870 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 20:05:18.334984 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.334885 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 20 20:05:18.336367 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.336298 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-234.ec2.internal.18a82959b93df2f9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-234.ec2.internal,UID:ip-10-0-131-234.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-234.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-234.ec2.internal,},FirstTimestamp:2026-04-20 20:05:18.334333689 +0000 UTC m=+0.458026775,LastTimestamp:2026-04-20 20:05:18.334333689 +0000 UTC m=+0.458026775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-234.ec2.internal,}" Apr 20 20:05:18.337062 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.337049 2574 policy_none.go:49] "None policy: Start" Apr 20 20:05:18.337107 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.337067 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 20:05:18.337107 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.337078 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 20 20:05:18.343583 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.343565 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9kwfl" Apr 20 20:05:18.346171 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.346102 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-234.ec2.internal.18a82959b93e485a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-234.ec2.internal,UID:ip-10-0-131-234.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-131-234.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-131-234.ec2.internal,},FirstTimestamp:2026-04-20 20:05:18.334355546 +0000 UTC m=+0.458048625,LastTimestamp:2026-04-20 20:05:18.334355546 +0000 UTC m=+0.458048625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-234.ec2.internal,}" Apr 20 20:05:18.354733 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.354669 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-234.ec2.internal.18a82959b93e706c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-234.ec2.internal,UID:ip-10-0-131-234.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-131-234.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-131-234.ec2.internal,},FirstTimestamp:2026-04-20 20:05:18.334365804 +0000 UTC m=+0.458058884,LastTimestamp:2026-04-20 20:05:18.334365804 +0000 UTC m=+0.458058884,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-234.ec2.internal,}" Apr 20 20:05:18.358304 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.358286 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9kwfl" Apr 20 20:05:18.393143 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.376561 2574 manager.go:341] "Starting Device Plugin manager" Apr 20 20:05:18.393143 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.376598 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 20:05:18.393143 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.376611 2574 server.go:85] "Starting device plugin registration server" Apr 20 20:05:18.393143 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.376885 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 20:05:18.393143 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.376918 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 20:05:18.393143 ip-10-0-131-234 kubenswrapper[2574]: 
I0420 20:05:18.376998 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 20:05:18.393143 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.377107 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 20:05:18.393143 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.377116 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 20:05:18.393143 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.377510 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 20:05:18.393143 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.377550 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-234.ec2.internal\" not found" Apr 20 20:05:18.459028 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.458961 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 20:05:18.460312 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.460295 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 20:05:18.460376 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.460324 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 20:05:18.460376 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.460343 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 20:05:18.460376 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.460351 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 20:05:18.460477 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.460442 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 20:05:18.462637 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.462616 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:18.477521 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.477498 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:18.478298 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.478277 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:18.478381 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.478310 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:18.478381 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.478325 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:18.478381 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.478351 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-234.ec2.internal" Apr 20 20:05:18.484136 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.484123 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-234.ec2.internal" Apr 20 20:05:18.484182 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.484144 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-234.ec2.internal\": node \"ip-10-0-131-234.ec2.internal\" not found" Apr 20 
20:05:18.501807 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.501787 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-234.ec2.internal\" not found" Apr 20 20:05:18.561073 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.561023 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-234.ec2.internal"] Apr 20 20:05:18.561178 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.561108 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:18.563589 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.563571 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:18.563666 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.563601 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:18.563666 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.563611 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:18.565158 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.565146 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:18.565301 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.565286 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal" Apr 20 20:05:18.565344 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.565316 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:18.565832 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.565818 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:18.565832 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.565825 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:18.565959 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.565842 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:18.565959 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.565851 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:18.565959 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.565862 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:18.565959 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.565853 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:18.567310 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.567294 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-234.ec2.internal"
Apr 20 20:05:18.567389 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.567323 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:05:18.568019 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.568003 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:05:18.568095 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.568048 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:05:18.568095 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.568064 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:05:18.602366 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.602338 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-234.ec2.internal\" not found"
Apr 20 20:05:18.603710 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.603695 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-234.ec2.internal\" not found" node="ip-10-0-131-234.ec2.internal"
Apr 20 20:05:18.608108 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.608085 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-234.ec2.internal\" not found" node="ip-10-0-131-234.ec2.internal"
Apr 20 20:05:18.702741 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.702718 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-234.ec2.internal\" not found"
Apr 20 20:05:18.713056 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.713003 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f8094a2875efe3934931c2701094dd6e-config\") pod \"kube-apiserver-proxy-ip-10-0-131-234.ec2.internal\" (UID: \"f8094a2875efe3934931c2701094dd6e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-234.ec2.internal"
Apr 20 20:05:18.713120 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.713095 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7af4a2209687b835d6eb69cc49ab6739-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal\" (UID: \"7af4a2209687b835d6eb69cc49ab6739\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal"
Apr 20 20:05:18.713162 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.713124 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7af4a2209687b835d6eb69cc49ab6739-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal\" (UID: \"7af4a2209687b835d6eb69cc49ab6739\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal"
Apr 20 20:05:18.803418 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.803384 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-234.ec2.internal\" not found"
Apr 20 20:05:18.813782 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.813759 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f8094a2875efe3934931c2701094dd6e-config\") pod \"kube-apiserver-proxy-ip-10-0-131-234.ec2.internal\" (UID: \"f8094a2875efe3934931c2701094dd6e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-234.ec2.internal"
Apr 20 20:05:18.813862 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.813787 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7af4a2209687b835d6eb69cc49ab6739-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal\" (UID: \"7af4a2209687b835d6eb69cc49ab6739\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal"
Apr 20 20:05:18.813862 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.813805 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7af4a2209687b835d6eb69cc49ab6739-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal\" (UID: \"7af4a2209687b835d6eb69cc49ab6739\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal"
Apr 20 20:05:18.813862 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.813832 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7af4a2209687b835d6eb69cc49ab6739-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal\" (UID: \"7af4a2209687b835d6eb69cc49ab6739\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal"
Apr 20 20:05:18.813959 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.813865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f8094a2875efe3934931c2701094dd6e-config\") pod \"kube-apiserver-proxy-ip-10-0-131-234.ec2.internal\" (UID: \"f8094a2875efe3934931c2701094dd6e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-234.ec2.internal"
Apr 20 20:05:18.813959 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.813883 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7af4a2209687b835d6eb69cc49ab6739-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal\" (UID: \"7af4a2209687b835d6eb69cc49ab6739\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal"
Apr 20 20:05:18.903957 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:18.903915 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-234.ec2.internal\" not found"
Apr 20 20:05:18.906116 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.906100 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal"
Apr 20 20:05:18.910648 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:18.910629 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-234.ec2.internal"
Apr 20 20:05:19.004432 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:19.004347 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-234.ec2.internal\" not found"
Apr 20 20:05:19.104914 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:19.104888 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-234.ec2.internal\" not found"
Apr 20 20:05:19.200691 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.200672 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:05:19.205092 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:19.205077 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-234.ec2.internal\" not found"
Apr 20 20:05:19.223679 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.223659 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 20:05:19.223784 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.223768 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 20:05:19.223843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.223819 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 20:05:19.305842 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:19.305820 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-234.ec2.internal\" not found"
Apr 20 20:05:19.309431 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.309415 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 20:05:19.320764 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.320745 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 20:05:19.345329 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.345305 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-w8tsh"
Apr 20 20:05:19.353153 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.352999 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-w8tsh"
Apr 20 20:05:19.361255 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.361227 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 20:00:18 +0000 UTC" deadline="2028-01-07 11:22:27.72601497 +0000 UTC"
Apr 20 20:05:19.361255 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.361251 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15039h17m8.364767031s"
Apr 20 20:05:19.401825 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.401801 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:05:19.414096 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.414066 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal"
Apr 20 20:05:19.425328 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.425310 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 20:05:19.426785 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.426769 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-234.ec2.internal"
Apr 20 20:05:19.435571 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.435553 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 20:05:19.454910 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:19.454885 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8094a2875efe3934931c2701094dd6e.slice/crio-50122711d777b649b9a4d5bc64fa854c5cb90de44c71927559bcb0daf8177f12 WatchSource:0}: Error finding container 50122711d777b649b9a4d5bc64fa854c5cb90de44c71927559bcb0daf8177f12: Status 404 returned error can't find the container with id 50122711d777b649b9a4d5bc64fa854c5cb90de44c71927559bcb0daf8177f12
Apr 20 20:05:19.455251 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:19.455231 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7af4a2209687b835d6eb69cc49ab6739.slice/crio-bed3094085bba22e8d5bfb6ffc57f0bf4aae1d64db1cf38a750522472f8372d5 WatchSource:0}: Error finding container bed3094085bba22e8d5bfb6ffc57f0bf4aae1d64db1cf38a750522472f8372d5: Status 404 returned error can't find the container with id bed3094085bba22e8d5bfb6ffc57f0bf4aae1d64db1cf38a750522472f8372d5
Apr 20 20:05:19.462202 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.462176 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:05:19.463227 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.463160 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-234.ec2.internal" event={"ID":"f8094a2875efe3934931c2701094dd6e","Type":"ContainerStarted","Data":"50122711d777b649b9a4d5bc64fa854c5cb90de44c71927559bcb0daf8177f12"}
Apr 20 20:05:19.464076 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.464057 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal" event={"ID":"7af4a2209687b835d6eb69cc49ab6739","Type":"ContainerStarted","Data":"bed3094085bba22e8d5bfb6ffc57f0bf4aae1d64db1cf38a750522472f8372d5"}
Apr 20 20:05:19.879628 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:19.879591 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:05:20.065449 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.065417 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:05:20.290877 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.290844 2574 apiserver.go:52] "Watching apiserver"
Apr 20 20:05:20.296789 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.296759 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 20:05:20.298958 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.298934 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-ssfnj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal","openshift-multus/multus-4wcrc","openshift-multus/multus-additional-cni-plugins-lcb2v","openshift-multus/network-metrics-daemon-wktd8","kube-system/konnectivity-agent-q6qs6","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks","openshift-network-diagnostics/network-check-target-5kgnf","openshift-network-operator/iptables-alerter-d4qvq","openshift-ovn-kubernetes/ovnkube-node-j6wvn","kube-system/kube-apiserver-proxy-ip-10-0-131-234.ec2.internal","openshift-cluster-node-tuning-operator/tuned-j65bj","openshift-dns/node-resolver-d4mxm"]
Apr 20 20:05:20.300917 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.300895 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks"
Apr 20 20:05:20.302100 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.302017 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4wcrc"
Apr 20 20:05:20.304695 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.304259 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lcb2v"
Apr 20 20:05:20.304695 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.304360 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8"
Apr 20 20:05:20.304695 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:20.304448 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13"
Apr 20 20:05:20.306701 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.306053 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q6qs6"
Apr 20 20:05:20.306957 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.306934 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 20:05:20.306957 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.306946 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 20:05:20.307177 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.307061 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-74xqh\""
Apr 20 20:05:20.307309 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.307292 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 20:05:20.307391 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.307306 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 20:05:20.307391 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.307365 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 20:05:20.307669 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.307652 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7xljf\""
Apr 20 20:05:20.307745 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.307676 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 20:05:20.307745 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.307655 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 20:05:20.307844 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.307683 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 20:05:20.308070 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.308028 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 20:05:20.308148 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.308066 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-7q7bq\""
Apr 20 20:05:20.308214 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.308197 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 20:05:20.308267 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.308216 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 20:05:20.308893 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.308555 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ssfnj"
Apr 20 20:05:20.308893 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.308633 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf"
Apr 20 20:05:20.308893 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:20.308691 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609"
Apr 20 20:05:20.310148 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.309817 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-d4qvq"
Apr 20 20:05:20.310459 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.310437 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qlnrh\""
Apr 20 20:05:20.310706 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.310692 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 20:05:20.311116 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.311022 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 20:05:20.311116 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.311068 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 20:05:20.311244 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.311222 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-s66wg\""
Apr 20 20:05:20.311822 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.311683 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:20.313190 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.313167 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 20:05:20.313521 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.313493 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:05:20.315536 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.315514 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 20:05:20.315822 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.315802 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nh28c\""
Apr 20 20:05:20.316812 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.316430 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 20:05:20.316812 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.316603 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 20:05:20.318999 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.318978 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d4mxm"
Apr 20 20:05:20.319282 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.319263 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 20:05:20.319366 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.319293 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 20:05:20.319700 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.319647 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-j65bj"
Apr 20 20:05:20.319857 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.319650 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 20:05:20.319857 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.319837 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 20:05:20.320612 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.320590 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eff2e20b-0b3f-4623-b2e9-68404cf5689f-env-overrides\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:20.320697 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.320626 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eff2e20b-0b3f-4623-b2e9-68404cf5689f-ovnkube-script-lib\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:20.320697 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.320649 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-multus-cni-dir\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc"
Apr 20 20:05:20.320697 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.320671 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-etc-kubernetes\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc"
Apr 20 20:05:20.320853 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.320696 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-kubernetes\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj"
Apr 20 20:05:20.320853 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.320769 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs\") pod \"network-metrics-daemon-wktd8\" (UID: \"184c92c6-a188-47c2-acbf-e9fe477d6c13\") " pod="openshift-multus/network-metrics-daemon-wktd8"
Apr 20 20:05:20.320853 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.320790 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-run-netns\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:20.320853 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.320824 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsz8m\" (UniqueName: \"kubernetes.io/projected/0ce4f429-5df3-4576-bea7-50ab8358d9f7-kube-api-access-lsz8m\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj"
Apr 20 20:05:20.321118 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.320849 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/86459264-fd91-425e-8338-70b56d469a74-host\") pod \"node-ca-ssfnj\" (UID: \"86459264-fd91-425e-8338-70b56d469a74\") " pod="openshift-image-registry/node-ca-ssfnj"
Apr 20 20:05:20.321118 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.320910 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d62d535b-7b78-4f80-8451-fabdfce754d7-system-cni-dir\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v"
Apr 20 20:05:20.321118 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.320940 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-sysctl-d\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj"
Apr 20 20:05:20.321118 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.320979 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-registration-dir\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks"
Apr 20 20:05:20.321118 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321003 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-run-systemd\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:20.321118 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321026 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-etc-openvswitch\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:20.321118 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321063 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-s4fh9\""
Apr 20 20:05:20.321118 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321066 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-run-k8s-cni-cncf-io\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc"
Apr 20 20:05:20.321118 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321092 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d62d535b-7b78-4f80-8451-fabdfce754d7-cnibin\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v"
Apr 20 20:05:20.321118 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321117 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6thv\" (UniqueName: \"kubernetes.io/projected/86459264-fd91-425e-8338-70b56d469a74-kube-api-access-g6thv\") pod \"node-ca-ssfnj\" (UID: \"86459264-fd91-425e-8338-70b56d469a74\") " pod="openshift-image-registry/node-ca-ssfnj"
Apr 20 20:05:20.321526 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321138 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-kubelet\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:20.321526 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321164 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-run-ovn\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:20.321526 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321186 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-node-log\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:20.321526 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321207 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-cni-netd\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:20.321526 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321244 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d62d535b-7b78-4f80-8451-fabdfce754d7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v"
Apr 20 20:05:20.321526 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321265 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-sys\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj"
Apr 20 20:05:20.321526 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321323 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-tuned\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj"
Apr 20 20:05:20.321526 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321382 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-multus-socket-dir-parent\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc"
Apr 20 20:05:20.321526 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321401 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-var-lib-cni-bin\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc"
Apr 20 20:05:20.321526 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321435 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f679dfd5-8c86-42f8-823c-e2c7b58decdf-multus-daemon-config\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc"
Apr 20 20:05:20.321526 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321460 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-modprobe-d\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj"
Apr 20 20:05:20.321526 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321483 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-sysconfig\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj"
Apr 20 20:05:20.321526 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321521 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-systemd\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj"
Apr 20 20:05:20.321889 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321564 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eff2e20b-0b3f-4623-b2e9-68404cf5689f-ovnkube-config\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:20.321889 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321609 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f679dfd5-8c86-42f8-823c-e2c7b58decdf-cni-binary-copy\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc"
Apr 20 20:05:20.321889 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321653 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfg6x\" (UniqueName: \"kubernetes.io/projected/d62d535b-7b78-4f80-8451-fabdfce754d7-kube-api-access-sfg6x\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v"
Apr 20 20:05:20.321889 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321709 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-run-ovn-kubernetes\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:20.321889 ip-10-0-131-234 kubenswrapper[2574]: I0420
20:05:20.321731 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-system-cni-dir\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.321889 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321761 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-cnibin\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.321889 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321782 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-hostroot\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.321889 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321803 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-multus-conf-dir\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.321889 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321827 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf1e798d-74ef-4682-bd04-15da759fea59-host-slash\") pod \"iptables-alerter-d4qvq\" (UID: \"bf1e798d-74ef-4682-bd04-15da759fea59\") " pod="openshift-network-operator/iptables-alerter-d4qvq" Apr 20 20:05:20.321889 ip-10-0-131-234 kubenswrapper[2574]: 
I0420 20:05:20.321863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-var-lib-kubelet\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.321889 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321889 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-device-dir\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.322357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321912 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-etc-selinux\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.322357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321936 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxnj9\" (UniqueName: \"kubernetes.io/projected/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-kube-api-access-lxnj9\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.322357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321953 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-systemd-units\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.322357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.321989 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-var-lib-cni-multus\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.322357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322051 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpc62\" (UniqueName: \"kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62\") pod \"network-check-target-5kgnf\" (UID: \"655b7db6-852f-4d19-9975-31ad69976609\") " pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:20.322357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322086 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bf1e798d-74ef-4682-bd04-15da759fea59-iptables-alerter-script\") pod \"iptables-alerter-d4qvq\" (UID: \"bf1e798d-74ef-4682-bd04-15da759fea59\") " pod="openshift-network-operator/iptables-alerter-d4qvq" Apr 20 20:05:20.322357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322113 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.322357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322115 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 20:05:20.322357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322140 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-run-netns\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.322357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322164 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-run\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.322357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322187 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.322357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322270 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/09d0b535-704a-4945-9235-0ddeba8ad00c-agent-certs\") pod \"konnectivity-agent-q6qs6\" (UID: \"09d0b535-704a-4945-9235-0ddeba8ad00c\") " pod="kube-system/konnectivity-agent-q6qs6" Apr 20 20:05:20.322357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322301 
2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq7k9\" (UniqueName: \"kubernetes.io/projected/eff2e20b-0b3f-4623-b2e9-68404cf5689f-kube-api-access-kq7k9\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.322357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322324 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mp75\" (UniqueName: \"kubernetes.io/projected/bf1e798d-74ef-4682-bd04-15da759fea59-kube-api-access-2mp75\") pod \"iptables-alerter-d4qvq\" (UID: \"bf1e798d-74ef-4682-bd04-15da759fea59\") " pod="openshift-network-operator/iptables-alerter-d4qvq" Apr 20 20:05:20.322357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322348 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-sysctl-conf\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322371 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-socket-dir\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322398 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58c8s\" (UniqueName: \"kubernetes.io/projected/184c92c6-a188-47c2-acbf-e9fe477d6c13-kube-api-access-58c8s\") pod 
\"network-metrics-daemon-wktd8\" (UID: \"184c92c6-a188-47c2-acbf-e9fe477d6c13\") " pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322404 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322441 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-var-lib-kubelet\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322478 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-slash\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322490 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-5pn6f\"" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322506 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-run-openvswitch\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322529 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-log-socket\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322551 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-os-release\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322475 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322575 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95czx\" (UniqueName: \"kubernetes.io/projected/f679dfd5-8c86-42f8-823c-e2c7b58decdf-kube-api-access-95czx\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322604 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-host\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322632 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-run-multus-certs\") pod \"multus-4wcrc\" 
(UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322660 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zjqw7\"" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322660 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d62d535b-7b78-4f80-8451-fabdfce754d7-os-release\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322708 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d62d535b-7b78-4f80-8451-fabdfce754d7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322728 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322734 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d62d535b-7b78-4f80-8451-fabdfce754d7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.323068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322758 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-lib-modules\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.323980 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322781 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/09d0b535-704a-4945-9235-0ddeba8ad00c-konnectivity-ca\") pod \"konnectivity-agent-q6qs6\" (UID: \"09d0b535-704a-4945-9235-0ddeba8ad00c\") " pod="kube-system/konnectivity-agent-q6qs6" Apr 20 20:05:20.323980 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322803 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-cni-bin\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.323980 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322826 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eff2e20b-0b3f-4623-b2e9-68404cf5689f-ovn-node-metrics-cert\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.323980 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322849 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0ce4f429-5df3-4576-bea7-50ab8358d9f7-tmp\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " 
pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.323980 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322902 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-sys-fs\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.323980 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322941 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/86459264-fd91-425e-8338-70b56d469a74-serviceca\") pod \"node-ca-ssfnj\" (UID: \"86459264-fd91-425e-8338-70b56d469a74\") " pod="openshift-image-registry/node-ca-ssfnj" Apr 20 20:05:20.323980 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.322986 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-var-lib-openvswitch\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.323980 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.323022 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d62d535b-7b78-4f80-8451-fabdfce754d7-cni-binary-copy\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.354852 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.354569 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 
20:00:19 +0000 UTC" deadline="2028-01-14 17:56:48.409144672 +0000 UTC" Apr 20 20:05:20.354852 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.354597 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15213h51m28.054551058s" Apr 20 20:05:20.412790 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.412763 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 20:05:20.423246 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.423214 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-modprobe-d\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.423389 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.423258 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-sysconfig\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.423389 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.423278 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-systemd\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.423389 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.423302 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eff2e20b-0b3f-4623-b2e9-68404cf5689f-ovnkube-config\") pod \"ovnkube-node-j6wvn\" (UID: 
\"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.423389 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.423338 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f679dfd5-8c86-42f8-823c-e2c7b58decdf-cni-binary-copy\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.423389 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.423370 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-sysconfig\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.423389 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.423374 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-systemd\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.423679 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.423396 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-modprobe-d\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.423894 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.423857 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfg6x\" (UniqueName: \"kubernetes.io/projected/d62d535b-7b78-4f80-8451-fabdfce754d7-kube-api-access-sfg6x\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: 
\"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.423984 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.423901 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-run-ovn-kubernetes\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.423984 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.423929 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-system-cni-dir\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.423984 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.423953 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-cnibin\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.423984 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.423965 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eff2e20b-0b3f-4623-b2e9-68404cf5689f-ovnkube-config\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.423984 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.423980 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-hostroot\") pod \"multus-4wcrc\" (UID: 
\"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.424229 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424007 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-multus-conf-dir\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.424229 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424009 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-run-ovn-kubernetes\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.424229 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424050 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-system-cni-dir\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.424229 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424057 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-hostroot\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.424229 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28-tmp-dir\") pod \"node-resolver-d4mxm\" (UID: \"ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28\") " 
pod="openshift-dns/node-resolver-d4mxm" Apr 20 20:05:20.424229 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424092 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-cnibin\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.424229 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424129 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-multus-conf-dir\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.424229 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424121 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf1e798d-74ef-4682-bd04-15da759fea59-host-slash\") pod \"iptables-alerter-d4qvq\" (UID: \"bf1e798d-74ef-4682-bd04-15da759fea59\") " pod="openshift-network-operator/iptables-alerter-d4qvq" Apr 20 20:05:20.424229 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424169 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf1e798d-74ef-4682-bd04-15da759fea59-host-slash\") pod \"iptables-alerter-d4qvq\" (UID: \"bf1e798d-74ef-4682-bd04-15da759fea59\") " pod="openshift-network-operator/iptables-alerter-d4qvq" Apr 20 20:05:20.424229 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424174 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-var-lib-kubelet\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 
20:05:20.424229 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424130 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f679dfd5-8c86-42f8-823c-e2c7b58decdf-cni-binary-copy\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.424229 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424204 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-device-dir\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.424229 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424218 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-etc-selinux\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.424229 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424234 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxnj9\" (UniqueName: \"kubernetes.io/projected/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-kube-api-access-lxnj9\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424248 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-systemd-units\") pod \"ovnkube-node-j6wvn\" (UID: 
\"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-var-lib-cni-multus\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424265 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-device-dir\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424275 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpc62\" (UniqueName: \"kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62\") pod \"network-check-target-5kgnf\" (UID: \"655b7db6-852f-4d19-9975-31ad69976609\") " pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424217 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-var-lib-kubelet\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424307 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/bf1e798d-74ef-4682-bd04-15da759fea59-iptables-alerter-script\") pod \"iptables-alerter-d4qvq\" (UID: \"bf1e798d-74ef-4682-bd04-15da759fea59\") " pod="openshift-network-operator/iptables-alerter-d4qvq" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424317 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-etc-selinux\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424333 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-systemd-units\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424325 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-var-lib-cni-multus\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424357 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424415 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-run-netns\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424446 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-run\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424480 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424522 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-run-netns\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424519 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/09d0b535-704a-4945-9235-0ddeba8ad00c-agent-certs\") pod \"konnectivity-agent-q6qs6\" (UID: \"09d0b535-704a-4945-9235-0ddeba8ad00c\") " pod="kube-system/konnectivity-agent-q6qs6" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424558 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-kq7k9\" (UniqueName: \"kubernetes.io/projected/eff2e20b-0b3f-4623-b2e9-68404cf5689f-kube-api-access-kq7k9\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.424842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424577 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mp75\" (UniqueName: \"kubernetes.io/projected/bf1e798d-74ef-4682-bd04-15da759fea59-kube-api-access-2mp75\") pod \"iptables-alerter-d4qvq\" (UID: \"bf1e798d-74ef-4682-bd04-15da759fea59\") " pod="openshift-network-operator/iptables-alerter-d4qvq" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-sysctl-conf\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424626 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-socket-dir\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424642 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58c8s\" (UniqueName: \"kubernetes.io/projected/184c92c6-a188-47c2-acbf-e9fe477d6c13-kube-api-access-58c8s\") pod \"network-metrics-daemon-wktd8\" (UID: \"184c92c6-a188-47c2-acbf-e9fe477d6c13\") " pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:20.425645 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:05:20.424653 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-run\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424657 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-var-lib-kubelet\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424675 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-var-lib-kubelet\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424699 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424701 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-slash\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.425645 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:05:20.424727 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-run-openvswitch\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424759 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bf1e798d-74ef-4682-bd04-15da759fea59-iptables-alerter-script\") pod \"iptables-alerter-d4qvq\" (UID: \"bf1e798d-74ef-4682-bd04-15da759fea59\") " pod="openshift-network-operator/iptables-alerter-d4qvq" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424756 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424784 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-log-socket\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424804 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-slash\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424826 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-os-release\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424833 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-run-openvswitch\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424852 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95czx\" (UniqueName: \"kubernetes.io/projected/f679dfd5-8c86-42f8-823c-e2c7b58decdf-kube-api-access-95czx\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424877 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-host\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.425645 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424899 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-socket-dir\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.426453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424901 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-run-multus-certs\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.426453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424934 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d62d535b-7b78-4f80-8451-fabdfce754d7-os-release\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.426453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424960 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d62d535b-7b78-4f80-8451-fabdfce754d7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.426453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424987 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d62d535b-7b78-4f80-8451-fabdfce754d7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.426453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.424997 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-sysctl-conf\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.426453 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:05:20.425020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-lib-modules\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.426453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425065 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-log-socket\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.426453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425066 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/09d0b535-704a-4945-9235-0ddeba8ad00c-konnectivity-ca\") pod \"konnectivity-agent-q6qs6\" (UID: \"09d0b535-704a-4945-9235-0ddeba8ad00c\") " pod="kube-system/konnectivity-agent-q6qs6" Apr 20 20:05:20.426453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-cni-bin\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.426453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425119 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.426453 
ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425131 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eff2e20b-0b3f-4623-b2e9-68404cf5689f-ovn-node-metrics-cert\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.426453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425133 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d62d535b-7b78-4f80-8451-fabdfce754d7-os-release\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.426453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425144 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-os-release\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.426453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425155 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0ce4f429-5df3-4576-bea7-50ab8358d9f7-tmp\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.426453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425178 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-sys-fs\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.426453 
ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425167 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-host\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.426453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425203 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/86459264-fd91-425e-8338-70b56d469a74-serviceca\") pod \"node-ca-ssfnj\" (UID: \"86459264-fd91-425e-8338-70b56d469a74\") " pod="openshift-image-registry/node-ca-ssfnj" Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425226 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-var-lib-openvswitch\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425249 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d62d535b-7b78-4f80-8451-fabdfce754d7-cni-binary-copy\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425272 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eff2e20b-0b3f-4623-b2e9-68404cf5689f-env-overrides\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 
20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425277 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d62d535b-7b78-4f80-8451-fabdfce754d7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425295 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eff2e20b-0b3f-4623-b2e9-68404cf5689f-ovnkube-script-lib\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425318 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-multus-cni-dir\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425343 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-etc-kubernetes\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425368 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28-hosts-file\") pod \"node-resolver-d4mxm\" (UID: \"ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28\") " pod="openshift-dns/node-resolver-d4mxm" 
Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425393 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz86s\" (UniqueName: \"kubernetes.io/projected/ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28-kube-api-access-pz86s\") pod \"node-resolver-d4mxm\" (UID: \"ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28\") " pod="openshift-dns/node-resolver-d4mxm" Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425427 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-kubernetes\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425451 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs\") pod \"network-metrics-daemon-wktd8\" (UID: \"184c92c6-a188-47c2-acbf-e9fe477d6c13\") " pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425473 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-run-netns\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425496 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsz8m\" (UniqueName: \"kubernetes.io/projected/0ce4f429-5df3-4576-bea7-50ab8358d9f7-kube-api-access-lsz8m\") pod \"tuned-j65bj\" (UID: 
\"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425518 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/86459264-fd91-425e-8338-70b56d469a74-host\") pod \"node-ca-ssfnj\" (UID: \"86459264-fd91-425e-8338-70b56d469a74\") " pod="openshift-image-registry/node-ca-ssfnj" Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425545 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d62d535b-7b78-4f80-8451-fabdfce754d7-system-cni-dir\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425568 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-sysctl-d\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.427245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425597 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-registration-dir\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425651 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/d62d535b-7b78-4f80-8451-fabdfce754d7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425708 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-run-systemd\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425714 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-etc-kubernetes\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425785 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-multus-cni-dir\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-kubernetes\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425886 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-lib-modules\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425656 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-run-systemd\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:20.425909 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425926 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-etc-openvswitch\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425955 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-run-k8s-cni-cncf-io\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.425989 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-run-k8s-cni-cncf-io\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " 
pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:20.426000 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs podName:184c92c6-a188-47c2-acbf-e9fe477d6c13 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:20.925958972 +0000 UTC m=+3.049652043 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs") pod "network-metrics-daemon-wktd8" (UID: "184c92c6-a188-47c2-acbf-e9fe477d6c13") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426026 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d62d535b-7b78-4f80-8451-fabdfce754d7-cnibin\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426071 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6thv\" (UniqueName: \"kubernetes.io/projected/86459264-fd91-425e-8338-70b56d469a74-kube-api-access-g6thv\") pod \"node-ca-ssfnj\" (UID: \"86459264-fd91-425e-8338-70b56d469a74\") " pod="openshift-image-registry/node-ca-ssfnj" Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426076 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-etc-openvswitch\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.430177 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:05:20.426099 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-kubelet\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.430177 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-run-ovn\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426148 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-node-log\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426172 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-cni-netd\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-sysctl-d\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 
20:05:20.426180 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-run-multus-certs\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426203 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d62d535b-7b78-4f80-8451-fabdfce754d7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426229 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-sys\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426230 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-run-netns\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-tuned\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.430969 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:05:20.426278 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-multus-socket-dir-parent\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426296 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/86459264-fd91-425e-8338-70b56d469a74-host\") pod \"node-ca-ssfnj\" (UID: \"86459264-fd91-425e-8338-70b56d469a74\") " pod="openshift-image-registry/node-ca-ssfnj" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426302 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-var-lib-cni-bin\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426329 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f679dfd5-8c86-42f8-823c-e2c7b58decdf-multus-daemon-config\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426342 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d62d535b-7b78-4f80-8451-fabdfce754d7-system-cni-dir\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: 
I0420 20:05:20.426412 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-cni-netd\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426422 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d62d535b-7b78-4f80-8451-fabdfce754d7-cnibin\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426459 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eff2e20b-0b3f-4623-b2e9-68404cf5689f-ovnkube-script-lib\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426466 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0ce4f429-5df3-4576-bea7-50ab8358d9f7-sys\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.430969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426919 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eff2e20b-0b3f-4623-b2e9-68404cf5689f-env-overrides\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426991 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-run-ovn\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.426997 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f679dfd5-8c86-42f8-823c-e2c7b58decdf-multus-daemon-config\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.427003 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-var-lib-openvswitch\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.427028 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-kubelet\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.427053 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-host-cni-bin\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.427096 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-multus-socket-dir-parent\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.427101 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/86459264-fd91-425e-8338-70b56d469a74-serviceca\") pod \"node-ca-ssfnj\" (UID: \"86459264-fd91-425e-8338-70b56d469a74\") " pod="openshift-image-registry/node-ca-ssfnj" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.427112 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d62d535b-7b78-4f80-8451-fabdfce754d7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.427227 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f679dfd5-8c86-42f8-823c-e2c7b58decdf-host-var-lib-cni-bin\") pod \"multus-4wcrc\" (UID: \"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.427264 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eff2e20b-0b3f-4623-b2e9-68404cf5689f-node-log\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.427323 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-sys-fs\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.427423 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d62d535b-7b78-4f80-8451-fabdfce754d7-cni-binary-copy\") pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.427465 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-registration-dir\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.429228 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/09d0b535-704a-4945-9235-0ddeba8ad00c-konnectivity-ca\") pod \"konnectivity-agent-q6qs6\" (UID: \"09d0b535-704a-4945-9235-0ddeba8ad00c\") " pod="kube-system/konnectivity-agent-q6qs6" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.430001 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eff2e20b-0b3f-4623-b2e9-68404cf5689f-ovn-node-metrics-cert\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.434161 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:05:20.430065 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/09d0b535-704a-4945-9235-0ddeba8ad00c-agent-certs\") pod \"konnectivity-agent-q6qs6\" (UID: \"09d0b535-704a-4945-9235-0ddeba8ad00c\") " pod="kube-system/konnectivity-agent-q6qs6" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.431218 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0ce4f429-5df3-4576-bea7-50ab8358d9f7-etc-tuned\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.434161 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.432824 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0ce4f429-5df3-4576-bea7-50ab8358d9f7-tmp\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.436742 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:20.436721 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:20.436882 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:20.436747 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:20.436882 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:20.436761 2574 projected.go:194] Error preparing data for projected volume kube-api-access-lpc62 for pod openshift-network-diagnostics/network-check-target-5kgnf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Apr 20 20:05:20.436882 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:20.436818 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62 podName:655b7db6-852f-4d19-9975-31ad69976609 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:20.936800667 +0000 UTC m=+3.060493746 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-lpc62" (UniqueName: "kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62") pod "network-check-target-5kgnf" (UID: "655b7db6-852f-4d19-9975-31ad69976609") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:20.437884 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.437859 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58c8s\" (UniqueName: \"kubernetes.io/projected/184c92c6-a188-47c2-acbf-e9fe477d6c13-kube-api-access-58c8s\") pod \"network-metrics-daemon-wktd8\" (UID: \"184c92c6-a188-47c2-acbf-e9fe477d6c13\") " pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:20.439560 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.439495 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxnj9\" (UniqueName: \"kubernetes.io/projected/e19165a1-00ed-47ec-bfb7-f7b723ee12ac-kube-api-access-lxnj9\") pod \"aws-ebs-csi-driver-node-c9rks\" (UID: \"e19165a1-00ed-47ec-bfb7-f7b723ee12ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.440968 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.440930 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95czx\" (UniqueName: \"kubernetes.io/projected/f679dfd5-8c86-42f8-823c-e2c7b58decdf-kube-api-access-95czx\") pod \"multus-4wcrc\" (UID: 
\"f679dfd5-8c86-42f8-823c-e2c7b58decdf\") " pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.440968 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.440955 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsz8m\" (UniqueName: \"kubernetes.io/projected/0ce4f429-5df3-4576-bea7-50ab8358d9f7-kube-api-access-lsz8m\") pod \"tuned-j65bj\" (UID: \"0ce4f429-5df3-4576-bea7-50ab8358d9f7\") " pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.441435 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.441411 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq7k9\" (UniqueName: \"kubernetes.io/projected/eff2e20b-0b3f-4623-b2e9-68404cf5689f-kube-api-access-kq7k9\") pod \"ovnkube-node-j6wvn\" (UID: \"eff2e20b-0b3f-4623-b2e9-68404cf5689f\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.441513 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.441415 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6thv\" (UniqueName: \"kubernetes.io/projected/86459264-fd91-425e-8338-70b56d469a74-kube-api-access-g6thv\") pod \"node-ca-ssfnj\" (UID: \"86459264-fd91-425e-8338-70b56d469a74\") " pod="openshift-image-registry/node-ca-ssfnj" Apr 20 20:05:20.441916 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.441874 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mp75\" (UniqueName: \"kubernetes.io/projected/bf1e798d-74ef-4682-bd04-15da759fea59-kube-api-access-2mp75\") pod \"iptables-alerter-d4qvq\" (UID: \"bf1e798d-74ef-4682-bd04-15da759fea59\") " pod="openshift-network-operator/iptables-alerter-d4qvq" Apr 20 20:05:20.443377 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.443356 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfg6x\" (UniqueName: \"kubernetes.io/projected/d62d535b-7b78-4f80-8451-fabdfce754d7-kube-api-access-sfg6x\") 
pod \"multus-additional-cni-plugins-lcb2v\" (UID: \"d62d535b-7b78-4f80-8451-fabdfce754d7\") " pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.526780 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.526742 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28-hosts-file\") pod \"node-resolver-d4mxm\" (UID: \"ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28\") " pod="openshift-dns/node-resolver-d4mxm" Apr 20 20:05:20.526924 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.526786 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pz86s\" (UniqueName: \"kubernetes.io/projected/ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28-kube-api-access-pz86s\") pod \"node-resolver-d4mxm\" (UID: \"ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28\") " pod="openshift-dns/node-resolver-d4mxm" Apr 20 20:05:20.526924 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.526826 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28-tmp-dir\") pod \"node-resolver-d4mxm\" (UID: \"ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28\") " pod="openshift-dns/node-resolver-d4mxm" Apr 20 20:05:20.526924 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.526864 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28-hosts-file\") pod \"node-resolver-d4mxm\" (UID: \"ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28\") " pod="openshift-dns/node-resolver-d4mxm" Apr 20 20:05:20.527194 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.527168 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28-tmp-dir\") pod \"node-resolver-d4mxm\" (UID: 
\"ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28\") " pod="openshift-dns/node-resolver-d4mxm" Apr 20 20:05:20.534788 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.534764 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz86s\" (UniqueName: \"kubernetes.io/projected/ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28-kube-api-access-pz86s\") pod \"node-resolver-d4mxm\" (UID: \"ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28\") " pod="openshift-dns/node-resolver-d4mxm" Apr 20 20:05:20.615607 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.615530 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" Apr 20 20:05:20.623297 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.623272 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4wcrc" Apr 20 20:05:20.632886 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.632865 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lcb2v" Apr 20 20:05:20.640485 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.640466 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q6qs6" Apr 20 20:05:20.648387 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.648367 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ssfnj" Apr 20 20:05:20.664949 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.664925 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-d4qvq" Apr 20 20:05:20.672726 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.672701 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:05:20.680401 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.680375 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d4mxm" Apr 20 20:05:20.686047 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.686003 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-j65bj" Apr 20 20:05:20.930028 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:20.929948 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs\") pod \"network-metrics-daemon-wktd8\" (UID: \"184c92c6-a188-47c2-acbf-e9fe477d6c13\") " pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:20.930192 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:20.930084 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:20.930192 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:20.930145 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs podName:184c92c6-a188-47c2-acbf-e9fe477d6c13 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:21.930129137 +0000 UTC m=+4.053822206 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs") pod "network-metrics-daemon-wktd8" (UID: "184c92c6-a188-47c2-acbf-e9fe477d6c13") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:21.030891 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:21.030861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpc62\" (UniqueName: \"kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62\") pod \"network-check-target-5kgnf\" (UID: \"655b7db6-852f-4d19-9975-31ad69976609\") " pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:21.031072 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:21.030984 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:21.031072 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:21.030999 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:21.031072 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:21.031009 2574 projected.go:194] Error preparing data for projected volume kube-api-access-lpc62 for pod openshift-network-diagnostics/network-check-target-5kgnf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:21.031072 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:21.031073 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62 podName:655b7db6-852f-4d19-9975-31ad69976609 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:05:22.031059916 +0000 UTC m=+4.154752982 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-lpc62" (UniqueName: "kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62") pod "network-check-target-5kgnf" (UID: "655b7db6-852f-4d19-9975-31ad69976609") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:21.210412 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:21.210385 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd62d535b_7b78_4f80_8451_fabdfce754d7.slice/crio-6dc65f2e960f31d17408f7fe8a0305abd736a7157747a3c34ec0eddf54ddc0a6 WatchSource:0}: Error finding container 6dc65f2e960f31d17408f7fe8a0305abd736a7157747a3c34ec0eddf54ddc0a6: Status 404 returned error can't find the container with id 6dc65f2e960f31d17408f7fe8a0305abd736a7157747a3c34ec0eddf54ddc0a6 Apr 20 20:05:21.221294 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:21.221270 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeff2e20b_0b3f_4623_b2e9_68404cf5689f.slice/crio-2fca565bf1e9b451596fe6fd4f950076064bf378253b29b8a28987bc72dc2553 WatchSource:0}: Error finding container 2fca565bf1e9b451596fe6fd4f950076064bf378253b29b8a28987bc72dc2553: Status 404 returned error can't find the container with id 2fca565bf1e9b451596fe6fd4f950076064bf378253b29b8a28987bc72dc2553 Apr 20 20:05:21.225688 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:21.225628 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86459264_fd91_425e_8338_70b56d469a74.slice/crio-6f0699bb8cb66a201f2ecc51b4fc94820910ccfca61592b841a21a6903f8275d WatchSource:0}: Error finding container 
6f0699bb8cb66a201f2ecc51b4fc94820910ccfca61592b841a21a6903f8275d: Status 404 returned error can't find the container with id 6f0699bb8cb66a201f2ecc51b4fc94820910ccfca61592b841a21a6903f8275d Apr 20 20:05:21.228906 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:21.228581 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d0b535_704a_4945_9235_0ddeba8ad00c.slice/crio-4f99b982e48daad27aeb69f8a9254d884d738bc53481d4289b17a7a12460e875 WatchSource:0}: Error finding container 4f99b982e48daad27aeb69f8a9254d884d738bc53481d4289b17a7a12460e875: Status 404 returned error can't find the container with id 4f99b982e48daad27aeb69f8a9254d884d738bc53481d4289b17a7a12460e875 Apr 20 20:05:21.229495 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:21.229472 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf1e798d_74ef_4682_bd04_15da759fea59.slice/crio-ca954cd1685bcb32b3e681e549a03fc4cefbaa811231c775dd5243f5f0094c68 WatchSource:0}: Error finding container ca954cd1685bcb32b3e681e549a03fc4cefbaa811231c775dd5243f5f0094c68: Status 404 returned error can't find the container with id ca954cd1685bcb32b3e681e549a03fc4cefbaa811231c775dd5243f5f0094c68 Apr 20 20:05:21.230527 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:21.230508 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf679dfd5_8c86_42f8_823c_e2c7b58decdf.slice/crio-79dfcc721b048fbfa5f6e42eec36b85b13f97502ab96bb6cd16187ff5c8397d9 WatchSource:0}: Error finding container 79dfcc721b048fbfa5f6e42eec36b85b13f97502ab96bb6cd16187ff5c8397d9: Status 404 returned error can't find the container with id 79dfcc721b048fbfa5f6e42eec36b85b13f97502ab96bb6cd16187ff5c8397d9 Apr 20 20:05:21.233278 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:21.231421 2574 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ce4f429_5df3_4576_bea7_50ab8358d9f7.slice/crio-f9e744495c4679a06cbb49390490df99ba1db824e8bff5f3999733c7f504f688 WatchSource:0}: Error finding container f9e744495c4679a06cbb49390490df99ba1db824e8bff5f3999733c7f504f688: Status 404 returned error can't find the container with id f9e744495c4679a06cbb49390490df99ba1db824e8bff5f3999733c7f504f688 Apr 20 20:05:21.233278 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:21.232927 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode19165a1_00ed_47ec_bfb7_f7b723ee12ac.slice/crio-738845c73fb03b2543ec7d25d3443ee3e02a170890841d96e600a172cc8ea3e5 WatchSource:0}: Error finding container 738845c73fb03b2543ec7d25d3443ee3e02a170890841d96e600a172cc8ea3e5: Status 404 returned error can't find the container with id 738845c73fb03b2543ec7d25d3443ee3e02a170890841d96e600a172cc8ea3e5 Apr 20 20:05:21.234490 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:05:21.234417 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebb6ab8a_7ed0_48ce_b19d_ba7a095c2e28.slice/crio-327e489987d7de7a8818b4dcd3f587638d1a13f372932de4da7d80b56988244e WatchSource:0}: Error finding container 327e489987d7de7a8818b4dcd3f587638d1a13f372932de4da7d80b56988244e: Status 404 returned error can't find the container with id 327e489987d7de7a8818b4dcd3f587638d1a13f372932de4da7d80b56988244e Apr 20 20:05:21.355517 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:21.355488 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:00:19 +0000 UTC" deadline="2028-01-26 03:13:13.41363945 +0000 UTC" Apr 20 20:05:21.355824 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:21.355522 2574 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="15487h7m52.05811954s" Apr 20 20:05:21.467922 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:21.467849 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-234.ec2.internal" event={"ID":"f8094a2875efe3934931c2701094dd6e","Type":"ContainerStarted","Data":"057391c77602104b40621a989868a99797fb62e4246dc05739df2e1d5ecf3e08"} Apr 20 20:05:21.468862 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:21.468834 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d4mxm" event={"ID":"ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28","Type":"ContainerStarted","Data":"327e489987d7de7a8818b4dcd3f587638d1a13f372932de4da7d80b56988244e"} Apr 20 20:05:21.469841 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:21.469823 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-d4qvq" event={"ID":"bf1e798d-74ef-4682-bd04-15da759fea59","Type":"ContainerStarted","Data":"ca954cd1685bcb32b3e681e549a03fc4cefbaa811231c775dd5243f5f0094c68"} Apr 20 20:05:21.471305 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:21.471279 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q6qs6" event={"ID":"09d0b535-704a-4945-9235-0ddeba8ad00c","Type":"ContainerStarted","Data":"4f99b982e48daad27aeb69f8a9254d884d738bc53481d4289b17a7a12460e875"} Apr 20 20:05:21.472213 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:21.472192 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ssfnj" event={"ID":"86459264-fd91-425e-8338-70b56d469a74","Type":"ContainerStarted","Data":"6f0699bb8cb66a201f2ecc51b4fc94820910ccfca61592b841a21a6903f8275d"} Apr 20 20:05:21.473127 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:21.473094 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcb2v" 
event={"ID":"d62d535b-7b78-4f80-8451-fabdfce754d7","Type":"ContainerStarted","Data":"6dc65f2e960f31d17408f7fe8a0305abd736a7157747a3c34ec0eddf54ddc0a6"} Apr 20 20:05:21.473939 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:21.473922 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" event={"ID":"e19165a1-00ed-47ec-bfb7-f7b723ee12ac","Type":"ContainerStarted","Data":"738845c73fb03b2543ec7d25d3443ee3e02a170890841d96e600a172cc8ea3e5"} Apr 20 20:05:21.474810 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:21.474790 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-j65bj" event={"ID":"0ce4f429-5df3-4576-bea7-50ab8358d9f7","Type":"ContainerStarted","Data":"f9e744495c4679a06cbb49390490df99ba1db824e8bff5f3999733c7f504f688"} Apr 20 20:05:21.475652 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:21.475634 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4wcrc" event={"ID":"f679dfd5-8c86-42f8-823c-e2c7b58decdf","Type":"ContainerStarted","Data":"79dfcc721b048fbfa5f6e42eec36b85b13f97502ab96bb6cd16187ff5c8397d9"} Apr 20 20:05:21.476604 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:21.476582 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" event={"ID":"eff2e20b-0b3f-4623-b2e9-68404cf5689f","Type":"ContainerStarted","Data":"2fca565bf1e9b451596fe6fd4f950076064bf378253b29b8a28987bc72dc2553"} Apr 20 20:05:21.481458 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:21.481423 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-234.ec2.internal" podStartSLOduration=2.481413796 podStartE2EDuration="2.481413796s" podCreationTimestamp="2026-04-20 20:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 
20:05:21.481305286 +0000 UTC m=+3.604998373" watchObservedRunningTime="2026-04-20 20:05:21.481413796 +0000 UTC m=+3.605106884" Apr 20 20:05:21.937805 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:21.937196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs\") pod \"network-metrics-daemon-wktd8\" (UID: \"184c92c6-a188-47c2-acbf-e9fe477d6c13\") " pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:21.937805 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:21.937373 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:21.937805 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:21.937439 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs podName:184c92c6-a188-47c2-acbf-e9fe477d6c13 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:23.937420295 +0000 UTC m=+6.061113364 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs") pod "network-metrics-daemon-wktd8" (UID: "184c92c6-a188-47c2-acbf-e9fe477d6c13") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:22.037957 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:22.037918 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpc62\" (UniqueName: \"kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62\") pod \"network-check-target-5kgnf\" (UID: \"655b7db6-852f-4d19-9975-31ad69976609\") " pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:22.038129 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:22.038116 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:22.038203 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:22.038137 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:22.038203 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:22.038149 2574 projected.go:194] Error preparing data for projected volume kube-api-access-lpc62 for pod openshift-network-diagnostics/network-check-target-5kgnf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:22.038327 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:22.038206 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62 podName:655b7db6-852f-4d19-9975-31ad69976609 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:05:24.038187501 +0000 UTC m=+6.161880582 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-lpc62" (UniqueName: "kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62") pod "network-check-target-5kgnf" (UID: "655b7db6-852f-4d19-9975-31ad69976609") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:22.047848 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:22.047604 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:22.463816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:22.463734 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:22.464270 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:22.463926 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13" Apr 20 20:05:22.464432 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:22.464411 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:22.464523 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:22.464502 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609" Apr 20 20:05:22.495582 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:22.495544 2574 generic.go:358] "Generic (PLEG): container finished" podID="7af4a2209687b835d6eb69cc49ab6739" containerID="cfcb148abf1139d184ce8caa4d2f04493af99181b383a1a2333731450c2615ba" exitCode=0 Apr 20 20:05:22.496610 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:22.496460 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal" event={"ID":"7af4a2209687b835d6eb69cc49ab6739","Type":"ContainerDied","Data":"cfcb148abf1139d184ce8caa4d2f04493af99181b383a1a2333731450c2615ba"} Apr 20 20:05:23.513341 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:23.513077 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal" event={"ID":"7af4a2209687b835d6eb69cc49ab6739","Type":"ContainerStarted","Data":"89e87b486c70e42d04bf7159c8146d602bb7505676eadda7c0aa239933d690f1"} Apr 20 20:05:23.954574 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:23.954537 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs\") pod \"network-metrics-daemon-wktd8\" (UID: \"184c92c6-a188-47c2-acbf-e9fe477d6c13\") " pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:23.954749 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:23.954707 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:23.954805 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:23.954767 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs podName:184c92c6-a188-47c2-acbf-e9fe477d6c13 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:27.954749892 +0000 UTC m=+10.078442964 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs") pod "network-metrics-daemon-wktd8" (UID: "184c92c6-a188-47c2-acbf-e9fe477d6c13") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:24.055528 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:24.055488 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpc62\" (UniqueName: \"kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62\") pod \"network-check-target-5kgnf\" (UID: \"655b7db6-852f-4d19-9975-31ad69976609\") " pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:24.055693 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:24.055636 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:24.055693 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:24.055650 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:24.055693 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:24.055661 2574 projected.go:194] Error preparing data for projected volume kube-api-access-lpc62 for pod openshift-network-diagnostics/network-check-target-5kgnf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:24.055807 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:24.055707 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62 podName:655b7db6-852f-4d19-9975-31ad69976609 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:28.055692735 +0000 UTC m=+10.179385801 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-lpc62" (UniqueName: "kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62") pod "network-check-target-5kgnf" (UID: "655b7db6-852f-4d19-9975-31ad69976609") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:24.462635 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:24.462608 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:24.462803 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:24.462710 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609" Apr 20 20:05:24.463024 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:24.463007 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:24.463119 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:24.463101 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13" Apr 20 20:05:26.464304 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:26.463591 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:26.464304 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:26.463714 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609" Apr 20 20:05:26.464304 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:26.464157 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:26.464304 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:26.464255 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13" Apr 20 20:05:27.988029 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:27.987976 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs\") pod \"network-metrics-daemon-wktd8\" (UID: \"184c92c6-a188-47c2-acbf-e9fe477d6c13\") " pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:27.988470 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:27.988161 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:27.988470 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:27.988240 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs podName:184c92c6-a188-47c2-acbf-e9fe477d6c13 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:35.988220257 +0000 UTC m=+18.111913327 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs") pod "network-metrics-daemon-wktd8" (UID: "184c92c6-a188-47c2-acbf-e9fe477d6c13") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:28.089101 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:28.089061 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpc62\" (UniqueName: \"kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62\") pod \"network-check-target-5kgnf\" (UID: \"655b7db6-852f-4d19-9975-31ad69976609\") " pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:28.089387 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:28.089251 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:28.089387 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:28.089276 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:28.089387 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:28.089288 2574 projected.go:194] Error preparing data for projected volume kube-api-access-lpc62 for pod openshift-network-diagnostics/network-check-target-5kgnf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:28.089387 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:28.089356 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62 podName:655b7db6-852f-4d19-9975-31ad69976609 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:05:36.089337557 +0000 UTC m=+18.213030639 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-lpc62" (UniqueName: "kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62") pod "network-check-target-5kgnf" (UID: "655b7db6-852f-4d19-9975-31ad69976609") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:28.462390 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:28.462353 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:28.462576 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:28.462487 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13" Apr 20 20:05:28.463190 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:28.462995 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:28.463190 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:28.463139 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609" Apr 20 20:05:30.461622 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:30.461549 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:30.461622 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:30.461583 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:30.461993 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:30.461658 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13" Apr 20 20:05:30.461993 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:30.461965 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609" Apr 20 20:05:32.461474 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:32.461438 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:32.461984 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:32.461486 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:32.461984 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:32.461557 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13" Apr 20 20:05:32.461984 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:32.461625 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609" Apr 20 20:05:34.461192 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:34.461157 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:34.461598 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:34.461287 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609" Apr 20 20:05:34.461598 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:34.461346 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:34.461598 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:34.461457 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13" Apr 20 20:05:36.051998 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:36.051957 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs\") pod \"network-metrics-daemon-wktd8\" (UID: \"184c92c6-a188-47c2-acbf-e9fe477d6c13\") " pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:36.052446 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:36.052130 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:36.052446 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:36.052200 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs podName:184c92c6-a188-47c2-acbf-e9fe477d6c13 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:52.052182188 +0000 UTC m=+34.175875256 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs") pod "network-metrics-daemon-wktd8" (UID: "184c92c6-a188-47c2-acbf-e9fe477d6c13") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:36.152886 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:36.152844 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpc62\" (UniqueName: \"kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62\") pod \"network-check-target-5kgnf\" (UID: \"655b7db6-852f-4d19-9975-31ad69976609\") " pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:36.153082 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:36.153029 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:36.153082 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:36.153065 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:36.153082 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:36.153078 2574 projected.go:194] Error preparing data for projected volume kube-api-access-lpc62 for pod openshift-network-diagnostics/network-check-target-5kgnf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:36.153232 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:36.153141 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62 podName:655b7db6-852f-4d19-9975-31ad69976609 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:05:52.153123557 +0000 UTC m=+34.276816622 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-lpc62" (UniqueName: "kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62") pod "network-check-target-5kgnf" (UID: "655b7db6-852f-4d19-9975-31ad69976609") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:36.461424 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:36.461343 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:36.461424 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:36.461384 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:36.461636 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:36.461491 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13" Apr 20 20:05:36.461636 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:36.461600 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609"
Apr 20 20:05:38.461697 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:38.461670 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf"
Apr 20 20:05:38.462104 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:38.461709 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8"
Apr 20 20:05:38.462104 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:38.461829 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609"
Apr 20 20:05:38.462104 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:38.461916 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13"
Apr 20 20:05:39.548144 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.547946 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4wcrc" event={"ID":"f679dfd5-8c86-42f8-823c-e2c7b58decdf","Type":"ContainerStarted","Data":"75a81a00cc2b9584fe1c107f256020cea6ca533b5ae203a86f0806c5ca6ab56b"}
Apr 20 20:05:39.550448 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.550430 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log"
Apr 20 20:05:39.550718 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.550698 2574 generic.go:358] "Generic (PLEG): container finished" podID="eff2e20b-0b3f-4623-b2e9-68404cf5689f" containerID="09e66a1f5119a15e2d1a3853f803b1157d777c92a1040f2c3b6704faf57fd983" exitCode=1
Apr 20 20:05:39.550776 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.550761 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" event={"ID":"eff2e20b-0b3f-4623-b2e9-68404cf5689f","Type":"ContainerStarted","Data":"373db97849553d107ade63b0b52156c6b4a3e053d03727c3b18c785f25d5f454"}
Apr 20 20:05:39.550823 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.550778 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" event={"ID":"eff2e20b-0b3f-4623-b2e9-68404cf5689f","Type":"ContainerStarted","Data":"64c92b9fb730b41814af1291d3bb1ebca666d81286cc9aef7d6d972ff9db51a2"}
Apr 20 20:05:39.550823 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.550789 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" event={"ID":"eff2e20b-0b3f-4623-b2e9-68404cf5689f","Type":"ContainerStarted","Data":"2a4a7f842bbfe0076678125a4a560a1cbe492108a32faaa90b67b0967ff2d7f2"}
Apr 20 20:05:39.550823 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.550801 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" event={"ID":"eff2e20b-0b3f-4623-b2e9-68404cf5689f","Type":"ContainerStarted","Data":"7a8e2b80c8d04494efd0154b80389f36850498545d7938a2d2b578aa1890b3eb"}
Apr 20 20:05:39.550823 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.550812 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" event={"ID":"eff2e20b-0b3f-4623-b2e9-68404cf5689f","Type":"ContainerDied","Data":"09e66a1f5119a15e2d1a3853f803b1157d777c92a1040f2c3b6704faf57fd983"}
Apr 20 20:05:39.550998 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.550824 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" event={"ID":"eff2e20b-0b3f-4623-b2e9-68404cf5689f","Type":"ContainerStarted","Data":"c0cda9ed954791d59f35fb4ccd5de8f1f2cf09f4ee56adec4092c7d5e543780c"}
Apr 20 20:05:39.551899 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.551877 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d4mxm" event={"ID":"ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28","Type":"ContainerStarted","Data":"b00b4ea0d22036500b842f61ab88cee6d8cbf87b620f84f3f30c63da06e18bc7"}
Apr 20 20:05:39.552932 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.552913 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q6qs6" event={"ID":"09d0b535-704a-4945-9235-0ddeba8ad00c","Type":"ContainerStarted","Data":"9795ad6a3050d94416b1ae8934dd25115972a5100561d5fe66347679098be00d"}
Apr 20 20:05:39.553949 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.553922 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ssfnj" event={"ID":"86459264-fd91-425e-8338-70b56d469a74","Type":"ContainerStarted","Data":"1b9328a11518556ad9898d9509f15a006c336b952be18f54df91184c5e73c4c1"}
Apr 20 20:05:39.555132 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.555113 2574 generic.go:358] "Generic (PLEG): container finished" podID="d62d535b-7b78-4f80-8451-fabdfce754d7" containerID="3b3d5f6405bd332937dc48ad2ac7dac2ed5f4f904d59d54e41547368b8e3b6df" exitCode=0
Apr 20 20:05:39.555214 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.555167 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcb2v" event={"ID":"d62d535b-7b78-4f80-8451-fabdfce754d7","Type":"ContainerDied","Data":"3b3d5f6405bd332937dc48ad2ac7dac2ed5f4f904d59d54e41547368b8e3b6df"}
Apr 20 20:05:39.556455 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.556413 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" event={"ID":"e19165a1-00ed-47ec-bfb7-f7b723ee12ac","Type":"ContainerStarted","Data":"5b9bcf610d4c7ba6fbd24ec44b996968410f411eab42456ab86f2b214a1b514c"}
Apr 20 20:05:39.557635 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.557616 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-j65bj" event={"ID":"0ce4f429-5df3-4576-bea7-50ab8358d9f7","Type":"ContainerStarted","Data":"20ed3d49d849d80b3fc9f52eb8a4d49f63ec993c212a20514d557164526a6e9e"}
Apr 20 20:05:39.564612 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.564575 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-234.ec2.internal" podStartSLOduration=20.564565301000002 podStartE2EDuration="20.564565301s" podCreationTimestamp="2026-04-20 20:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:05:23.5375098 +0000 UTC m=+5.661202892" watchObservedRunningTime="2026-04-20 20:05:39.564565301 +0000 UTC m=+21.688258388"
Apr 20 20:05:39.564686 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.564636 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4wcrc" podStartSLOduration=4.366409508 podStartE2EDuration="21.564633837s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:21.232375332 +0000 UTC m=+3.356068400" lastFinishedPulling="2026-04-20 20:05:38.430599658 +0000 UTC m=+20.554292729" observedRunningTime="2026-04-20 20:05:39.564073063 +0000 UTC m=+21.687766152" watchObservedRunningTime="2026-04-20 20:05:39.564633837 +0000 UTC m=+21.688326921"
Apr 20 20:05:39.577329 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.577295 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-d4mxm" podStartSLOduration=4.401390296 podStartE2EDuration="21.57728536s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:21.235580224 +0000 UTC m=+3.359273304" lastFinishedPulling="2026-04-20 20:05:38.411475288 +0000 UTC m=+20.535168368" observedRunningTime="2026-04-20 20:05:39.576965989 +0000 UTC m=+21.700659076" watchObservedRunningTime="2026-04-20 20:05:39.57728536 +0000 UTC m=+21.700978448"
Apr 20 20:05:39.598837 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.598797 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-j65bj" podStartSLOduration=4.420441452 podStartE2EDuration="21.598786429s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:21.233154371 +0000 UTC m=+3.356847443" lastFinishedPulling="2026-04-20 20:05:38.411499333 +0000 UTC m=+20.535192420" observedRunningTime="2026-04-20 20:05:39.598777195 +0000 UTC m=+21.722470282" watchObservedRunningTime="2026-04-20 20:05:39.598786429 +0000 UTC m=+21.722479518"
Apr 20 20:05:39.658235 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.658202 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ssfnj" podStartSLOduration=9.076101701 podStartE2EDuration="21.658191257s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:21.228976404 +0000 UTC m=+3.352669471" lastFinishedPulling="2026-04-20 20:05:33.811065947 +0000 UTC m=+15.934759027" observedRunningTime="2026-04-20 20:05:39.658116707 +0000 UTC m=+21.781809794" watchObservedRunningTime="2026-04-20 20:05:39.658191257 +0000 UTC m=+21.781884345"
Apr 20 20:05:39.658432 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.658414 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-q6qs6" podStartSLOduration=4.569356796 podStartE2EDuration="21.658409517s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:21.230571945 +0000 UTC m=+3.354265011" lastFinishedPulling="2026-04-20 20:05:38.319624665 +0000 UTC m=+20.443317732" observedRunningTime="2026-04-20 20:05:39.640606947 +0000 UTC m=+21.764300034" watchObservedRunningTime="2026-04-20 20:05:39.658409517 +0000 UTC m=+21.782102605"
Apr 20 20:05:39.810258 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:39.810229 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 20:05:40.394171 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:40.394003 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T20:05:39.810250141Z","UUID":"be4e479f-3eaa-4732-8e99-2c1f3f9afdae","Handler":null,"Name":"","Endpoint":""}
Apr 20 20:05:40.396994 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:40.396140 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 20:05:40.396994 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:40.396172 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 20:05:40.461521 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:40.461486 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8"
Apr 20 20:05:40.461654 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:40.461608 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13"
Apr 20 20:05:40.461891 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:40.461493 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf"
Apr 20 20:05:40.462016 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:40.461949 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609"
Apr 20 20:05:40.561980 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:40.561919 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" event={"ID":"e19165a1-00ed-47ec-bfb7-f7b723ee12ac","Type":"ContainerStarted","Data":"7cd05fd094dde567a708c6d8d723b79263754ef47fda94f634da3ab8d7429f26"}
Apr 20 20:05:40.564624 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:40.564596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-d4qvq" event={"ID":"bf1e798d-74ef-4682-bd04-15da759fea59","Type":"ContainerStarted","Data":"d0bd14223a2657d23f109226cfaea4f25b7db7061003f7fb8e26c9b65b40ffe7"}
Apr 20 20:05:40.580853 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:40.580809 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-d4qvq" podStartSLOduration=5.492345191 podStartE2EDuration="22.580796299s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:21.23110553 +0000 UTC m=+3.354798597" lastFinishedPulling="2026-04-20 20:05:38.319556624 +0000 UTC m=+20.443249705" observedRunningTime="2026-04-20 20:05:40.580534331 +0000 UTC m=+22.704227420" watchObservedRunningTime="2026-04-20 20:05:40.580796299 +0000 UTC m=+22.704489387"
Apr 20 20:05:41.567828 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:41.567741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" event={"ID":"e19165a1-00ed-47ec-bfb7-f7b723ee12ac","Type":"ContainerStarted","Data":"f57b0a7e24dc0d0be37a6034432ed34384245069e3835445b7f3a7cfc6832e36"}
Apr 20 20:05:41.570547 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:41.570514 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log"
Apr 20 20:05:41.570949 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:41.570922 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" event={"ID":"eff2e20b-0b3f-4623-b2e9-68404cf5689f","Type":"ContainerStarted","Data":"a4309aafcb4ec2ac0a185c78c043e85010c8032030b9d8d242c2b78e7ed543c6"}
Apr 20 20:05:41.586226 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:41.586171 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c9rks" podStartSLOduration=3.775111252 podStartE2EDuration="23.586157495s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:21.235578804 +0000 UTC m=+3.359271874" lastFinishedPulling="2026-04-20 20:05:41.046625047 +0000 UTC m=+23.170318117" observedRunningTime="2026-04-20 20:05:41.585839589 +0000 UTC m=+23.709532683" watchObservedRunningTime="2026-04-20 20:05:41.586157495 +0000 UTC m=+23.709850582"
Apr 20 20:05:42.461339 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:42.461311 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8"
Apr 20 20:05:42.461572 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:42.461439 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13"
Apr 20 20:05:42.461572 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:42.461503 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf"
Apr 20 20:05:42.461681 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:42.461604 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609"
Apr 20 20:05:43.195739 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:43.195704 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-q6qs6"
Apr 20 20:05:43.196583 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:43.196562 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-q6qs6"
Apr 20 20:05:44.461383 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:44.461135 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf"
Apr 20 20:05:44.461779 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:44.461158 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8"
Apr 20 20:05:44.461779 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:44.461483 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609"
Apr 20 20:05:44.461779 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:44.461529 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13"
Apr 20 20:05:44.577547 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:44.577512 2574 generic.go:358] "Generic (PLEG): container finished" podID="d62d535b-7b78-4f80-8451-fabdfce754d7" containerID="82101b4f25346d483e5e13818560695b857e017ecc6c9e5a288dc83e15ca4e83" exitCode=0
Apr 20 20:05:44.577680 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:44.577549 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcb2v" event={"ID":"d62d535b-7b78-4f80-8451-fabdfce754d7","Type":"ContainerDied","Data":"82101b4f25346d483e5e13818560695b857e017ecc6c9e5a288dc83e15ca4e83"}
Apr 20 20:05:44.580911 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:44.580891 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log"
Apr 20 20:05:44.581263 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:44.581243 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" event={"ID":"eff2e20b-0b3f-4623-b2e9-68404cf5689f","Type":"ContainerStarted","Data":"3c9da8f6310b583a4871d3179738e61a17725185a1d558ec6a170511508bd59a"}
Apr 20 20:05:44.581544 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:44.581525 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:44.581605 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:44.581555 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:44.581605 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:44.581568 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:44.581695 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:44.581651 2574 scope.go:117] "RemoveContainer" containerID="09e66a1f5119a15e2d1a3853f803b1157d777c92a1040f2c3b6704faf57fd983"
Apr 20 20:05:44.599053 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:44.599008 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:44.599612 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:44.599594 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn"
Apr 20 20:05:45.585093 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:45.584837 2574 generic.go:358] "Generic (PLEG): container finished" podID="d62d535b-7b78-4f80-8451-fabdfce754d7" containerID="49bb4c8ce02c14296177ce24d8f638f953929f25936d0300401c260ef3f78640" exitCode=0
Apr 20 20:05:45.585093 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:45.584930 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcb2v" event={"ID":"d62d535b-7b78-4f80-8451-fabdfce754d7","Type":"ContainerDied","Data":"49bb4c8ce02c14296177ce24d8f638f953929f25936d0300401c260ef3f78640"}
Apr 20 20:05:45.588536 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:45.588517 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log"
Apr 20 20:05:45.588844 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:45.588824 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" event={"ID":"eff2e20b-0b3f-4623-b2e9-68404cf5689f","Type":"ContainerStarted","Data":"ce5e4ab084548f1e606e188b7181e4cce9b8d661b65b5773c829b0f053a97623"}
Apr 20 20:05:45.655421 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:45.655392 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-q6qs6"
Apr 20 20:05:45.655583 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:45.655519 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 20 20:05:45.655965 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:45.655947 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-q6qs6"
Apr 20 20:05:45.692755 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:45.692725 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wktd8"]
Apr 20 20:05:45.692903 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:45.692844 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8"
Apr 20 20:05:45.692954 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:45.692931 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13"
Apr 20 20:05:45.695676 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:45.695650 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5kgnf"]
Apr 20 20:05:45.695780 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:45.695761 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf"
Apr 20 20:05:45.695867 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:45.695845 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609"
Apr 20 20:05:45.714778 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:45.714739 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" podStartSLOduration=10.443261615 podStartE2EDuration="27.714727237s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:21.223003657 +0000 UTC m=+3.346696736" lastFinishedPulling="2026-04-20 20:05:38.494469287 +0000 UTC m=+20.618162358" observedRunningTime="2026-04-20 20:05:45.714496051 +0000 UTC m=+27.838189139" watchObservedRunningTime="2026-04-20 20:05:45.714727237 +0000 UTC m=+27.838420325"
Apr 20 20:05:46.592528 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:46.592499 2574 generic.go:358] "Generic (PLEG): container finished" podID="d62d535b-7b78-4f80-8451-fabdfce754d7" containerID="42240d9cd6ac583be53c731ac18ed6132f0f7dd3b446c8b07a1b31f5377d2bac" exitCode=0
Apr 20 20:05:46.593135 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:46.592578 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcb2v" event={"ID":"d62d535b-7b78-4f80-8451-fabdfce754d7","Type":"ContainerDied","Data":"42240d9cd6ac583be53c731ac18ed6132f0f7dd3b446c8b07a1b31f5377d2bac"}
Apr 20 20:05:47.461593 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:47.461552 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf"
Apr 20 20:05:47.461760 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:47.461715 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609"
Apr 20 20:05:47.461833 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:47.461768 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8"
Apr 20 20:05:47.461917 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:47.461884 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13"
Apr 20 20:05:49.460572 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:49.460540 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf"
Apr 20 20:05:49.460572 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:49.460571 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8"
Apr 20 20:05:49.461535 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:49.460648 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609"
Apr 20 20:05:49.461535 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:49.460987 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13"
Apr 20 20:05:51.461296 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.461073 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf"
Apr 20 20:05:51.461746 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.461083 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8"
Apr 20 20:05:51.461746 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:51.461381 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5kgnf" podUID="655b7db6-852f-4d19-9975-31ad69976609"
Apr 20 20:05:51.461746 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:51.461503 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13"
Apr 20 20:05:51.687655 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.687626 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-234.ec2.internal" event="NodeReady"
Apr 20 20:05:51.687842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.687774 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 20:05:51.729581 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.729548 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bh6cg"]
Apr 20 20:05:51.759767 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.759733 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mwhhz"]
Apr 20 20:05:51.759933 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.759788 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bh6cg"
Apr 20 20:05:51.762353 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.762330 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 20:05:51.762930 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.762908 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-c4d6d\""
Apr 20 20:05:51.763071 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.762997 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 20:05:51.778158 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.778136 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bh6cg"]
Apr 20 20:05:51.778158 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.778163 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mwhhz"]
Apr 20 20:05:51.778327 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.778260 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mwhhz"
Apr 20 20:05:51.780794 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.780526 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5cv42\""
Apr 20 20:05:51.780794 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.780543 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 20:05:51.780794 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.780576 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 20:05:51.780794 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.780549 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 20:05:51.873366 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.873320 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert\") pod \"ingress-canary-mwhhz\" (UID: \"da8e59f9-df8a-4e18-98ec-09373ec8bee1\") " pod="openshift-ingress-canary/ingress-canary-mwhhz"
Apr 20 20:05:51.873366 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.873372 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6b65\" (UniqueName: \"kubernetes.io/projected/da8e59f9-df8a-4e18-98ec-09373ec8bee1-kube-api-access-r6b65\") pod \"ingress-canary-mwhhz\" (UID: \"da8e59f9-df8a-4e18-98ec-09373ec8bee1\") " pod="openshift-ingress-canary/ingress-canary-mwhhz"
Apr 20 20:05:51.873598 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.873456 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1c420267-955c-479f-93c5-f3be116a6270-tmp-dir\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg"
Apr 20 20:05:51.873598 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.873501 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c420267-955c-479f-93c5-f3be116a6270-config-volume\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg"
Apr 20 20:05:51.873598 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.873594 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg"
Apr 20 20:05:51.873737 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.873626 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt5x4\" (UniqueName: \"kubernetes.io/projected/1c420267-955c-479f-93c5-f3be116a6270-kube-api-access-pt5x4\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg"
Apr 20 20:05:51.974253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.974176 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c420267-955c-479f-93c5-f3be116a6270-config-volume\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg"
Apr 20 20:05:51.974253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.974230 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg"
Apr 20 20:05:51.974253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.974248 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pt5x4\" (UniqueName: \"kubernetes.io/projected/1c420267-955c-479f-93c5-f3be116a6270-kube-api-access-pt5x4\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg"
Apr 20 20:05:51.974554 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.974274 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert\") pod \"ingress-canary-mwhhz\" (UID: \"da8e59f9-df8a-4e18-98ec-09373ec8bee1\") " pod="openshift-ingress-canary/ingress-canary-mwhhz"
Apr 20 20:05:51.974554 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.974293 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6b65\" (UniqueName: \"kubernetes.io/projected/da8e59f9-df8a-4e18-98ec-09373ec8bee1-kube-api-access-r6b65\") pod \"ingress-canary-mwhhz\" (UID: \"da8e59f9-df8a-4e18-98ec-09373ec8bee1\") " pod="openshift-ingress-canary/ingress-canary-mwhhz"
Apr 20 20:05:51.974554 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.974325 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1c420267-955c-479f-93c5-f3be116a6270-tmp-dir\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg"
Apr 20 20:05:51.974554 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:51.974401 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:05:51.974554 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:51.974439 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:05:51.974554 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:51.974470 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert podName:da8e59f9-df8a-4e18-98ec-09373ec8bee1 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:52.474454131 +0000 UTC m=+34.598147198 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert") pod "ingress-canary-mwhhz" (UID: "da8e59f9-df8a-4e18-98ec-09373ec8bee1") : secret "canary-serving-cert" not found
Apr 20 20:05:51.974554 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:51.974506 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls podName:1c420267-955c-479f-93c5-f3be116a6270 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:52.474488535 +0000 UTC m=+34.598181604 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls") pod "dns-default-bh6cg" (UID: "1c420267-955c-479f-93c5-f3be116a6270") : secret "dns-default-metrics-tls" not found Apr 20 20:05:51.974810 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.974624 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1c420267-955c-479f-93c5-f3be116a6270-tmp-dir\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg" Apr 20 20:05:51.974855 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.974815 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c420267-955c-479f-93c5-f3be116a6270-config-volume\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg" Apr 20 20:05:51.984266 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.984246 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6b65\" (UniqueName: \"kubernetes.io/projected/da8e59f9-df8a-4e18-98ec-09373ec8bee1-kube-api-access-r6b65\") pod \"ingress-canary-mwhhz\" (UID: \"da8e59f9-df8a-4e18-98ec-09373ec8bee1\") " pod="openshift-ingress-canary/ingress-canary-mwhhz" Apr 20 20:05:51.984266 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:51.984259 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt5x4\" (UniqueName: \"kubernetes.io/projected/1c420267-955c-479f-93c5-f3be116a6270-kube-api-access-pt5x4\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg" Apr 20 20:05:52.075608 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:52.075582 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs\") pod \"network-metrics-daemon-wktd8\" (UID: \"184c92c6-a188-47c2-acbf-e9fe477d6c13\") " pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:52.075733 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:52.075694 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:52.075769 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:52.075743 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs podName:184c92c6-a188-47c2-acbf-e9fe477d6c13 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:24.075729456 +0000 UTC m=+66.199422521 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs") pod "network-metrics-daemon-wktd8" (UID: "184c92c6-a188-47c2-acbf-e9fe477d6c13") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:52.176795 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:52.176768 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpc62\" (UniqueName: \"kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62\") pod \"network-check-target-5kgnf\" (UID: \"655b7db6-852f-4d19-9975-31ad69976609\") " pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:52.176942 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:52.176914 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:52.176983 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:52.176949 2574 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:52.176983 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:52.176959 2574 projected.go:194] Error preparing data for projected volume kube-api-access-lpc62 for pod openshift-network-diagnostics/network-check-target-5kgnf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:52.177076 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:52.177003 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62 podName:655b7db6-852f-4d19-9975-31ad69976609 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:24.17699067 +0000 UTC m=+66.300683736 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-lpc62" (UniqueName: "kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62") pod "network-check-target-5kgnf" (UID: "655b7db6-852f-4d19-9975-31ad69976609") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:52.479217 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:52.479190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg" Apr 20 20:05:52.479617 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:52.479234 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert\") pod \"ingress-canary-mwhhz\" 
(UID: \"da8e59f9-df8a-4e18-98ec-09373ec8bee1\") " pod="openshift-ingress-canary/ingress-canary-mwhhz" Apr 20 20:05:52.479617 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:52.479383 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:05:52.479617 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:52.479454 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls podName:1c420267-955c-479f-93c5-f3be116a6270 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:53.479437439 +0000 UTC m=+35.603130506 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls") pod "dns-default-bh6cg" (UID: "1c420267-955c-479f-93c5-f3be116a6270") : secret "dns-default-metrics-tls" not found Apr 20 20:05:52.479617 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:52.479462 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:05:52.479617 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:52.479506 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert podName:da8e59f9-df8a-4e18-98ec-09373ec8bee1 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:53.479490287 +0000 UTC m=+35.603183368 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert") pod "ingress-canary-mwhhz" (UID: "da8e59f9-df8a-4e18-98ec-09373ec8bee1") : secret "canary-serving-cert" not found Apr 20 20:05:52.606624 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:52.606582 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcb2v" event={"ID":"d62d535b-7b78-4f80-8451-fabdfce754d7","Type":"ContainerStarted","Data":"654ff1aeadaeb99b20a638c05ffa3d9ec996772a4b95a04fce28b53e76b885d5"} Apr 20 20:05:53.460991 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:53.460954 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:05:53.461180 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:53.460954 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:05:53.463738 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:53.463702 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 20:05:53.463811 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:53.463709 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jt5w7\"" Apr 20 20:05:53.463811 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:53.463711 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 20:05:53.463811 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:53.463785 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 20:05:53.464595 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:53.464581 2574 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lvnms\"" Apr 20 20:05:53.485739 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:53.485710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg" Apr 20 20:05:53.486054 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:53.485751 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert\") pod \"ingress-canary-mwhhz\" (UID: \"da8e59f9-df8a-4e18-98ec-09373ec8bee1\") " pod="openshift-ingress-canary/ingress-canary-mwhhz" Apr 20 20:05:53.486054 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:53.485845 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:05:53.486054 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:53.485851 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:05:53.486054 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:53.485903 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls podName:1c420267-955c-479f-93c5-f3be116a6270 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:55.485887198 +0000 UTC m=+37.609580264 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls") pod "dns-default-bh6cg" (UID: "1c420267-955c-479f-93c5-f3be116a6270") : secret "dns-default-metrics-tls" not found Apr 20 20:05:53.486054 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:53.485918 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert podName:da8e59f9-df8a-4e18-98ec-09373ec8bee1 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:55.485910259 +0000 UTC m=+37.609603325 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert") pod "ingress-canary-mwhhz" (UID: "da8e59f9-df8a-4e18-98ec-09373ec8bee1") : secret "canary-serving-cert" not found Apr 20 20:05:53.611076 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:53.611030 2574 generic.go:358] "Generic (PLEG): container finished" podID="d62d535b-7b78-4f80-8451-fabdfce754d7" containerID="654ff1aeadaeb99b20a638c05ffa3d9ec996772a4b95a04fce28b53e76b885d5" exitCode=0 Apr 20 20:05:53.611076 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:53.611067 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcb2v" event={"ID":"d62d535b-7b78-4f80-8451-fabdfce754d7","Type":"ContainerDied","Data":"654ff1aeadaeb99b20a638c05ffa3d9ec996772a4b95a04fce28b53e76b885d5"} Apr 20 20:05:54.615570 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:54.615541 2574 generic.go:358] "Generic (PLEG): container finished" podID="d62d535b-7b78-4f80-8451-fabdfce754d7" containerID="b0edd7c31f6f4005df151a33f2b22f2a7ab25c826b9a100f815f31eec6b73098" exitCode=0 Apr 20 20:05:54.615570 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:54.615576 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcb2v" 
event={"ID":"d62d535b-7b78-4f80-8451-fabdfce754d7","Type":"ContainerDied","Data":"b0edd7c31f6f4005df151a33f2b22f2a7ab25c826b9a100f815f31eec6b73098"} Apr 20 20:05:55.500381 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:55.500301 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert\") pod \"ingress-canary-mwhhz\" (UID: \"da8e59f9-df8a-4e18-98ec-09373ec8bee1\") " pod="openshift-ingress-canary/ingress-canary-mwhhz" Apr 20 20:05:55.500519 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:55.500382 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg" Apr 20 20:05:55.500519 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:55.500452 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:05:55.500519 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:55.500463 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:05:55.500519 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:55.500515 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert podName:da8e59f9-df8a-4e18-98ec-09373ec8bee1 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:59.500498983 +0000 UTC m=+41.624192048 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert") pod "ingress-canary-mwhhz" (UID: "da8e59f9-df8a-4e18-98ec-09373ec8bee1") : secret "canary-serving-cert" not found Apr 20 20:05:55.500653 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:55.500546 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls podName:1c420267-955c-479f-93c5-f3be116a6270 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:59.500522757 +0000 UTC m=+41.624215822 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls") pod "dns-default-bh6cg" (UID: "1c420267-955c-479f-93c5-f3be116a6270") : secret "dns-default-metrics-tls" not found Apr 20 20:05:55.620320 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:55.620291 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcb2v" event={"ID":"d62d535b-7b78-4f80-8451-fabdfce754d7","Type":"ContainerStarted","Data":"32c27b44a187293168cf36285822d9950069761e4b241edc54be3c2086a21c85"} Apr 20 20:05:55.642490 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:55.642443 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lcb2v" podStartSLOduration=6.441510289 podStartE2EDuration="37.642428616s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:21.218907042 +0000 UTC m=+3.342600108" lastFinishedPulling="2026-04-20 20:05:52.419825354 +0000 UTC m=+34.543518435" observedRunningTime="2026-04-20 20:05:55.641310896 +0000 UTC m=+37.765003975" watchObservedRunningTime="2026-04-20 20:05:55.642428616 +0000 UTC m=+37.766121704" Apr 20 20:05:59.525836 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:59.525803 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg" Apr 20 20:05:59.526307 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:05:59.525846 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert\") pod \"ingress-canary-mwhhz\" (UID: \"da8e59f9-df8a-4e18-98ec-09373ec8bee1\") " pod="openshift-ingress-canary/ingress-canary-mwhhz" Apr 20 20:05:59.526307 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:59.525932 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:05:59.526307 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:59.525934 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:05:59.526307 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:59.525980 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls podName:1c420267-955c-479f-93c5-f3be116a6270 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:07.525967552 +0000 UTC m=+49.649660618 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls") pod "dns-default-bh6cg" (UID: "1c420267-955c-479f-93c5-f3be116a6270") : secret "dns-default-metrics-tls" not found Apr 20 20:05:59.526307 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:05:59.525994 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert podName:da8e59f9-df8a-4e18-98ec-09373ec8bee1 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:07.525988047 +0000 UTC m=+49.649681113 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert") pod "ingress-canary-mwhhz" (UID: "da8e59f9-df8a-4e18-98ec-09373ec8bee1") : secret "canary-serving-cert" not found Apr 20 20:06:07.580383 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:07.580338 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg" Apr 20 20:06:07.580829 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:07.580391 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert\") pod \"ingress-canary-mwhhz\" (UID: \"da8e59f9-df8a-4e18-98ec-09373ec8bee1\") " pod="openshift-ingress-canary/ingress-canary-mwhhz" Apr 20 20:06:07.580829 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:06:07.580498 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:07.580829 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:06:07.580498 2574 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:07.580829 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:06:07.580575 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert podName:da8e59f9-df8a-4e18-98ec-09373ec8bee1 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:23.580560247 +0000 UTC m=+65.704253312 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert") pod "ingress-canary-mwhhz" (UID: "da8e59f9-df8a-4e18-98ec-09373ec8bee1") : secret "canary-serving-cert" not found Apr 20 20:06:07.580829 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:06:07.580589 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls podName:1c420267-955c-479f-93c5-f3be116a6270 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:23.580583681 +0000 UTC m=+65.704276747 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls") pod "dns-default-bh6cg" (UID: "1c420267-955c-479f-93c5-f3be116a6270") : secret "dns-default-metrics-tls" not found Apr 20 20:06:16.604133 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:16.604100 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j6wvn" Apr 20 20:06:23.589482 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:23.589440 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg" Apr 20 20:06:23.589482 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:23.589491 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert\") pod \"ingress-canary-mwhhz\" (UID: \"da8e59f9-df8a-4e18-98ec-09373ec8bee1\") " pod="openshift-ingress-canary/ingress-canary-mwhhz" Apr 20 20:06:23.590061 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:06:23.589595 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:23.590061 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:06:23.589616 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:23.590061 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:06:23.589655 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert podName:da8e59f9-df8a-4e18-98ec-09373ec8bee1 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:06:55.589642826 +0000 UTC m=+97.713335892 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert") pod "ingress-canary-mwhhz" (UID: "da8e59f9-df8a-4e18-98ec-09373ec8bee1") : secret "canary-serving-cert" not found Apr 20 20:06:23.590061 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:06:23.589693 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls podName:1c420267-955c-479f-93c5-f3be116a6270 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:55.589674457 +0000 UTC m=+97.713367534 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls") pod "dns-default-bh6cg" (UID: "1c420267-955c-479f-93c5-f3be116a6270") : secret "dns-default-metrics-tls" not found Apr 20 20:06:24.093271 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:24.093227 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs\") pod \"network-metrics-daemon-wktd8\" (UID: \"184c92c6-a188-47c2-acbf-e9fe477d6c13\") " pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:06:24.095825 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:24.095805 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 20:06:24.103845 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:06:24.103824 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:06:24.103908 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:06:24.103897 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs podName:184c92c6-a188-47c2-acbf-e9fe477d6c13 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:28.103875596 +0000 UTC m=+130.227568665 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs") pod "network-metrics-daemon-wktd8" (UID: "184c92c6-a188-47c2-acbf-e9fe477d6c13") : secret "metrics-daemon-secret" not found Apr 20 20:06:24.193786 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:24.193754 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpc62\" (UniqueName: \"kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62\") pod \"network-check-target-5kgnf\" (UID: \"655b7db6-852f-4d19-9975-31ad69976609\") " pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:06:24.196810 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:24.196793 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 20:06:24.207389 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:24.207372 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 20:06:24.218816 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:24.218785 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpc62\" (UniqueName: \"kubernetes.io/projected/655b7db6-852f-4d19-9975-31ad69976609-kube-api-access-lpc62\") pod \"network-check-target-5kgnf\" (UID: \"655b7db6-852f-4d19-9975-31ad69976609\") " pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:06:24.372522 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:24.372452 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lvnms\"" Apr 20 20:06:24.381242 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:24.381221 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 20:06:24.517916 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:24.517884 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5kgnf"] Apr 20 20:06:24.521268 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:06:24.521238 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod655b7db6_852f_4d19_9975_31ad69976609.slice/crio-b48f9136f360871bf0204ba261552b8f89d687ef874b19b1fc51dcd9a3ca6949 WatchSource:0}: Error finding container b48f9136f360871bf0204ba261552b8f89d687ef874b19b1fc51dcd9a3ca6949: Status 404 returned error can't find the container with id b48f9136f360871bf0204ba261552b8f89d687ef874b19b1fc51dcd9a3ca6949 Apr 20 20:06:24.674826 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:24.674739 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5kgnf" event={"ID":"655b7db6-852f-4d19-9975-31ad69976609","Type":"ContainerStarted","Data":"b48f9136f360871bf0204ba261552b8f89d687ef874b19b1fc51dcd9a3ca6949"} Apr 20 20:06:27.681817 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:27.681778 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5kgnf" event={"ID":"655b7db6-852f-4d19-9975-31ad69976609","Type":"ContainerStarted","Data":"0fc52c4f6443e388a41786081d25ff327c69025ff40dbc0157cd8d21eaee7bb6"} Apr 20 20:06:27.682229 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:27.681940 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5kgnf" Apr 20 
20:06:27.696006 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:27.695937 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5kgnf" podStartSLOduration=66.681410299 podStartE2EDuration="1m9.695920934s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:06:24.523217144 +0000 UTC m=+66.646910211" lastFinishedPulling="2026-04-20 20:06:27.53772778 +0000 UTC m=+69.661420846" observedRunningTime="2026-04-20 20:06:27.695372328 +0000 UTC m=+69.819065417" watchObservedRunningTime="2026-04-20 20:06:27.695920934 +0000 UTC m=+69.819614023"
Apr 20 20:06:55.595715 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:55.595673 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert\") pod \"ingress-canary-mwhhz\" (UID: \"da8e59f9-df8a-4e18-98ec-09373ec8bee1\") " pod="openshift-ingress-canary/ingress-canary-mwhhz"
Apr 20 20:06:55.596198 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:55.595756 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg"
Apr 20 20:06:55.596198 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:06:55.595826 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:06:55.596198 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:06:55.595861 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:06:55.596198 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:06:55.595890 2574 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert podName:da8e59f9-df8a-4e18-98ec-09373ec8bee1 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:59.595874929 +0000 UTC m=+161.719567995 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert") pod "ingress-canary-mwhhz" (UID: "da8e59f9-df8a-4e18-98ec-09373ec8bee1") : secret "canary-serving-cert" not found
Apr 20 20:06:55.596198 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:06:55.595924 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls podName:1c420267-955c-479f-93c5-f3be116a6270 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:59.595908476 +0000 UTC m=+161.719601543 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls") pod "dns-default-bh6cg" (UID: "1c420267-955c-479f-93c5-f3be116a6270") : secret "dns-default-metrics-tls" not found
Apr 20 20:06:58.686008 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:06:58.685978 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5kgnf"
Apr 20 20:07:13.064613 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.064578 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm"]
Apr 20 20:07:13.069437 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.069414 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm"
Apr 20 20:07:13.072157 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.072133 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-74t9l"]
Apr 20 20:07:13.073968 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.073944 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 20 20:07:13.074106 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.073988 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 20:07:13.074106 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.073999 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 20:07:13.074106 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.074018 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 20 20:07:13.074257 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.074098 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-xmmbb\""
Apr 20 20:07:13.075278 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.075256 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.077277 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.077259 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-96kt2\""
Apr 20 20:07:13.077374 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.077308 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 20:07:13.077374 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.077309 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 20 20:07:13.077684 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.077669 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 20:07:13.077684 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.077680 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 20 20:07:13.080936 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.080912 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm"]
Apr 20 20:07:13.083731 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.083703 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-74t9l"]
Apr 20 20:07:13.084404 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.084382 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 20 20:07:13.107874 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.107843 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld79x\" (UniqueName:
\"kubernetes.io/projected/78447207-5d22-46b9-9ad3-a68cd998c91a-kube-api-access-ld79x\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.107874 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.107876 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78447207-5d22-46b9-9ad3-a68cd998c91a-serving-cert\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.108062 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.107914 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fkttm\" (UID: \"5999adb7-d895-4660-8bf8-546e2d8dd27c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm"
Apr 20 20:07:13.108062 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.107957 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/78447207-5d22-46b9-9ad3-a68cd998c91a-tmp\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.108062 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.107994 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78447207-5d22-46b9-9ad3-a68cd998c91a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-74t9l\" (UID:
\"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.108162 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.108101 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/78447207-5d22-46b9-9ad3-a68cd998c91a-snapshots\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.108162 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.108132 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxcjj\" (UniqueName: \"kubernetes.io/projected/5999adb7-d895-4660-8bf8-546e2d8dd27c-kube-api-access-pxcjj\") pod \"cluster-monitoring-operator-75587bd455-fkttm\" (UID: \"5999adb7-d895-4660-8bf8-546e2d8dd27c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm"
Apr 20 20:07:13.108223 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.108166 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78447207-5d22-46b9-9ad3-a68cd998c91a-service-ca-bundle\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.108223 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.108202 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5999adb7-d895-4660-8bf8-546e2d8dd27c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fkttm\" (UID: \"5999adb7-d895-4660-8bf8-546e2d8dd27c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm"
Apr 20 20:07:13.208981 ip-10-0-131-234
kubenswrapper[2574]: I0420 20:07:13.208949 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78447207-5d22-46b9-9ad3-a68cd998c91a-serving-cert\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.209169 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.208994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fkttm\" (UID: \"5999adb7-d895-4660-8bf8-546e2d8dd27c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm"
Apr 20 20:07:13.209169 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.209015 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/78447207-5d22-46b9-9ad3-a68cd998c91a-tmp\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.209169 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.209065 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78447207-5d22-46b9-9ad3-a68cd998c91a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.209169 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.209104 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/78447207-5d22-46b9-9ad3-a68cd998c91a-snapshots\") pod
\"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.209169 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:13.209128 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 20:07:13.209422 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:13.209201 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls podName:5999adb7-d895-4660-8bf8-546e2d8dd27c nodeName:}" failed. No retries permitted until 2026-04-20 20:07:13.709182579 +0000 UTC m=+115.832875660 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fkttm" (UID: "5999adb7-d895-4660-8bf8-546e2d8dd27c") : secret "cluster-monitoring-operator-tls" not found
Apr 20 20:07:13.209422 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.209131 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxcjj\" (UniqueName: \"kubernetes.io/projected/5999adb7-d895-4660-8bf8-546e2d8dd27c-kube-api-access-pxcjj\") pod \"cluster-monitoring-operator-75587bd455-fkttm\" (UID: \"5999adb7-d895-4660-8bf8-546e2d8dd27c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm"
Apr 20 20:07:13.209422 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.209286 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78447207-5d22-46b9-9ad3-a68cd998c91a-service-ca-bundle\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") "
pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.209422 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.209334 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5999adb7-d895-4660-8bf8-546e2d8dd27c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fkttm\" (UID: \"5999adb7-d895-4660-8bf8-546e2d8dd27c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm"
Apr 20 20:07:13.209422 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.209382 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ld79x\" (UniqueName: \"kubernetes.io/projected/78447207-5d22-46b9-9ad3-a68cd998c91a-kube-api-access-ld79x\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.209675 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.209517 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/78447207-5d22-46b9-9ad3-a68cd998c91a-tmp\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.209848 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.209821 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/78447207-5d22-46b9-9ad3-a68cd998c91a-snapshots\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.210084 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.210062 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName:
\"kubernetes.io/configmap/78447207-5d22-46b9-9ad3-a68cd998c91a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.210158 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.210130 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5999adb7-d895-4660-8bf8-546e2d8dd27c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fkttm\" (UID: \"5999adb7-d895-4660-8bf8-546e2d8dd27c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm"
Apr 20 20:07:13.210158 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.210145 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78447207-5d22-46b9-9ad3-a68cd998c91a-service-ca-bundle\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.211501 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.211481 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78447207-5d22-46b9-9ad3-a68cd998c91a-serving-cert\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.217642 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.217614 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxcjj\" (UniqueName: \"kubernetes.io/projected/5999adb7-d895-4660-8bf8-546e2d8dd27c-kube-api-access-pxcjj\") pod \"cluster-monitoring-operator-75587bd455-fkttm\" (UID: \"5999adb7-d895-4660-8bf8-546e2d8dd27c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm"
Apr 20
20:07:13.217728 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.217642 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld79x\" (UniqueName: \"kubernetes.io/projected/78447207-5d22-46b9-9ad3-a68cd998c91a-kube-api-access-ld79x\") pod \"insights-operator-585dfdc468-74t9l\" (UID: \"78447207-5d22-46b9-9ad3-a68cd998c91a\") " pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.280732 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.280706 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5d49c56f64-txz9l"]
Apr 20 20:07:13.285429 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.285414 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20 20:07:13.287646 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.287623 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 20:07:13.287646 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.287642 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-j7xv8\""
Apr 20 20:07:13.287808 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.287716 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 20:07:13.287808 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.287791 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 20:07:13.292280 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.292260 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 20:07:13.294974 ip-10-0-131-234 kubenswrapper[2574]:
I0420 20:07:13.294956 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d49c56f64-txz9l"]
Apr 20 20:07:13.310159 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.310134 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b6e5b82-3387-4fed-b751-81a011f3b96b-trusted-ca\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20 20:07:13.310253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.310164 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b6e5b82-3387-4fed-b751-81a011f3b96b-image-registry-private-configuration\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20 20:07:13.310253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.310185 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-bound-sa-token\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20 20:07:13.310253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.310216 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwxzs\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-kube-api-access-pwxzs\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20
20:07:13.310353 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.310285 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20 20:07:13.310353 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.310318 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-certificates\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20 20:07:13.310413 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.310362 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b6e5b82-3387-4fed-b751-81a011f3b96b-installation-pull-secrets\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20 20:07:13.310448 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.310424 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b6e5b82-3387-4fed-b751-81a011f3b96b-ca-trust-extracted\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20 20:07:13.383730 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.383673 2574 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-56p98"]
Apr 20 20:07:13.386676 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.386660 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-56p98"
Apr 20 20:07:13.386755 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.386701 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-74t9l"
Apr 20 20:07:13.386804 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.386781 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-rxlpd"]
Apr 20 20:07:13.389014 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.388989 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-9v9b7\""
Apr 20 20:07:13.389014 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.389010 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 20 20:07:13.389789 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.389771 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd"
Apr 20 20:07:13.390858 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.390651 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:07:13.392361 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.392238 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 20 20:07:13.392361 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.392309 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-n9g7f\""
Apr 20 20:07:13.392526 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.392395 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 20 20:07:13.392526 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.392395 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 20 20:07:13.392631 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.392541 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:07:13.395488 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.395467 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-56p98"]
Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.412786 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.413268 2574 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.413299 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-certificates\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.413335 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee1fb92b-6a4c-4495-93a7-29206e6b8642-serving-cert\") pod \"console-operator-9d4b6777b-rxlpd\" (UID: \"ee1fb92b-6a4c-4495-93a7-29206e6b8642\") " pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd"
Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.413365 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqvhq\" (UniqueName: \"kubernetes.io/projected/b227ca2d-a313-4e30-ab0f-03bda0c2db1f-kube-api-access-xqvhq\") pod \"volume-data-source-validator-7c6cbb6c87-56p98\" (UID: \"b227ca2d-a313-4e30-ab0f-03bda0c2db1f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-56p98"
Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.413417 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b6e5b82-3387-4fed-b751-81a011f3b96b-installation-pull-secrets\") pod
\"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.413448 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee1fb92b-6a4c-4495-93a7-29206e6b8642-trusted-ca\") pod \"console-operator-9d4b6777b-rxlpd\" (UID: \"ee1fb92b-6a4c-4495-93a7-29206e6b8642\") " pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd"
Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.413496 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b6e5b82-3387-4fed-b751-81a011f3b96b-ca-trust-extracted\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.413519 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b6e5b82-3387-4fed-b751-81a011f3b96b-trusted-ca\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.413544 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1fb92b-6a4c-4495-93a7-29206e6b8642-config\") pod \"console-operator-9d4b6777b-rxlpd\" (UID: \"ee1fb92b-6a4c-4495-93a7-29206e6b8642\") " pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd"
Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.413575 2574 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b6e5b82-3387-4fed-b751-81a011f3b96b-image-registry-private-configuration\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.413603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-bound-sa-token\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.413652 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwxzs\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-kube-api-access-pwxzs\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.413677 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dncmw\" (UniqueName: \"kubernetes.io/projected/ee1fb92b-6a4c-4495-93a7-29206e6b8642-kube-api-access-dncmw\") pod \"console-operator-9d4b6777b-rxlpd\" (UID: \"ee1fb92b-6a4c-4495-93a7-29206e6b8642\") " pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:13.413819 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:07:13.423232 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:13.413833 2574 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d49c56f64-txz9l: secret "image-registry-tls" not found Apr 20 20:07:13.424080 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:13.413891 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls podName:1b6e5b82-3387-4fed-b751-81a011f3b96b nodeName:}" failed. No retries permitted until 2026-04-20 20:07:13.913872797 +0000 UTC m=+116.037565867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls") pod "image-registry-5d49c56f64-txz9l" (UID: "1b6e5b82-3387-4fed-b751-81a011f3b96b") : secret "image-registry-tls" not found Apr 20 20:07:13.424080 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.417943 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b6e5b82-3387-4fed-b751-81a011f3b96b-ca-trust-extracted\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:13.424080 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.419479 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b6e5b82-3387-4fed-b751-81a011f3b96b-trusted-ca\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:13.427205 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.426939 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-rxlpd"] Apr 20 20:07:13.427442 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.427421 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-certificates\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:13.429697 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.429646 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b6e5b82-3387-4fed-b751-81a011f3b96b-image-registry-private-configuration\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:13.433278 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.432586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwxzs\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-kube-api-access-pwxzs\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:13.434720 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.434689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-bound-sa-token\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:13.435368 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.435334 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b6e5b82-3387-4fed-b751-81a011f3b96b-installation-pull-secrets\") pod \"image-registry-5d49c56f64-txz9l\" (UID: 
\"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:13.514990 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.514959 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee1fb92b-6a4c-4495-93a7-29206e6b8642-serving-cert\") pod \"console-operator-9d4b6777b-rxlpd\" (UID: \"ee1fb92b-6a4c-4495-93a7-29206e6b8642\") " pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" Apr 20 20:07:13.515159 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.515005 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqvhq\" (UniqueName: \"kubernetes.io/projected/b227ca2d-a313-4e30-ab0f-03bda0c2db1f-kube-api-access-xqvhq\") pod \"volume-data-source-validator-7c6cbb6c87-56p98\" (UID: \"b227ca2d-a313-4e30-ab0f-03bda0c2db1f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-56p98" Apr 20 20:07:13.515159 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.515104 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee1fb92b-6a4c-4495-93a7-29206e6b8642-trusted-ca\") pod \"console-operator-9d4b6777b-rxlpd\" (UID: \"ee1fb92b-6a4c-4495-93a7-29206e6b8642\") " pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" Apr 20 20:07:13.515159 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.515144 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1fb92b-6a4c-4495-93a7-29206e6b8642-config\") pod \"console-operator-9d4b6777b-rxlpd\" (UID: \"ee1fb92b-6a4c-4495-93a7-29206e6b8642\") " pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" Apr 20 20:07:13.515304 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.515183 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dncmw\" (UniqueName: \"kubernetes.io/projected/ee1fb92b-6a4c-4495-93a7-29206e6b8642-kube-api-access-dncmw\") pod \"console-operator-9d4b6777b-rxlpd\" (UID: \"ee1fb92b-6a4c-4495-93a7-29206e6b8642\") " pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" Apr 20 20:07:13.515820 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.515803 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1fb92b-6a4c-4495-93a7-29206e6b8642-config\") pod \"console-operator-9d4b6777b-rxlpd\" (UID: \"ee1fb92b-6a4c-4495-93a7-29206e6b8642\") " pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" Apr 20 20:07:13.516171 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.516154 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee1fb92b-6a4c-4495-93a7-29206e6b8642-trusted-ca\") pod \"console-operator-9d4b6777b-rxlpd\" (UID: \"ee1fb92b-6a4c-4495-93a7-29206e6b8642\") " pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" Apr 20 20:07:13.517338 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.517320 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee1fb92b-6a4c-4495-93a7-29206e6b8642-serving-cert\") pod \"console-operator-9d4b6777b-rxlpd\" (UID: \"ee1fb92b-6a4c-4495-93a7-29206e6b8642\") " pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" Apr 20 20:07:13.523425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.523400 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dncmw\" (UniqueName: \"kubernetes.io/projected/ee1fb92b-6a4c-4495-93a7-29206e6b8642-kube-api-access-dncmw\") pod \"console-operator-9d4b6777b-rxlpd\" (UID: \"ee1fb92b-6a4c-4495-93a7-29206e6b8642\") " pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" Apr 20 
20:07:13.523547 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.523531 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqvhq\" (UniqueName: \"kubernetes.io/projected/b227ca2d-a313-4e30-ab0f-03bda0c2db1f-kube-api-access-xqvhq\") pod \"volume-data-source-validator-7c6cbb6c87-56p98\" (UID: \"b227ca2d-a313-4e30-ab0f-03bda0c2db1f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-56p98" Apr 20 20:07:13.526293 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.526272 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-74t9l"] Apr 20 20:07:13.529212 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:07:13.529189 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78447207_5d22_46b9_9ad3_a68cd998c91a.slice/crio-2b47738c670ec016022f78ed44b474780b71bd9252c1a26cbfa7a5b72d495415 WatchSource:0}: Error finding container 2b47738c670ec016022f78ed44b474780b71bd9252c1a26cbfa7a5b72d495415: Status 404 returned error can't find the container with id 2b47738c670ec016022f78ed44b474780b71bd9252c1a26cbfa7a5b72d495415 Apr 20 20:07:13.716198 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.716120 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fkttm\" (UID: \"5999adb7-d895-4660-8bf8-546e2d8dd27c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm" Apr 20 20:07:13.716325 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:13.716270 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:07:13.716361 ip-10-0-131-234 kubenswrapper[2574]: E0420 
20:07:13.716332 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls podName:5999adb7-d895-4660-8bf8-546e2d8dd27c nodeName:}" failed. No retries permitted until 2026-04-20 20:07:14.716317874 +0000 UTC m=+116.840010940 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fkttm" (UID: "5999adb7-d895-4660-8bf8-546e2d8dd27c") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:07:13.726793 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.726773 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-56p98" Apr 20 20:07:13.735424 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.735407 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" Apr 20 20:07:13.774523 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.774475 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-74t9l" event={"ID":"78447207-5d22-46b9-9ad3-a68cd998c91a","Type":"ContainerStarted","Data":"2b47738c670ec016022f78ed44b474780b71bd9252c1a26cbfa7a5b72d495415"} Apr 20 20:07:13.854740 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.854708 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-56p98"] Apr 20 20:07:13.857547 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:07:13.857518 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb227ca2d_a313_4e30_ab0f_03bda0c2db1f.slice/crio-13e19af5bb5c90ac9bfe3c1423e94daac728c3b3a5a1bc84ee01a09d54efc468 WatchSource:0}: Error finding container 13e19af5bb5c90ac9bfe3c1423e94daac728c3b3a5a1bc84ee01a09d54efc468: Status 404 returned error can't find the container with id 13e19af5bb5c90ac9bfe3c1423e94daac728c3b3a5a1bc84ee01a09d54efc468 Apr 20 20:07:13.872109 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.870519 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-rxlpd"] Apr 20 20:07:13.873858 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:07:13.873831 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee1fb92b_6a4c_4495_93a7_29206e6b8642.slice/crio-d1e5e979b28b96c5dff573f275b95ceedd614ae094671cef4778a1e5692a98ab WatchSource:0}: Error finding container d1e5e979b28b96c5dff573f275b95ceedd614ae094671cef4778a1e5692a98ab: Status 404 returned error can't find the container with id d1e5e979b28b96c5dff573f275b95ceedd614ae094671cef4778a1e5692a98ab Apr 20 
20:07:13.917120 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:13.917095 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:13.917287 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:13.917268 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:07:13.917354 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:13.917291 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d49c56f64-txz9l: secret "image-registry-tls" not found Apr 20 20:07:13.917405 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:13.917367 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls podName:1b6e5b82-3387-4fed-b751-81a011f3b96b nodeName:}" failed. No retries permitted until 2026-04-20 20:07:14.917346091 +0000 UTC m=+117.041039178 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls") pod "image-registry-5d49c56f64-txz9l" (UID: "1b6e5b82-3387-4fed-b751-81a011f3b96b") : secret "image-registry-tls" not found Apr 20 20:07:14.724463 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:14.724331 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fkttm\" (UID: \"5999adb7-d895-4660-8bf8-546e2d8dd27c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm" Apr 20 20:07:14.724905 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:14.724492 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:07:14.724905 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:14.724554 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls podName:5999adb7-d895-4660-8bf8-546e2d8dd27c nodeName:}" failed. No retries permitted until 2026-04-20 20:07:16.724540098 +0000 UTC m=+118.848233166 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fkttm" (UID: "5999adb7-d895-4660-8bf8-546e2d8dd27c") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:07:14.777974 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:14.777935 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" event={"ID":"ee1fb92b-6a4c-4495-93a7-29206e6b8642","Type":"ContainerStarted","Data":"d1e5e979b28b96c5dff573f275b95ceedd614ae094671cef4778a1e5692a98ab"} Apr 20 20:07:14.779199 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:14.779160 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-56p98" event={"ID":"b227ca2d-a313-4e30-ab0f-03bda0c2db1f","Type":"ContainerStarted","Data":"13e19af5bb5c90ac9bfe3c1423e94daac728c3b3a5a1bc84ee01a09d54efc468"} Apr 20 20:07:14.926110 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:14.926071 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:14.926281 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:14.926258 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:07:14.926281 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:14.926281 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d49c56f64-txz9l: secret "image-registry-tls" not found Apr 20 20:07:14.926395 ip-10-0-131-234 
kubenswrapper[2574]: E0420 20:07:14.926344 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls podName:1b6e5b82-3387-4fed-b751-81a011f3b96b nodeName:}" failed. No retries permitted until 2026-04-20 20:07:16.926327455 +0000 UTC m=+119.050020522 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls") pod "image-registry-5d49c56f64-txz9l" (UID: "1b6e5b82-3387-4fed-b751-81a011f3b96b") : secret "image-registry-tls" not found Apr 20 20:07:16.742351 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:16.742259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fkttm\" (UID: \"5999adb7-d895-4660-8bf8-546e2d8dd27c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm" Apr 20 20:07:16.742686 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:16.742402 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:07:16.742686 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:16.742466 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls podName:5999adb7-d895-4660-8bf8-546e2d8dd27c nodeName:}" failed. No retries permitted until 2026-04-20 20:07:20.742451897 +0000 UTC m=+122.866144963 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fkttm" (UID: "5999adb7-d895-4660-8bf8-546e2d8dd27c") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:07:16.783856 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:16.783815 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-74t9l" event={"ID":"78447207-5d22-46b9-9ad3-a68cd998c91a","Type":"ContainerStarted","Data":"eeb3bc1d024f5bb628b154ccec760d817cd4bd73f35cbed28bb927dd33734da7"} Apr 20 20:07:16.785048 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:16.785014 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-56p98" event={"ID":"b227ca2d-a313-4e30-ab0f-03bda0c2db1f","Type":"ContainerStarted","Data":"d9a1104fc21b389332dcaee679cb9f637d7b0b7c67d985d6135f78892523cdff"} Apr 20 20:07:16.786245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:16.786229 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/0.log" Apr 20 20:07:16.786335 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:16.786261 2574 generic.go:358] "Generic (PLEG): container finished" podID="ee1fb92b-6a4c-4495-93a7-29206e6b8642" containerID="f00d2b589d2975416cf169b36accf1646d535bb0eda4e86ce90f3b9ed2dc25fc" exitCode=255 Apr 20 20:07:16.786335 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:16.786288 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" event={"ID":"ee1fb92b-6a4c-4495-93a7-29206e6b8642","Type":"ContainerDied","Data":"f00d2b589d2975416cf169b36accf1646d535bb0eda4e86ce90f3b9ed2dc25fc"} Apr 20 20:07:16.786559 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:07:16.786523 2574 scope.go:117] "RemoveContainer" containerID="f00d2b589d2975416cf169b36accf1646d535bb0eda4e86ce90f3b9ed2dc25fc" Apr 20 20:07:16.800074 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:16.800011 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-74t9l" podStartSLOduration=1.037409646 podStartE2EDuration="3.799998098s" podCreationTimestamp="2026-04-20 20:07:13 +0000 UTC" firstStartedPulling="2026-04-20 20:07:13.53090153 +0000 UTC m=+115.654594596" lastFinishedPulling="2026-04-20 20:07:16.29348998 +0000 UTC m=+118.417183048" observedRunningTime="2026-04-20 20:07:16.799231113 +0000 UTC m=+118.922924205" watchObservedRunningTime="2026-04-20 20:07:16.799998098 +0000 UTC m=+118.923691187" Apr 20 20:07:16.828021 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:16.827971 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-56p98" podStartSLOduration=1.393672171 podStartE2EDuration="3.827952334s" podCreationTimestamp="2026-04-20 20:07:13 +0000 UTC" firstStartedPulling="2026-04-20 20:07:13.859297627 +0000 UTC m=+115.982990693" lastFinishedPulling="2026-04-20 20:07:16.293577786 +0000 UTC m=+118.417270856" observedRunningTime="2026-04-20 20:07:16.827514689 +0000 UTC m=+118.951207779" watchObservedRunningTime="2026-04-20 20:07:16.827952334 +0000 UTC m=+118.951645425" Apr 20 20:07:16.943736 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:16.943698 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:16.943864 ip-10-0-131-234 kubenswrapper[2574]: E0420 
20:07:16.943843 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:07:16.943906 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:16.943867 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d49c56f64-txz9l: secret "image-registry-tls" not found Apr 20 20:07:16.943940 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:16.943924 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls podName:1b6e5b82-3387-4fed-b751-81a011f3b96b nodeName:}" failed. No retries permitted until 2026-04-20 20:07:20.943907263 +0000 UTC m=+123.067600332 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls") pod "image-registry-5d49c56f64-txz9l" (UID: "1b6e5b82-3387-4fed-b751-81a011f3b96b") : secret "image-registry-tls" not found Apr 20 20:07:17.586607 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:17.586520 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-8d89v"] Apr 20 20:07:17.589600 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:17.589576 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8d89v" Apr 20 20:07:17.592021 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:17.592002 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 20:07:17.592719 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:17.592701 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 20:07:17.592783 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:17.592718 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-7ntb6\"" Apr 20 20:07:17.599078 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:17.599058 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-8d89v"] Apr 20 20:07:17.649334 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:17.649305 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z269\" (UniqueName: \"kubernetes.io/projected/89657514-cc9b-40ee-80ca-4a2b6be50dc3-kube-api-access-6z269\") pod \"migrator-74bb7799d9-8d89v\" (UID: \"89657514-cc9b-40ee-80ca-4a2b6be50dc3\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8d89v" Apr 20 20:07:17.749692 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:17.749660 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6z269\" (UniqueName: \"kubernetes.io/projected/89657514-cc9b-40ee-80ca-4a2b6be50dc3-kube-api-access-6z269\") pod \"migrator-74bb7799d9-8d89v\" (UID: \"89657514-cc9b-40ee-80ca-4a2b6be50dc3\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8d89v" Apr 20 20:07:17.758387 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:07:17.758354 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z269\" (UniqueName: \"kubernetes.io/projected/89657514-cc9b-40ee-80ca-4a2b6be50dc3-kube-api-access-6z269\") pod \"migrator-74bb7799d9-8d89v\" (UID: \"89657514-cc9b-40ee-80ca-4a2b6be50dc3\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8d89v" Apr 20 20:07:17.794212 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:17.794189 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/1.log" Apr 20 20:07:17.794587 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:17.794572 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/0.log" Apr 20 20:07:17.794657 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:17.794606 2574 generic.go:358] "Generic (PLEG): container finished" podID="ee1fb92b-6a4c-4495-93a7-29206e6b8642" containerID="826836e8d9c100bcbf666cfa79fe94ca280575807b0bc30ef39f12bb464df692" exitCode=255 Apr 20 20:07:17.794728 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:17.794702 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" event={"ID":"ee1fb92b-6a4c-4495-93a7-29206e6b8642","Type":"ContainerDied","Data":"826836e8d9c100bcbf666cfa79fe94ca280575807b0bc30ef39f12bb464df692"} Apr 20 20:07:17.794781 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:17.794753 2574 scope.go:117] "RemoveContainer" containerID="f00d2b589d2975416cf169b36accf1646d535bb0eda4e86ce90f3b9ed2dc25fc" Apr 20 20:07:17.794979 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:17.794963 2574 scope.go:117] "RemoveContainer" containerID="826836e8d9c100bcbf666cfa79fe94ca280575807b0bc30ef39f12bb464df692" Apr 20 20:07:17.795176 ip-10-0-131-234 
kubenswrapper[2574]: E0420 20:07:17.795157 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-rxlpd_openshift-console-operator(ee1fb92b-6a4c-4495-93a7-29206e6b8642)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" podUID="ee1fb92b-6a4c-4495-93a7-29206e6b8642" Apr 20 20:07:17.898160 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:17.898093 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8d89v" Apr 20 20:07:18.010904 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:18.010773 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-8d89v"] Apr 20 20:07:18.013225 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:07:18.013196 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89657514_cc9b_40ee_80ca_4a2b6be50dc3.slice/crio-de33e780d55354b85c9cc250a07a00f8ba61effbbeceef64a81badc4c6c07f32 WatchSource:0}: Error finding container de33e780d55354b85c9cc250a07a00f8ba61effbbeceef64a81badc4c6c07f32: Status 404 returned error can't find the container with id de33e780d55354b85c9cc250a07a00f8ba61effbbeceef64a81badc4c6c07f32 Apr 20 20:07:18.798483 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:18.798458 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/1.log" Apr 20 20:07:18.799064 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:18.798852 2574 scope.go:117] "RemoveContainer" containerID="826836e8d9c100bcbf666cfa79fe94ca280575807b0bc30ef39f12bb464df692" Apr 20 20:07:18.799152 ip-10-0-131-234 kubenswrapper[2574]: E0420 
20:07:18.799126 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-rxlpd_openshift-console-operator(ee1fb92b-6a4c-4495-93a7-29206e6b8642)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" podUID="ee1fb92b-6a4c-4495-93a7-29206e6b8642" Apr 20 20:07:18.799716 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:18.799689 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8d89v" event={"ID":"89657514-cc9b-40ee-80ca-4a2b6be50dc3","Type":"ContainerStarted","Data":"de33e780d55354b85c9cc250a07a00f8ba61effbbeceef64a81badc4c6c07f32"} Apr 20 20:07:19.020523 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:19.020489 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-d4mxm_ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28/dns-node-resolver/0.log" Apr 20 20:07:19.807480 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:19.807439 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8d89v" event={"ID":"89657514-cc9b-40ee-80ca-4a2b6be50dc3","Type":"ContainerStarted","Data":"f3c6f419f5cc57c3619f78960b95be18ec70323eeae5cfd5e0a69d9afb247cb1"} Apr 20 20:07:19.807480 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:19.807483 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8d89v" event={"ID":"89657514-cc9b-40ee-80ca-4a2b6be50dc3","Type":"ContainerStarted","Data":"62dde31c9fbe0497c827c475dbed502b7cb2de89ca56693f8457ee08b495d9bb"} Apr 20 20:07:19.823665 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:19.823621 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8d89v" 
podStartSLOduration=1.664381418 podStartE2EDuration="2.823608496s" podCreationTimestamp="2026-04-20 20:07:17 +0000 UTC" firstStartedPulling="2026-04-20 20:07:18.015170936 +0000 UTC m=+120.138864002" lastFinishedPulling="2026-04-20 20:07:19.174398014 +0000 UTC m=+121.298091080" observedRunningTime="2026-04-20 20:07:19.822521035 +0000 UTC m=+121.946214134" watchObservedRunningTime="2026-04-20 20:07:19.823608496 +0000 UTC m=+121.947301584" Apr 20 20:07:20.408367 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.408340 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-df6jv"] Apr 20 20:07:20.411297 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.411281 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-df6jv" Apr 20 20:07:20.414195 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.414166 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 20:07:20.414308 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.414204 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-q82kt\"" Apr 20 20:07:20.415053 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.415018 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 20:07:20.415138 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.415086 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 20:07:20.415138 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.415107 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 20:07:20.421415 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.421395 2574 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-df6jv"] Apr 20 20:07:20.427423 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.427406 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ssfnj_86459264-fd91-425e-8338-70b56d469a74/node-ca/0.log" Apr 20 20:07:20.470116 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.470091 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j5j2\" (UniqueName: \"kubernetes.io/projected/86474c1c-4590-4d77-a856-e6d9ef5228d4-kube-api-access-2j5j2\") pod \"service-ca-865cb79987-df6jv\" (UID: \"86474c1c-4590-4d77-a856-e6d9ef5228d4\") " pod="openshift-service-ca/service-ca-865cb79987-df6jv" Apr 20 20:07:20.470228 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.470160 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/86474c1c-4590-4d77-a856-e6d9ef5228d4-signing-key\") pod \"service-ca-865cb79987-df6jv\" (UID: \"86474c1c-4590-4d77-a856-e6d9ef5228d4\") " pod="openshift-service-ca/service-ca-865cb79987-df6jv" Apr 20 20:07:20.470286 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.470267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/86474c1c-4590-4d77-a856-e6d9ef5228d4-signing-cabundle\") pod \"service-ca-865cb79987-df6jv\" (UID: \"86474c1c-4590-4d77-a856-e6d9ef5228d4\") " pod="openshift-service-ca/service-ca-865cb79987-df6jv" Apr 20 20:07:20.571388 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.571357 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/86474c1c-4590-4d77-a856-e6d9ef5228d4-signing-key\") pod \"service-ca-865cb79987-df6jv\" (UID: 
\"86474c1c-4590-4d77-a856-e6d9ef5228d4\") " pod="openshift-service-ca/service-ca-865cb79987-df6jv" Apr 20 20:07:20.571517 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.571414 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/86474c1c-4590-4d77-a856-e6d9ef5228d4-signing-cabundle\") pod \"service-ca-865cb79987-df6jv\" (UID: \"86474c1c-4590-4d77-a856-e6d9ef5228d4\") " pod="openshift-service-ca/service-ca-865cb79987-df6jv" Apr 20 20:07:20.571517 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.571489 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2j5j2\" (UniqueName: \"kubernetes.io/projected/86474c1c-4590-4d77-a856-e6d9ef5228d4-kube-api-access-2j5j2\") pod \"service-ca-865cb79987-df6jv\" (UID: \"86474c1c-4590-4d77-a856-e6d9ef5228d4\") " pod="openshift-service-ca/service-ca-865cb79987-df6jv" Apr 20 20:07:20.572009 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.571990 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/86474c1c-4590-4d77-a856-e6d9ef5228d4-signing-cabundle\") pod \"service-ca-865cb79987-df6jv\" (UID: \"86474c1c-4590-4d77-a856-e6d9ef5228d4\") " pod="openshift-service-ca/service-ca-865cb79987-df6jv" Apr 20 20:07:20.573979 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.573958 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/86474c1c-4590-4d77-a856-e6d9ef5228d4-signing-key\") pod \"service-ca-865cb79987-df6jv\" (UID: \"86474c1c-4590-4d77-a856-e6d9ef5228d4\") " pod="openshift-service-ca/service-ca-865cb79987-df6jv" Apr 20 20:07:20.579596 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.579572 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j5j2\" (UniqueName: 
\"kubernetes.io/projected/86474c1c-4590-4d77-a856-e6d9ef5228d4-kube-api-access-2j5j2\") pod \"service-ca-865cb79987-df6jv\" (UID: \"86474c1c-4590-4d77-a856-e6d9ef5228d4\") " pod="openshift-service-ca/service-ca-865cb79987-df6jv" Apr 20 20:07:20.721014 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.720923 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-df6jv" Apr 20 20:07:20.773196 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.773166 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fkttm\" (UID: \"5999adb7-d895-4660-8bf8-546e2d8dd27c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm" Apr 20 20:07:20.773374 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:20.773350 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:07:20.773452 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:20.773440 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls podName:5999adb7-d895-4660-8bf8-546e2d8dd27c nodeName:}" failed. No retries permitted until 2026-04-20 20:07:28.773419553 +0000 UTC m=+130.897112631 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fkttm" (UID: "5999adb7-d895-4660-8bf8-546e2d8dd27c") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:07:20.837525 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.837495 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-df6jv"] Apr 20 20:07:20.841004 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:07:20.840980 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86474c1c_4590_4d77_a856_e6d9ef5228d4.slice/crio-9cc6d860fff7d84af01b3e7f661e0fe81c1d5347cc89e8d9dd811fb6c8722e23 WatchSource:0}: Error finding container 9cc6d860fff7d84af01b3e7f661e0fe81c1d5347cc89e8d9dd811fb6c8722e23: Status 404 returned error can't find the container with id 9cc6d860fff7d84af01b3e7f661e0fe81c1d5347cc89e8d9dd811fb6c8722e23 Apr 20 20:07:20.975144 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:20.975064 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:20.975273 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:20.975189 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:07:20.975273 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:20.975206 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d49c56f64-txz9l: secret "image-registry-tls" not found Apr 20 
20:07:20.975273 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:20.975256 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls podName:1b6e5b82-3387-4fed-b751-81a011f3b96b nodeName:}" failed. No retries permitted until 2026-04-20 20:07:28.975240634 +0000 UTC m=+131.098933699 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls") pod "image-registry-5d49c56f64-txz9l" (UID: "1b6e5b82-3387-4fed-b751-81a011f3b96b") : secret "image-registry-tls" not found Apr 20 20:07:21.815969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:21.815929 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-df6jv" event={"ID":"86474c1c-4590-4d77-a856-e6d9ef5228d4","Type":"ContainerStarted","Data":"9cc6d860fff7d84af01b3e7f661e0fe81c1d5347cc89e8d9dd811fb6c8722e23"} Apr 20 20:07:22.819054 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:22.819002 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-df6jv" event={"ID":"86474c1c-4590-4d77-a856-e6d9ef5228d4","Type":"ContainerStarted","Data":"afc93fadaf49411222af626780fcad11b1d67b0cb3570fb6635fd9c676dd3a3c"} Apr 20 20:07:22.835571 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:22.835524 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-df6jv" podStartSLOduration=1.193768225 podStartE2EDuration="2.835510288s" podCreationTimestamp="2026-04-20 20:07:20 +0000 UTC" firstStartedPulling="2026-04-20 20:07:20.842749646 +0000 UTC m=+122.966442712" lastFinishedPulling="2026-04-20 20:07:22.484491691 +0000 UTC m=+124.608184775" observedRunningTime="2026-04-20 20:07:22.834544189 +0000 UTC m=+124.958237277" watchObservedRunningTime="2026-04-20 20:07:22.835510288 +0000 UTC m=+124.959203376" 
Apr 20 20:07:23.736455 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:23.736421 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" Apr 20 20:07:23.736455 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:23.736460 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" Apr 20 20:07:23.736855 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:23.736843 2574 scope.go:117] "RemoveContainer" containerID="826836e8d9c100bcbf666cfa79fe94ca280575807b0bc30ef39f12bb464df692" Apr 20 20:07:23.737019 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:23.737002 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-rxlpd_openshift-console-operator(ee1fb92b-6a4c-4495-93a7-29206e6b8642)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" podUID="ee1fb92b-6a4c-4495-93a7-29206e6b8642" Apr 20 20:07:28.136597 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:28.136557 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs\") pod \"network-metrics-daemon-wktd8\" (UID: \"184c92c6-a188-47c2-acbf-e9fe477d6c13\") " pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:07:28.136983 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:28.136685 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:07:28.136983 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:28.136740 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs 
podName:184c92c6-a188-47c2-acbf-e9fe477d6c13 nodeName:}" failed. No retries permitted until 2026-04-20 20:09:30.136726286 +0000 UTC m=+252.260419352 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs") pod "network-metrics-daemon-wktd8" (UID: "184c92c6-a188-47c2-acbf-e9fe477d6c13") : secret "metrics-daemon-secret" not found Apr 20 20:07:28.842152 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:28.842113 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fkttm\" (UID: \"5999adb7-d895-4660-8bf8-546e2d8dd27c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm" Apr 20 20:07:28.842312 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:28.842218 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:07:28.842312 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:28.842280 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls podName:5999adb7-d895-4660-8bf8-546e2d8dd27c nodeName:}" failed. No retries permitted until 2026-04-20 20:07:44.842265667 +0000 UTC m=+146.965958733 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fkttm" (UID: "5999adb7-d895-4660-8bf8-546e2d8dd27c") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:07:29.044176 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:29.044133 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:29.046521 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:29.046489 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls\") pod \"image-registry-5d49c56f64-txz9l\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") " pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:29.199605 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:29.199529 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-j7xv8\"" Apr 20 20:07:29.208276 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:29.208256 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:29.329105 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:29.329071 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d49c56f64-txz9l"] Apr 20 20:07:29.331823 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:07:29.331794 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b6e5b82_3387_4fed_b751_81a011f3b96b.slice/crio-b562bcb50512a33907c7ed3cf93b555b38cd0d09b58ba67c85fddc840a587358 WatchSource:0}: Error finding container b562bcb50512a33907c7ed3cf93b555b38cd0d09b58ba67c85fddc840a587358: Status 404 returned error can't find the container with id b562bcb50512a33907c7ed3cf93b555b38cd0d09b58ba67c85fddc840a587358 Apr 20 20:07:29.839841 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:29.839806 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" event={"ID":"1b6e5b82-3387-4fed-b751-81a011f3b96b","Type":"ContainerStarted","Data":"f5ae08a95bd2bb970eb29a38e8c9306302718ba5ddcb8c7540662a757f442a0d"} Apr 20 20:07:29.839841 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:29.839841 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" event={"ID":"1b6e5b82-3387-4fed-b751-81a011f3b96b","Type":"ContainerStarted","Data":"b562bcb50512a33907c7ed3cf93b555b38cd0d09b58ba67c85fddc840a587358"} Apr 20 20:07:29.840063 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:29.839929 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:07:29.861057 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:29.859100 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" 
podStartSLOduration=16.859084695 podStartE2EDuration="16.859084695s" podCreationTimestamp="2026-04-20 20:07:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:07:29.857911967 +0000 UTC m=+131.981605050" watchObservedRunningTime="2026-04-20 20:07:29.859084695 +0000 UTC m=+131.982777784" Apr 20 20:07:35.461519 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:35.461488 2574 scope.go:117] "RemoveContainer" containerID="826836e8d9c100bcbf666cfa79fe94ca280575807b0bc30ef39f12bb464df692" Apr 20 20:07:35.859082 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:35.859052 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log" Apr 20 20:07:35.859449 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:35.859433 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/1.log" Apr 20 20:07:35.859516 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:35.859465 2574 generic.go:358] "Generic (PLEG): container finished" podID="ee1fb92b-6a4c-4495-93a7-29206e6b8642" containerID="836f95ae732f816623ee8a4249735f0526a29e7d68ef992cad9da28273a2a9c4" exitCode=255 Apr 20 20:07:35.859559 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:35.859523 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" event={"ID":"ee1fb92b-6a4c-4495-93a7-29206e6b8642","Type":"ContainerDied","Data":"836f95ae732f816623ee8a4249735f0526a29e7d68ef992cad9da28273a2a9c4"} Apr 20 20:07:35.859559 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:35.859556 2574 scope.go:117] "RemoveContainer" containerID="826836e8d9c100bcbf666cfa79fe94ca280575807b0bc30ef39f12bb464df692" Apr 20 20:07:35.859928 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:07:35.859907 2574 scope.go:117] "RemoveContainer" containerID="836f95ae732f816623ee8a4249735f0526a29e7d68ef992cad9da28273a2a9c4" Apr 20 20:07:35.860144 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:35.860124 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-rxlpd_openshift-console-operator(ee1fb92b-6a4c-4495-93a7-29206e6b8642)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" podUID="ee1fb92b-6a4c-4495-93a7-29206e6b8642" Apr 20 20:07:36.863606 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:36.863580 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log" Apr 20 20:07:41.623948 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.623913 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d49c56f64-txz9l"] Apr 20 20:07:41.634884 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.634851 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-jpmmm"] Apr 20 20:07:41.637670 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.637651 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jpmmm" Apr 20 20:07:41.640325 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.640304 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 20:07:41.640438 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.640381 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5f2m5\"" Apr 20 20:07:41.640501 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.640450 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 20:07:41.648345 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.648325 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jpmmm"] Apr 20 20:07:41.731922 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.731895 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcfpc\" (UniqueName: \"kubernetes.io/projected/5df7a791-739d-44ea-ba16-2093a320d5dd-kube-api-access-rcfpc\") pod \"insights-runtime-extractor-jpmmm\" (UID: \"5df7a791-739d-44ea-ba16-2093a320d5dd\") " pod="openshift-insights/insights-runtime-extractor-jpmmm" Apr 20 20:07:41.732085 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.731949 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5df7a791-739d-44ea-ba16-2093a320d5dd-data-volume\") pod \"insights-runtime-extractor-jpmmm\" (UID: \"5df7a791-739d-44ea-ba16-2093a320d5dd\") " pod="openshift-insights/insights-runtime-extractor-jpmmm" Apr 20 20:07:41.732085 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.732063 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5df7a791-739d-44ea-ba16-2093a320d5dd-crio-socket\") pod \"insights-runtime-extractor-jpmmm\" (UID: \"5df7a791-739d-44ea-ba16-2093a320d5dd\") " pod="openshift-insights/insights-runtime-extractor-jpmmm" Apr 20 20:07:41.732202 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.732103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5df7a791-739d-44ea-ba16-2093a320d5dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jpmmm\" (UID: \"5df7a791-739d-44ea-ba16-2093a320d5dd\") " pod="openshift-insights/insights-runtime-extractor-jpmmm" Apr 20 20:07:41.732202 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.732184 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5df7a791-739d-44ea-ba16-2093a320d5dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jpmmm\" (UID: \"5df7a791-739d-44ea-ba16-2093a320d5dd\") " pod="openshift-insights/insights-runtime-extractor-jpmmm" Apr 20 20:07:41.833025 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.832993 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5df7a791-739d-44ea-ba16-2093a320d5dd-data-volume\") pod \"insights-runtime-extractor-jpmmm\" (UID: \"5df7a791-739d-44ea-ba16-2093a320d5dd\") " pod="openshift-insights/insights-runtime-extractor-jpmmm" Apr 20 20:07:41.833205 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.833074 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5df7a791-739d-44ea-ba16-2093a320d5dd-crio-socket\") pod \"insights-runtime-extractor-jpmmm\" (UID: \"5df7a791-739d-44ea-ba16-2093a320d5dd\") " 
pod="openshift-insights/insights-runtime-extractor-jpmmm" Apr 20 20:07:41.833205 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.833100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5df7a791-739d-44ea-ba16-2093a320d5dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jpmmm\" (UID: \"5df7a791-739d-44ea-ba16-2093a320d5dd\") " pod="openshift-insights/insights-runtime-extractor-jpmmm" Apr 20 20:07:41.833205 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.833126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5df7a791-739d-44ea-ba16-2093a320d5dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jpmmm\" (UID: \"5df7a791-739d-44ea-ba16-2093a320d5dd\") " pod="openshift-insights/insights-runtime-extractor-jpmmm" Apr 20 20:07:41.833205 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.833150 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcfpc\" (UniqueName: \"kubernetes.io/projected/5df7a791-739d-44ea-ba16-2093a320d5dd-kube-api-access-rcfpc\") pod \"insights-runtime-extractor-jpmmm\" (UID: \"5df7a791-739d-44ea-ba16-2093a320d5dd\") " pod="openshift-insights/insights-runtime-extractor-jpmmm" Apr 20 20:07:41.833359 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.833208 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5df7a791-739d-44ea-ba16-2093a320d5dd-crio-socket\") pod \"insights-runtime-extractor-jpmmm\" (UID: \"5df7a791-739d-44ea-ba16-2093a320d5dd\") " pod="openshift-insights/insights-runtime-extractor-jpmmm" Apr 20 20:07:41.833359 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.833348 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/5df7a791-739d-44ea-ba16-2093a320d5dd-data-volume\") pod \"insights-runtime-extractor-jpmmm\" (UID: \"5df7a791-739d-44ea-ba16-2093a320d5dd\") " pod="openshift-insights/insights-runtime-extractor-jpmmm"
Apr 20 20:07:41.833707 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.833686 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5df7a791-739d-44ea-ba16-2093a320d5dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jpmmm\" (UID: \"5df7a791-739d-44ea-ba16-2093a320d5dd\") " pod="openshift-insights/insights-runtime-extractor-jpmmm"
Apr 20 20:07:41.835641 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.835616 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5df7a791-739d-44ea-ba16-2093a320d5dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jpmmm\" (UID: \"5df7a791-739d-44ea-ba16-2093a320d5dd\") " pod="openshift-insights/insights-runtime-extractor-jpmmm"
Apr 20 20:07:41.843350 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.843323 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcfpc\" (UniqueName: \"kubernetes.io/projected/5df7a791-739d-44ea-ba16-2093a320d5dd-kube-api-access-rcfpc\") pod \"insights-runtime-extractor-jpmmm\" (UID: \"5df7a791-739d-44ea-ba16-2093a320d5dd\") " pod="openshift-insights/insights-runtime-extractor-jpmmm"
Apr 20 20:07:41.946734 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:41.946662 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jpmmm"
Apr 20 20:07:42.062790 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:42.062760 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jpmmm"]
Apr 20 20:07:42.065704 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:07:42.065677 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5df7a791_739d_44ea_ba16_2093a320d5dd.slice/crio-05469238bef3aff96f1a39b2f57a67ce047ba3f4a4a0d389ffd5bd90b2df5d6e WatchSource:0}: Error finding container 05469238bef3aff96f1a39b2f57a67ce047ba3f4a4a0d389ffd5bd90b2df5d6e: Status 404 returned error can't find the container with id 05469238bef3aff96f1a39b2f57a67ce047ba3f4a4a0d389ffd5bd90b2df5d6e
Apr 20 20:07:42.878705 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:42.878669 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jpmmm" event={"ID":"5df7a791-739d-44ea-ba16-2093a320d5dd","Type":"ContainerStarted","Data":"a03fa2960c4a875435dfb73eab84b075337f65cf4f7aa0bbfd001dc32a8b8d23"}
Apr 20 20:07:42.878705 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:42.878706 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jpmmm" event={"ID":"5df7a791-739d-44ea-ba16-2093a320d5dd","Type":"ContainerStarted","Data":"4da209e2b0daad12cf154ca62d262dd23241754256af25cb3ec2681c946bd952"}
Apr 20 20:07:42.879116 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:42.878716 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jpmmm" event={"ID":"5df7a791-739d-44ea-ba16-2093a320d5dd","Type":"ContainerStarted","Data":"05469238bef3aff96f1a39b2f57a67ce047ba3f4a4a0d389ffd5bd90b2df5d6e"}
Apr 20 20:07:43.736173 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:43.736133 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd"
Apr 20 20:07:43.736173 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:43.736184 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd"
Apr 20 20:07:43.736595 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:43.736572 2574 scope.go:117] "RemoveContainer" containerID="836f95ae732f816623ee8a4249735f0526a29e7d68ef992cad9da28273a2a9c4"
Apr 20 20:07:43.736814 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:43.736792 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-rxlpd_openshift-console-operator(ee1fb92b-6a4c-4495-93a7-29206e6b8642)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" podUID="ee1fb92b-6a4c-4495-93a7-29206e6b8642"
Apr 20 20:07:44.857309 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:44.857265 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fkttm\" (UID: \"5999adb7-d895-4660-8bf8-546e2d8dd27c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm"
Apr 20 20:07:44.859686 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:44.859667 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5999adb7-d895-4660-8bf8-546e2d8dd27c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fkttm\" (UID: \"5999adb7-d895-4660-8bf8-546e2d8dd27c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm"
Apr 20 20:07:44.881942 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:44.881913 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-xmmbb\""
Apr 20 20:07:44.885377 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:44.885352 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jpmmm" event={"ID":"5df7a791-739d-44ea-ba16-2093a320d5dd","Type":"ContainerStarted","Data":"e2561ec8abfc1b433f2d1dfd274535330c69c04693ff48d3e9bf804e98431c9b"}
Apr 20 20:07:44.889871 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:44.889847 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm"
Apr 20 20:07:44.901948 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:44.901908 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-jpmmm" podStartSLOduration=1.780414284 podStartE2EDuration="3.901891888s" podCreationTimestamp="2026-04-20 20:07:41 +0000 UTC" firstStartedPulling="2026-04-20 20:07:42.116800361 +0000 UTC m=+144.240493427" lastFinishedPulling="2026-04-20 20:07:44.238277965 +0000 UTC m=+146.361971031" observedRunningTime="2026-04-20 20:07:44.901297915 +0000 UTC m=+147.024991028" watchObservedRunningTime="2026-04-20 20:07:44.901891888 +0000 UTC m=+147.025584953"
Apr 20 20:07:45.010741 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:45.010707 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm"]
Apr 20 20:07:45.013788 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:07:45.013758 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5999adb7_d895_4660_8bf8_546e2d8dd27c.slice/crio-8c5257c288f18b86e1300fc0a18465350e10699a834f022eaa907cc0e576cd49 WatchSource:0}: Error finding container 8c5257c288f18b86e1300fc0a18465350e10699a834f022eaa907cc0e576cd49: Status 404 returned error can't find the container with id 8c5257c288f18b86e1300fc0a18465350e10699a834f022eaa907cc0e576cd49
Apr 20 20:07:45.889053 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:45.888995 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm" event={"ID":"5999adb7-d895-4660-8bf8-546e2d8dd27c","Type":"ContainerStarted","Data":"8c5257c288f18b86e1300fc0a18465350e10699a834f022eaa907cc0e576cd49"}
Apr 20 20:07:47.897775 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:47.897735 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm" event={"ID":"5999adb7-d895-4660-8bf8-546e2d8dd27c","Type":"ContainerStarted","Data":"19ea953ea630c395e5b8a8159b36ffe23408c44c503c9c2861c93871774641da"}
Apr 20 20:07:47.914247 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:47.914199 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fkttm" podStartSLOduration=32.905761553 podStartE2EDuration="34.914187757s" podCreationTimestamp="2026-04-20 20:07:13 +0000 UTC" firstStartedPulling="2026-04-20 20:07:45.015644091 +0000 UTC m=+147.139337158" lastFinishedPulling="2026-04-20 20:07:47.024070295 +0000 UTC m=+149.147763362" observedRunningTime="2026-04-20 20:07:47.914073929 +0000 UTC m=+150.037767018" watchObservedRunningTime="2026-04-20 20:07:47.914187757 +0000 UTC m=+150.037880845"
Apr 20 20:07:51.622402 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.622364 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-zbp64"]
Apr 20 20:07:51.625866 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.625839 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64"
Apr 20 20:07:51.628689 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.628664 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 20 20:07:51.628795 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.628733 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-rvbdt\""
Apr 20 20:07:51.629460 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.629438 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 20 20:07:51.629749 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.629728 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 20:07:51.630367 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.630339 2574 patch_prober.go:28] interesting pod/image-registry-5d49c56f64-txz9l container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 20 20:07:51.630466 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.630391 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" podUID="1b6e5b82-3387-4fed-b751-81a011f3b96b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:07:51.635775 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.635752 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-zbp64"]
Apr 20 20:07:51.708317 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.708286 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/73a10b51-1fea-4aab-81d5-a8e232d4623b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-zbp64\" (UID: \"73a10b51-1fea-4aab-81d5-a8e232d4623b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64"
Apr 20 20:07:51.708460 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.708325 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73a10b51-1fea-4aab-81d5-a8e232d4623b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-zbp64\" (UID: \"73a10b51-1fea-4aab-81d5-a8e232d4623b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64"
Apr 20 20:07:51.708460 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.708386 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/73a10b51-1fea-4aab-81d5-a8e232d4623b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zbp64\" (UID: \"73a10b51-1fea-4aab-81d5-a8e232d4623b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64"
Apr 20 20:07:51.708460 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.708436 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6pvb\" (UniqueName: \"kubernetes.io/projected/73a10b51-1fea-4aab-81d5-a8e232d4623b-kube-api-access-c6pvb\") pod \"prometheus-operator-5676c8c784-zbp64\" (UID: \"73a10b51-1fea-4aab-81d5-a8e232d4623b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64"
Apr 20 20:07:51.809138 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.809107 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/73a10b51-1fea-4aab-81d5-a8e232d4623b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zbp64\" (UID: \"73a10b51-1fea-4aab-81d5-a8e232d4623b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64"
Apr 20 20:07:51.809275 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.809171 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6pvb\" (UniqueName: \"kubernetes.io/projected/73a10b51-1fea-4aab-81d5-a8e232d4623b-kube-api-access-c6pvb\") pod \"prometheus-operator-5676c8c784-zbp64\" (UID: \"73a10b51-1fea-4aab-81d5-a8e232d4623b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64"
Apr 20 20:07:51.809275 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.809218 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/73a10b51-1fea-4aab-81d5-a8e232d4623b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-zbp64\" (UID: \"73a10b51-1fea-4aab-81d5-a8e232d4623b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64"
Apr 20 20:07:51.809275 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.809249 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73a10b51-1fea-4aab-81d5-a8e232d4623b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-zbp64\" (UID: \"73a10b51-1fea-4aab-81d5-a8e232d4623b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64"
Apr 20 20:07:51.809275 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:51.809269 2574 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 20 20:07:51.809420 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:51.809328 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73a10b51-1fea-4aab-81d5-a8e232d4623b-prometheus-operator-tls podName:73a10b51-1fea-4aab-81d5-a8e232d4623b nodeName:}" failed. No retries permitted until 2026-04-20 20:07:52.309310226 +0000 UTC m=+154.433003292 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/73a10b51-1fea-4aab-81d5-a8e232d4623b-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-zbp64" (UID: "73a10b51-1fea-4aab-81d5-a8e232d4623b") : secret "prometheus-operator-tls" not found
Apr 20 20:07:51.809869 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.809850 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73a10b51-1fea-4aab-81d5-a8e232d4623b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-zbp64\" (UID: \"73a10b51-1fea-4aab-81d5-a8e232d4623b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64"
Apr 20 20:07:51.811876 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.811854 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/73a10b51-1fea-4aab-81d5-a8e232d4623b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-zbp64\" (UID: \"73a10b51-1fea-4aab-81d5-a8e232d4623b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64"
Apr 20 20:07:51.818296 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:51.818273 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6pvb\" (UniqueName: \"kubernetes.io/projected/73a10b51-1fea-4aab-81d5-a8e232d4623b-kube-api-access-c6pvb\") pod \"prometheus-operator-5676c8c784-zbp64\" (UID: \"73a10b51-1fea-4aab-81d5-a8e232d4623b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64"
Apr 20 20:07:52.314640 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:52.314592 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/73a10b51-1fea-4aab-81d5-a8e232d4623b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zbp64\" (UID: \"73a10b51-1fea-4aab-81d5-a8e232d4623b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64"
Apr 20 20:07:52.317025 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:52.317006 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/73a10b51-1fea-4aab-81d5-a8e232d4623b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zbp64\" (UID: \"73a10b51-1fea-4aab-81d5-a8e232d4623b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64"
Apr 20 20:07:52.536360 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:52.536331 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64"
Apr 20 20:07:52.668805 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:52.668775 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-zbp64"]
Apr 20 20:07:52.672198 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:07:52.672169 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73a10b51_1fea_4aab_81d5_a8e232d4623b.slice/crio-d9997b122e9be579cfae1fec4122c70d63969ea200d3b6ad806460286e83ff7f WatchSource:0}: Error finding container d9997b122e9be579cfae1fec4122c70d63969ea200d3b6ad806460286e83ff7f: Status 404 returned error can't find the container with id d9997b122e9be579cfae1fec4122c70d63969ea200d3b6ad806460286e83ff7f
Apr 20 20:07:52.910595 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:52.910510 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64" event={"ID":"73a10b51-1fea-4aab-81d5-a8e232d4623b","Type":"ContainerStarted","Data":"d9997b122e9be579cfae1fec4122c70d63969ea200d3b6ad806460286e83ff7f"}
Apr 20 20:07:54.771219 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:54.771173 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-bh6cg" podUID="1c420267-955c-479f-93c5-f3be116a6270"
Apr 20 20:07:54.788307 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:54.788272 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-mwhhz" podUID="da8e59f9-df8a-4e18-98ec-09373ec8bee1"
Apr 20 20:07:54.916219 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:54.916193 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bh6cg"
Apr 20 20:07:54.916359 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:54.916195 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64" event={"ID":"73a10b51-1fea-4aab-81d5-a8e232d4623b","Type":"ContainerStarted","Data":"3903d2f3a70a46475ec6309d49ca3e08ecfc0a062028fdee60437d8d6bd3ffc8"}
Apr 20 20:07:54.916359 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:54.916311 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64" event={"ID":"73a10b51-1fea-4aab-81d5-a8e232d4623b","Type":"ContainerStarted","Data":"de9654e3b16f6a54c262a4d2949731b4f964a112cd63db4b7c8d084711b6539b"}
Apr 20 20:07:54.933734 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:54.933685 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-zbp64" podStartSLOduration=2.68153077 podStartE2EDuration="3.933672501s" podCreationTimestamp="2026-04-20 20:07:51 +0000 UTC" firstStartedPulling="2026-04-20 20:07:52.673939179 +0000 UTC m=+154.797632245" lastFinishedPulling="2026-04-20 20:07:53.926080895 +0000 UTC m=+156.049773976" observedRunningTime="2026-04-20 20:07:54.93306819 +0000 UTC m=+157.056761276" watchObservedRunningTime="2026-04-20 20:07:54.933672501 +0000 UTC m=+157.057365589"
Apr 20 20:07:55.461664 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:55.461587 2574 scope.go:117] "RemoveContainer" containerID="836f95ae732f816623ee8a4249735f0526a29e7d68ef992cad9da28273a2a9c4"
Apr 20 20:07:55.461801 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:55.461754 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-rxlpd_openshift-console-operator(ee1fb92b-6a4c-4495-93a7-29206e6b8642)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" podUID="ee1fb92b-6a4c-4495-93a7-29206e6b8642"
Apr 20 20:07:56.475277 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:56.475238 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-wktd8" podUID="184c92c6-a188-47c2-acbf-e9fe477d6c13"
Apr 20 20:07:56.985490 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:56.985406 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-n59wp"]
Apr 20 20:07:56.988584 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:56.988557 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:56.991056 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:56.991016 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 20:07:56.991239 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:56.991106 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-h4n7h\""
Apr 20 20:07:56.991365 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:56.991129 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 20:07:56.991365 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:56.991169 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 20:07:57.151362 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.151314 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c8d21f1e-73fa-43c3-aec2-17f03a870896-sys\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.151362 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.151366 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-textfile\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.151608 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.151427 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c8d21f1e-73fa-43c3-aec2-17f03a870896-root\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.151608 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.151496 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-wtmp\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.151608 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.151548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-accelerators-collector-config\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.151767 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.151613 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhmtr\" (UniqueName: \"kubernetes.io/projected/c8d21f1e-73fa-43c3-aec2-17f03a870896-kube-api-access-jhmtr\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.151767 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.151655 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.151767 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.151680 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-tls\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.151767 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.151709 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c8d21f1e-73fa-43c3-aec2-17f03a870896-metrics-client-ca\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.252779 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.252687 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c8d21f1e-73fa-43c3-aec2-17f03a870896-sys\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.252779 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.252727 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-textfile\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.252779 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.252745 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c8d21f1e-73fa-43c3-aec2-17f03a870896-root\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.253116 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.252806 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c8d21f1e-73fa-43c3-aec2-17f03a870896-root\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.253116 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.252819 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c8d21f1e-73fa-43c3-aec2-17f03a870896-sys\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.253116 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.252846 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-wtmp\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.253116 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.252891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-accelerators-collector-config\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.253116 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.252949 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhmtr\" (UniqueName: \"kubernetes.io/projected/c8d21f1e-73fa-43c3-aec2-17f03a870896-kube-api-access-jhmtr\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.253116 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.252989 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.253116 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.253018 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-tls\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.253116 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.253070 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-wtmp\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.253116 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.253076 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c8d21f1e-73fa-43c3-aec2-17f03a870896-metrics-client-ca\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.253557 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.253124 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-textfile\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.253557 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:57.253309 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 20:07:57.253557 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:07:57.253362 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-tls podName:c8d21f1e-73fa-43c3-aec2-17f03a870896 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:57.753345841 +0000 UTC m=+159.877038912 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-tls") pod "node-exporter-n59wp" (UID: "c8d21f1e-73fa-43c3-aec2-17f03a870896") : secret "node-exporter-tls" not found
Apr 20 20:07:57.253557 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.253453 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c8d21f1e-73fa-43c3-aec2-17f03a870896-metrics-client-ca\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.253557 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.253522 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-accelerators-collector-config\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.255916 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.255894 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.263175 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.263150 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhmtr\" (UniqueName: \"kubernetes.io/projected/c8d21f1e-73fa-43c3-aec2-17f03a870896-kube-api-access-jhmtr\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.757597 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.757564 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-tls\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.760002 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.759982 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c8d21f1e-73fa-43c3-aec2-17f03a870896-node-exporter-tls\") pod \"node-exporter-n59wp\" (UID: \"c8d21f1e-73fa-43c3-aec2-17f03a870896\") " pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.899751 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.899712 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n59wp"
Apr 20 20:07:57.907791 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:07:57.907760 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8d21f1e_73fa_43c3_aec2_17f03a870896.slice/crio-ef154fa92cb929a4542f357075a2911c9c05d27eebb9184399de00a3b3290cd2 WatchSource:0}: Error finding container ef154fa92cb929a4542f357075a2911c9c05d27eebb9184399de00a3b3290cd2: Status 404 returned error can't find the container with id ef154fa92cb929a4542f357075a2911c9c05d27eebb9184399de00a3b3290cd2
Apr 20 20:07:57.926357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:57.926329 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n59wp" event={"ID":"c8d21f1e-73fa-43c3-aec2-17f03a870896","Type":"ContainerStarted","Data":"ef154fa92cb929a4542f357075a2911c9c05d27eebb9184399de00a3b3290cd2"}
Apr 20 20:07:58.930755 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:58.930672 2574 generic.go:358] "Generic (PLEG): container finished"
podID="c8d21f1e-73fa-43c3-aec2-17f03a870896" containerID="60a0a13a682c6751548e032d203d0b1c4fbed199a161b2860eb5a14a5fac50c9" exitCode=0 Apr 20 20:07:58.930755 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:58.930725 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n59wp" event={"ID":"c8d21f1e-73fa-43c3-aec2-17f03a870896","Type":"ContainerDied","Data":"60a0a13a682c6751548e032d203d0b1c4fbed199a161b2860eb5a14a5fac50c9"} Apr 20 20:07:59.673148 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:59.673118 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg" Apr 20 20:07:59.673148 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:59.673152 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert\") pod \"ingress-canary-mwhhz\" (UID: \"da8e59f9-df8a-4e18-98ec-09373ec8bee1\") " pod="openshift-ingress-canary/ingress-canary-mwhhz" Apr 20 20:07:59.675791 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:59.675767 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da8e59f9-df8a-4e18-98ec-09373ec8bee1-cert\") pod \"ingress-canary-mwhhz\" (UID: \"da8e59f9-df8a-4e18-98ec-09373ec8bee1\") " pod="openshift-ingress-canary/ingress-canary-mwhhz" Apr 20 20:07:59.676096 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:59.676074 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c420267-955c-479f-93c5-f3be116a6270-metrics-tls\") pod \"dns-default-bh6cg\" (UID: \"1c420267-955c-479f-93c5-f3be116a6270\") " pod="openshift-dns/dns-default-bh6cg" Apr 20 
20:07:59.719278 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:59.719255 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-c4d6d\"" Apr 20 20:07:59.727280 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:59.727263 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bh6cg" Apr 20 20:07:59.850742 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:59.850710 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bh6cg"] Apr 20 20:07:59.855257 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:07:59.855226 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c420267_955c_479f_93c5_f3be116a6270.slice/crio-573ab055a6116bcb4dd000b3d9ff5df44f22e9bb1315f461e286772cff74abeb WatchSource:0}: Error finding container 573ab055a6116bcb4dd000b3d9ff5df44f22e9bb1315f461e286772cff74abeb: Status 404 returned error can't find the container with id 573ab055a6116bcb4dd000b3d9ff5df44f22e9bb1315f461e286772cff74abeb Apr 20 20:07:59.941309 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:59.941202 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n59wp" event={"ID":"c8d21f1e-73fa-43c3-aec2-17f03a870896","Type":"ContainerStarted","Data":"df7e97c81e11e66e22cac58f05dc49b177e176d5a7a46f8924492795d9b1d3a9"} Apr 20 20:07:59.941309 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:59.941251 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n59wp" event={"ID":"c8d21f1e-73fa-43c3-aec2-17f03a870896","Type":"ContainerStarted","Data":"6efa341a1e16676da0bd25e96f5141ec20a2f7c73d645cfde28792e7e97dca1a"} Apr 20 20:07:59.942393 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:59.942363 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bh6cg" 
event={"ID":"1c420267-955c-479f-93c5-f3be116a6270","Type":"ContainerStarted","Data":"573ab055a6116bcb4dd000b3d9ff5df44f22e9bb1315f461e286772cff74abeb"} Apr 20 20:07:59.967538 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:07:59.967461 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-n59wp" podStartSLOduration=3.248791381 podStartE2EDuration="3.967444873s" podCreationTimestamp="2026-04-20 20:07:56 +0000 UTC" firstStartedPulling="2026-04-20 20:07:57.909672856 +0000 UTC m=+160.033365923" lastFinishedPulling="2026-04-20 20:07:58.628326349 +0000 UTC m=+160.752019415" observedRunningTime="2026-04-20 20:07:59.96676404 +0000 UTC m=+162.090457132" watchObservedRunningTime="2026-04-20 20:07:59.967444873 +0000 UTC m=+162.091137962" Apr 20 20:08:00.202325 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.202240 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh"] Apr 20 20:08:00.207284 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.207264 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.210174 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.210149 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-3dnf2g44c35au\"" Apr 20 20:08:00.210290 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.210196 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 20:08:00.210290 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.210156 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 20:08:00.210404 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.210344 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 20:08:00.210907 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.210889 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 20:08:00.210907 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.210899 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-5w8r2\"" Apr 20 20:08:00.211048 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.210990 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 20:08:00.232733 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.232708 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh"] Apr 20 20:08:00.379753 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.379700 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5371b507-255d-419d-a381-5da1d311fb71-metrics-client-ca\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.379931 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.379771 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-grpc-tls\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.379931 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.379806 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.379931 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.379835 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqz2w\" (UniqueName: \"kubernetes.io/projected/5371b507-255d-419d-a381-5da1d311fb71-kube-api-access-dqz2w\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.379931 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.379923 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.380126 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.380046 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.380126 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.380086 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-thanos-querier-tls\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.380126 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.380109 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.480547 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.480466 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.480547 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.480510 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-thanos-querier-tls\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.480547 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.480538 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.480795 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.480591 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5371b507-255d-419d-a381-5da1d311fb71-metrics-client-ca\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.480795 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.480617 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-grpc-tls\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " 
pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.480795 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.480642 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.480795 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.480667 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqz2w\" (UniqueName: \"kubernetes.io/projected/5371b507-255d-419d-a381-5da1d311fb71-kube-api-access-dqz2w\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.480795 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.480705 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.482201 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.482173 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5371b507-255d-419d-a381-5da1d311fb71-metrics-client-ca\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.483581 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.483535 2574 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.483739 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.483716 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.483793 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.483754 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.483867 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.483839 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.484255 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.484233 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: 
\"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-thanos-querier-tls\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.484323 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.484308 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5371b507-255d-419d-a381-5da1d311fb71-secret-grpc-tls\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.490689 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.490667 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqz2w\" (UniqueName: \"kubernetes.io/projected/5371b507-255d-419d-a381-5da1d311fb71-kube-api-access-dqz2w\") pod \"thanos-querier-b8bf87fb7-rhsfh\" (UID: \"5371b507-255d-419d-a381-5da1d311fb71\") " pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.516592 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.516557 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:00.659419 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.659389 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh"] Apr 20 20:08:00.663087 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:08:00.663054 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5371b507_255d_419d_a381_5da1d311fb71.slice/crio-1b89af15472c40460283a77678ca25281bfa2096edb0d782b963dd82e63b27c9 WatchSource:0}: Error finding container 1b89af15472c40460283a77678ca25281bfa2096edb0d782b963dd82e63b27c9: Status 404 returned error can't find the container with id 1b89af15472c40460283a77678ca25281bfa2096edb0d782b963dd82e63b27c9 Apr 20 20:08:00.946614 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:00.946578 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" event={"ID":"5371b507-255d-419d-a381-5da1d311fb71","Type":"ContainerStarted","Data":"1b89af15472c40460283a77678ca25281bfa2096edb0d782b963dd82e63b27c9"} Apr 20 20:08:01.414274 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.414245 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-68c68f5db6-2x85q"] Apr 20 20:08:01.418547 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.418303 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.420821 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.420799 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 20:08:01.422635 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.421813 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 20:08:01.422635 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.422239 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-t4kmw\"" Apr 20 20:08:01.422920 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.422902 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 20:08:01.423203 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.423180 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 20:08:01.423356 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.423335 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-flohjndnt0ssm\"" Apr 20 20:08:01.427284 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.427265 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-68c68f5db6-2x85q"] Apr 20 20:08:01.590165 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.590094 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53b5cab7-44b2-44ea-a7a8-e00157572b77-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: 
\"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.590165 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.590139 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcchk\" (UniqueName: \"kubernetes.io/projected/53b5cab7-44b2-44ea-a7a8-e00157572b77-kube-api-access-wcchk\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.590362 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.590211 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/53b5cab7-44b2-44ea-a7a8-e00157572b77-audit-log\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.590362 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.590239 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53b5cab7-44b2-44ea-a7a8-e00157572b77-client-ca-bundle\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.590362 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.590274 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/53b5cab7-44b2-44ea-a7a8-e00157572b77-metrics-server-audit-profiles\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.590362 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.590333 
2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/53b5cab7-44b2-44ea-a7a8-e00157572b77-secret-metrics-server-client-certs\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.590578 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.590366 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/53b5cab7-44b2-44ea-a7a8-e00157572b77-secret-metrics-server-tls\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.629735 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.629706 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" Apr 20 20:08:01.691453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.691418 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/53b5cab7-44b2-44ea-a7a8-e00157572b77-secret-metrics-server-tls\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.691620 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.691474 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53b5cab7-44b2-44ea-a7a8-e00157572b77-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 
20:08:01.691620 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.691518 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcchk\" (UniqueName: \"kubernetes.io/projected/53b5cab7-44b2-44ea-a7a8-e00157572b77-kube-api-access-wcchk\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.691620 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.691606 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/53b5cab7-44b2-44ea-a7a8-e00157572b77-audit-log\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.691774 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.691632 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53b5cab7-44b2-44ea-a7a8-e00157572b77-client-ca-bundle\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.691774 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.691667 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/53b5cab7-44b2-44ea-a7a8-e00157572b77-metrics-server-audit-profiles\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.691774 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.691714 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/53b5cab7-44b2-44ea-a7a8-e00157572b77-secret-metrics-server-client-certs\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.692692 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.692095 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/53b5cab7-44b2-44ea-a7a8-e00157572b77-audit-log\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.692692 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.692647 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53b5cab7-44b2-44ea-a7a8-e00157572b77-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.693658 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.693633 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/53b5cab7-44b2-44ea-a7a8-e00157572b77-metrics-server-audit-profiles\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.694793 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.694749 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/53b5cab7-44b2-44ea-a7a8-e00157572b77-secret-metrics-server-tls\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 
20 20:08:01.696538 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.696513 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/53b5cab7-44b2-44ea-a7a8-e00157572b77-secret-metrics-server-client-certs\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.697900 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.697857 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53b5cab7-44b2-44ea-a7a8-e00157572b77-client-ca-bundle\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.712249 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.712224 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcchk\" (UniqueName: \"kubernetes.io/projected/53b5cab7-44b2-44ea-a7a8-e00157572b77-kube-api-access-wcchk\") pod \"metrics-server-68c68f5db6-2x85q\" (UID: \"53b5cab7-44b2-44ea-a7a8-e00157572b77\") " pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.731819 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.731765 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:01.762558 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.762206 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rbkcf"] Apr 20 20:08:01.767212 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.766829 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rbkcf" Apr 20 20:08:01.769818 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.769793 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 20 20:08:01.770234 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.770001 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-pc8vb\"" Apr 20 20:08:01.777333 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.777288 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rbkcf"] Apr 20 20:08:01.883542 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.883455 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-68c68f5db6-2x85q"] Apr 20 20:08:01.887178 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:08:01.887148 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53b5cab7_44b2_44ea_a7a8_e00157572b77.slice/crio-45b69b9e3328643a45351db63afae0798ebc89745df45b1de7e16640026acbcf WatchSource:0}: Error finding container 45b69b9e3328643a45351db63afae0798ebc89745df45b1de7e16640026acbcf: Status 404 returned error can't find the container with id 45b69b9e3328643a45351db63afae0798ebc89745df45b1de7e16640026acbcf Apr 20 20:08:01.893874 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.893846 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/915408af-5ee3-4c92-a8f5-4cf9059a0be9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rbkcf\" (UID: \"915408af-5ee3-4c92-a8f5-4cf9059a0be9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rbkcf" Apr 20 20:08:01.956792 ip-10-0-131-234 kubenswrapper[2574]: I0420 
20:08:01.956729 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" event={"ID":"53b5cab7-44b2-44ea-a7a8-e00157572b77","Type":"ContainerStarted","Data":"45b69b9e3328643a45351db63afae0798ebc89745df45b1de7e16640026acbcf"} Apr 20 20:08:01.959134 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.959089 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bh6cg" event={"ID":"1c420267-955c-479f-93c5-f3be116a6270","Type":"ContainerStarted","Data":"8ca0a4fdccde0deb6c72945bbe26199ab668d6f529a4c1d550a0c5d2bb1f2c70"} Apr 20 20:08:01.959134 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.959133 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bh6cg" event={"ID":"1c420267-955c-479f-93c5-f3be116a6270","Type":"ContainerStarted","Data":"bb85d08516deb22313743fd96e4d31a9ff7ae0e8883e39894c098f53badd7219"} Apr 20 20:08:01.959813 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.959792 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-bh6cg" Apr 20 20:08:01.979089 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.978972 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bh6cg" podStartSLOduration=129.505148561 podStartE2EDuration="2m10.978944856s" podCreationTimestamp="2026-04-20 20:05:51 +0000 UTC" firstStartedPulling="2026-04-20 20:07:59.857195786 +0000 UTC m=+161.980888852" lastFinishedPulling="2026-04-20 20:08:01.33099207 +0000 UTC m=+163.454685147" observedRunningTime="2026-04-20 20:08:01.977825619 +0000 UTC m=+164.101518709" watchObservedRunningTime="2026-04-20 20:08:01.978944856 +0000 UTC m=+164.102637944" Apr 20 20:08:01.995446 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:01.995411 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/915408af-5ee3-4c92-a8f5-4cf9059a0be9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rbkcf\" (UID: \"915408af-5ee3-4c92-a8f5-4cf9059a0be9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rbkcf" Apr 20 20:08:01.995600 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:08:01.995573 2574 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 20 20:08:01.995664 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:08:01.995651 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/915408af-5ee3-4c92-a8f5-4cf9059a0be9-monitoring-plugin-cert podName:915408af-5ee3-4c92-a8f5-4cf9059a0be9 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:02.495631219 +0000 UTC m=+164.619324285 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/915408af-5ee3-4c92-a8f5-4cf9059a0be9-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-rbkcf" (UID: "915408af-5ee3-4c92-a8f5-4cf9059a0be9") : secret "monitoring-plugin-cert" not found Apr 20 20:08:02.176881 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.176787 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-f69c67d6d-ghrlk"] Apr 20 20:08:02.180687 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.180663 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.183527 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.183114 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 20 20:08:02.183527 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.183373 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 20 20:08:02.183761 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.183550 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 20 20:08:02.183826 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.183760 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 20 20:08:02.183826 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.183776 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 20 20:08:02.184164 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.184144 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-thzrb\"" Apr 20 20:08:02.188096 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.188021 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 20 20:08:02.191012 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.190988 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-f69c67d6d-ghrlk"] Apr 20 20:08:02.297182 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.297144 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c28d92ab-9264-413d-b83c-a088f773f9d1-telemeter-client-tls\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.297385 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.297222 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c28d92ab-9264-413d-b83c-a088f773f9d1-federate-client-tls\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.297385 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.297284 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c28d92ab-9264-413d-b83c-a088f773f9d1-metrics-client-ca\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.297385 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.297312 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c28d92ab-9264-413d-b83c-a088f773f9d1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.297561 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.297382 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5628\" (UniqueName: 
\"kubernetes.io/projected/c28d92ab-9264-413d-b83c-a088f773f9d1-kube-api-access-p5628\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.297561 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.297420 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c28d92ab-9264-413d-b83c-a088f773f9d1-serving-certs-ca-bundle\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.297561 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.297475 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c28d92ab-9264-413d-b83c-a088f773f9d1-secret-telemeter-client\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.297561 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.297504 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c28d92ab-9264-413d-b83c-a088f773f9d1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.398620 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.398585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c28d92ab-9264-413d-b83c-a088f773f9d1-telemeter-client-tls\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: 
\"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.398817 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.398627 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c28d92ab-9264-413d-b83c-a088f773f9d1-federate-client-tls\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.398817 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.398661 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c28d92ab-9264-413d-b83c-a088f773f9d1-metrics-client-ca\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.398817 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.398683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c28d92ab-9264-413d-b83c-a088f773f9d1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.398817 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.398718 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5628\" (UniqueName: \"kubernetes.io/projected/c28d92ab-9264-413d-b83c-a088f773f9d1-kube-api-access-p5628\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.398817 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.398752 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c28d92ab-9264-413d-b83c-a088f773f9d1-serving-certs-ca-bundle\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.398817 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.398802 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c28d92ab-9264-413d-b83c-a088f773f9d1-secret-telemeter-client\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.399354 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.398836 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c28d92ab-9264-413d-b83c-a088f773f9d1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.399797 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.399733 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c28d92ab-9264-413d-b83c-a088f773f9d1-serving-certs-ca-bundle\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.399915 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.399793 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c28d92ab-9264-413d-b83c-a088f773f9d1-metrics-client-ca\") pod 
\"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.399915 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.399805 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c28d92ab-9264-413d-b83c-a088f773f9d1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.401777 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.401742 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c28d92ab-9264-413d-b83c-a088f773f9d1-federate-client-tls\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.401999 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.401966 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c28d92ab-9264-413d-b83c-a088f773f9d1-telemeter-client-tls\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.402127 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.402019 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c28d92ab-9264-413d-b83c-a088f773f9d1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.402391 ip-10-0-131-234 kubenswrapper[2574]: I0420 
20:08:02.402364 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c28d92ab-9264-413d-b83c-a088f773f9d1-secret-telemeter-client\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.406854 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.406831 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5628\" (UniqueName: \"kubernetes.io/projected/c28d92ab-9264-413d-b83c-a088f773f9d1-kube-api-access-p5628\") pod \"telemeter-client-f69c67d6d-ghrlk\" (UID: \"c28d92ab-9264-413d-b83c-a088f773f9d1\") " pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.493432 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.493361 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" Apr 20 20:08:02.499374 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.499345 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/915408af-5ee3-4c92-a8f5-4cf9059a0be9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rbkcf\" (UID: \"915408af-5ee3-4c92-a8f5-4cf9059a0be9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rbkcf" Apr 20 20:08:02.502268 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.502231 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/915408af-5ee3-4c92-a8f5-4cf9059a0be9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rbkcf\" (UID: \"915408af-5ee3-4c92-a8f5-4cf9059a0be9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rbkcf" Apr 20 20:08:02.682552 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.682510 2574 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rbkcf" Apr 20 20:08:02.907149 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.907124 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rbkcf"] Apr 20 20:08:02.910605 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:08:02.910575 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod915408af_5ee3_4c92_a8f5_4cf9059a0be9.slice/crio-503b8489ec99862e1683ec15c7f29e0e772d0550ebdff811621a3e7cd2f0dfff WatchSource:0}: Error finding container 503b8489ec99862e1683ec15c7f29e0e772d0550ebdff811621a3e7cd2f0dfff: Status 404 returned error can't find the container with id 503b8489ec99862e1683ec15c7f29e0e772d0550ebdff811621a3e7cd2f0dfff Apr 20 20:08:02.937874 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.937816 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-f69c67d6d-ghrlk"] Apr 20 20:08:02.954137 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:08:02.954092 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc28d92ab_9264_413d_b83c_a088f773f9d1.slice/crio-f7936449c42576bd2cb9edc58106d1fbbf083624a589ede6d60bd12da10c2dba WatchSource:0}: Error finding container f7936449c42576bd2cb9edc58106d1fbbf083624a589ede6d60bd12da10c2dba: Status 404 returned error can't find the container with id f7936449c42576bd2cb9edc58106d1fbbf083624a589ede6d60bd12da10c2dba Apr 20 20:08:02.962606 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.962577 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rbkcf" event={"ID":"915408af-5ee3-4c92-a8f5-4cf9059a0be9","Type":"ContainerStarted","Data":"503b8489ec99862e1683ec15c7f29e0e772d0550ebdff811621a3e7cd2f0dfff"} Apr 20 
20:08:02.963632 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.963609 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" event={"ID":"c28d92ab-9264-413d-b83c-a088f773f9d1","Type":"ContainerStarted","Data":"f7936449c42576bd2cb9edc58106d1fbbf083624a589ede6d60bd12da10c2dba"} Apr 20 20:08:02.964947 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:02.964925 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" event={"ID":"5371b507-255d-419d-a381-5da1d311fb71","Type":"ContainerStarted","Data":"b0aec103c488299d4b154957a396990da425bc2a03d80472884abf2502c44ae7"} Apr 20 20:08:03.260597 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.260559 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:08:03.266641 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.266612 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:03.272695 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.272673 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 20:08:03.273248 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.273226 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 20:08:03.273725 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.273649 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 20:08:03.274265 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.273867 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-ei13ab3vfddsj\"" Apr 20 20:08:03.274265 ip-10-0-131-234 kubenswrapper[2574]: I0420 
20:08:03.273883 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 20:08:03.274265 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.273729 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 20:08:03.274265 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.273671 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-2qmfn\"" Apr 20 20:08:03.274265 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.273729 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 20:08:03.274265 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.273762 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 20:08:03.274265 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.273799 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 20:08:03.274265 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.273711 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 20:08:03.274265 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.274205 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 20:08:03.274738 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.274476 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 20:08:03.275699 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.275675 2574 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 20:08:03.285269 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.282001 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:08:03.309421 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.309392 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:03.309630 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.309609 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/24a97961-ac31-4638-8f0a-e43cf467396b-config-out\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:03.309735 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.309644 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:03.309735 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.309692 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: 
\"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:03.309735 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.309718 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:03.309884 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.309764 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:03.309884 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.309794 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:03.309884 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.309826 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:03.309884 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.309860 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.310089 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.309911 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-web-config\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.310089 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.309976 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.310089 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.310021 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jcgk\" (UniqueName: \"kubernetes.io/projected/24a97961-ac31-4638-8f0a-e43cf467396b-kube-api-access-9jcgk\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.310089 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.310071 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.310213 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.310099 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-config\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.310213 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.310123 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.310213 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.310151 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.310213 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.310172 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.310213 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.310200 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/24a97961-ac31-4638-8f0a-e43cf467396b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.410661 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.410615 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/24a97961-ac31-4638-8f0a-e43cf467396b-config-out\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.410839 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.410669 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.410839 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.410713 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.410839 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.410733 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.410839 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.410778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411095 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.410935 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411095 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.410988 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411095 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.411029 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411251 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.411114 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-web-config\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411251 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.411160 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411251 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.411184 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411251 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.411195 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jcgk\" (UniqueName: \"kubernetes.io/projected/24a97961-ac31-4638-8f0a-e43cf467396b-kube-api-access-9jcgk\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411440 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.411285 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411440 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.411318 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-config\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411440 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.411339 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411440 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.411373 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411440 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.411395 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411440 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.411429 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/24a97961-ac31-4638-8f0a-e43cf467396b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411776 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.411459 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411776 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.411574 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.411913 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.411844 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.412422 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.412259 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.414300 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.414245 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.414755 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.414723 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.416134 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.416106 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.417223 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.416611 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.417223 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.416719 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-web-config\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.417223 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.417182 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.418053 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.418008 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/24a97961-ac31-4638-8f0a-e43cf467396b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.418282 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.418076 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.418282 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.418211 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-config\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.418282 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.418303 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.419207 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.419164 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.419537 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.419506 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.419868 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.419843 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/24a97961-ac31-4638-8f0a-e43cf467396b-config-out\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.420219 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.420197 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jcgk\" (UniqueName: \"kubernetes.io/projected/24a97961-ac31-4638-8f0a-e43cf467396b-kube-api-access-9jcgk\") pod \"prometheus-k8s-0\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.584576 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.584491 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:03.781178 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.780980 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 20:08:03.786765 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:08:03.786732 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24a97961_ac31_4638_8f0a_e43cf467396b.slice/crio-da0c06a91d81327c37990a66def23a411d80bd837defc04e4685e36d074cf6e9 WatchSource:0}: Error finding container da0c06a91d81327c37990a66def23a411d80bd837defc04e4685e36d074cf6e9: Status 404 returned error can't find the container with id da0c06a91d81327c37990a66def23a411d80bd837defc04e4685e36d074cf6e9
Apr 20 20:08:03.971457 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.971338 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" event={"ID":"5371b507-255d-419d-a381-5da1d311fb71","Type":"ContainerStarted","Data":"f7cb5b02295b7197dea21e4438c852487d168d1900ff5128e9cf36dd6a3ccceb"}
Apr 20 20:08:03.971457 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.971385 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" event={"ID":"5371b507-255d-419d-a381-5da1d311fb71","Type":"ContainerStarted","Data":"0d15ba42ee1d9777205041f7bac1fa9090aedbf9b6eb0c37990664515cf0c00e"}
Apr 20 20:08:03.972966 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.972912 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"24a97961-ac31-4638-8f0a-e43cf467396b","Type":"ContainerStarted","Data":"da0c06a91d81327c37990a66def23a411d80bd837defc04e4685e36d074cf6e9"}
Apr 20 20:08:03.975207 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.975163 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" event={"ID":"53b5cab7-44b2-44ea-a7a8-e00157572b77","Type":"ContainerStarted","Data":"f7456f2170d6b0353977c539ca1922d2185001a8b0960dbe4129a04b4a1b4f45"}
Apr 20 20:08:03.998859 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:03.998805 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" podStartSLOduration=1.230176863 podStartE2EDuration="2.998784053s" podCreationTimestamp="2026-04-20 20:08:01 +0000 UTC" firstStartedPulling="2026-04-20 20:08:01.889654483 +0000 UTC m=+164.013347549" lastFinishedPulling="2026-04-20 20:08:03.658261658 +0000 UTC m=+165.781954739" observedRunningTime="2026-04-20 20:08:03.997697344 +0000 UTC m=+166.121390447" watchObservedRunningTime="2026-04-20 20:08:03.998784053 +0000 UTC m=+166.122477118"
Apr 20 20:08:04.981933 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:04.981890 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" event={"ID":"5371b507-255d-419d-a381-5da1d311fb71","Type":"ContainerStarted","Data":"64a916cb9d0fc877b3159d4c9ec226a325550a9544bcaeedbd8791ac82a55798"}
Apr 20 20:08:04.981933 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:04.981940 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" event={"ID":"5371b507-255d-419d-a381-5da1d311fb71","Type":"ContainerStarted","Data":"01a201c06178a03c5701499f79cf0a7f6cd496ba2b72345488c45ad73e7c774f"}
Apr 20 20:08:04.982425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:04.981953 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" event={"ID":"5371b507-255d-419d-a381-5da1d311fb71","Type":"ContainerStarted","Data":"df6d346e5948fb6a7c9fdc5d766c9a9c5b6e378b2ed8643beec930c2fb90c48f"}
Apr 20 20:08:05.008047 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:05.007999 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" podStartSLOduration=1.61652098 podStartE2EDuration="5.007980732s" podCreationTimestamp="2026-04-20 20:08:00 +0000 UTC" firstStartedPulling="2026-04-20 20:08:00.665568744 +0000 UTC m=+162.789261815" lastFinishedPulling="2026-04-20 20:08:04.057028486 +0000 UTC m=+166.180721567" observedRunningTime="2026-04-20 20:08:05.005543906 +0000 UTC m=+167.129236995" watchObservedRunningTime="2026-04-20 20:08:05.007980732 +0000 UTC m=+167.131673824"
Apr 20 20:08:05.985986 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:05.985950 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rbkcf" event={"ID":"915408af-5ee3-4c92-a8f5-4cf9059a0be9","Type":"ContainerStarted","Data":"55862ce8bb28b7433d497c28c925afbbf2463c39d3582670c2d00232ff79ddad"}
Apr 20 20:08:05.986464 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:05.986125 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rbkcf"
Apr 20 20:08:05.988088 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:05.988061 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" event={"ID":"c28d92ab-9264-413d-b83c-a088f773f9d1","Type":"ContainerStarted","Data":"5f0449b16121c9fdbffed5fa049b4f13b47b993b54cfb6db7155fa6a8d493a7a"}
Apr 20 20:08:05.988203 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:05.988092 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" event={"ID":"c28d92ab-9264-413d-b83c-a088f773f9d1","Type":"ContainerStarted","Data":"10188ceb2c00d6fb69217634c8fbf74894a71aa97593f72dedf2ac68f34b0e2c"}
Apr 20 20:08:05.988203 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:05.988107 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" event={"ID":"c28d92ab-9264-413d-b83c-a088f773f9d1","Type":"ContainerStarted","Data":"f4738c5fd25f9a398f940b0b611984112e24ece96a5a0aaf090b81550c11c3c7"}
Apr 20 20:08:05.989536 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:05.989509 2574 generic.go:358] "Generic (PLEG): container finished" podID="24a97961-ac31-4638-8f0a-e43cf467396b" containerID="bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83" exitCode=0
Apr 20 20:08:05.989666 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:05.989641 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"24a97961-ac31-4638-8f0a-e43cf467396b","Type":"ContainerDied","Data":"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83"}
Apr 20 20:08:05.990015 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:05.989994 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh"
Apr 20 20:08:05.992359 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:05.992344 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rbkcf"
Apr 20 20:08:06.006709 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.006665 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rbkcf" podStartSLOduration=2.36289429 podStartE2EDuration="5.006654952s" podCreationTimestamp="2026-04-20 20:08:01 +0000 UTC" firstStartedPulling="2026-04-20 20:08:02.912880428 +0000 UTC m=+165.036573497" lastFinishedPulling="2026-04-20 20:08:05.556641088 +0000 UTC m=+167.680334159" observedRunningTime="2026-04-20 20:08:06.006422818 +0000 UTC m=+168.130115908" watchObservedRunningTime="2026-04-20 20:08:06.006654952 +0000 UTC m=+168.130348039"
Apr 20 20:08:06.027190 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.027150 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-f69c67d6d-ghrlk" podStartSLOduration=1.422279672 podStartE2EDuration="4.027140071s" podCreationTimestamp="2026-04-20 20:08:02 +0000 UTC" firstStartedPulling="2026-04-20 20:08:02.956094626 +0000 UTC m=+165.079787691" lastFinishedPulling="2026-04-20 20:08:05.560955025 +0000 UTC m=+167.684648090" observedRunningTime="2026-04-20 20:08:06.02624989 +0000 UTC m=+168.149942991" watchObservedRunningTime="2026-04-20 20:08:06.027140071 +0000 UTC m=+168.150833160"
Apr 20 20:08:06.643835 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.643797 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" podUID="1b6e5b82-3387-4fed-b751-81a011f3b96b" containerName="registry" containerID="cri-o://f5ae08a95bd2bb970eb29a38e8c9306302718ba5ddcb8c7540662a757f442a0d" gracePeriod=30
Apr 20 20:08:06.900500 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.900438 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20 20:08:06.951431 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.951397 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b6e5b82-3387-4fed-b751-81a011f3b96b-trusted-ca\") pod \"1b6e5b82-3387-4fed-b751-81a011f3b96b\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") "
Apr 20 20:08:06.951596 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.951464 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b6e5b82-3387-4fed-b751-81a011f3b96b-installation-pull-secrets\") pod \"1b6e5b82-3387-4fed-b751-81a011f3b96b\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") "
Apr 20 20:08:06.951596 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.951503 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b6e5b82-3387-4fed-b751-81a011f3b96b-ca-trust-extracted\") pod \"1b6e5b82-3387-4fed-b751-81a011f3b96b\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") "
Apr 20 20:08:06.951596 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.951536 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-bound-sa-token\") pod \"1b6e5b82-3387-4fed-b751-81a011f3b96b\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") "
Apr 20 20:08:06.951596 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.951574 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwxzs\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-kube-api-access-pwxzs\") pod \"1b6e5b82-3387-4fed-b751-81a011f3b96b\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") "
Apr 20 20:08:06.951822 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.951608 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls\") pod \"1b6e5b82-3387-4fed-b751-81a011f3b96b\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") "
Apr 20 20:08:06.951822 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.951632 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-certificates\") pod \"1b6e5b82-3387-4fed-b751-81a011f3b96b\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") "
Apr 20 20:08:06.951822 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.951663 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b6e5b82-3387-4fed-b751-81a011f3b96b-image-registry-private-configuration\") pod \"1b6e5b82-3387-4fed-b751-81a011f3b96b\" (UID: \"1b6e5b82-3387-4fed-b751-81a011f3b96b\") "
Apr 20 20:08:06.952010 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.951894 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b6e5b82-3387-4fed-b751-81a011f3b96b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1b6e5b82-3387-4fed-b751-81a011f3b96b" (UID: "1b6e5b82-3387-4fed-b751-81a011f3b96b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:08:06.952323 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.952266 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1b6e5b82-3387-4fed-b751-81a011f3b96b" (UID: "1b6e5b82-3387-4fed-b751-81a011f3b96b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:08:06.954609 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.954527 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b6e5b82-3387-4fed-b751-81a011f3b96b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1b6e5b82-3387-4fed-b751-81a011f3b96b" (UID: "1b6e5b82-3387-4fed-b751-81a011f3b96b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:08:06.954726 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.954707 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b6e5b82-3387-4fed-b751-81a011f3b96b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "1b6e5b82-3387-4fed-b751-81a011f3b96b" (UID: "1b6e5b82-3387-4fed-b751-81a011f3b96b"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:08:06.955013 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.954970 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-kube-api-access-pwxzs" (OuterVolumeSpecName: "kube-api-access-pwxzs") pod "1b6e5b82-3387-4fed-b751-81a011f3b96b" (UID: "1b6e5b82-3387-4fed-b751-81a011f3b96b"). InnerVolumeSpecName "kube-api-access-pwxzs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:08:06.955278 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.955255 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1b6e5b82-3387-4fed-b751-81a011f3b96b" (UID: "1b6e5b82-3387-4fed-b751-81a011f3b96b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:08:06.956247 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.956221 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1b6e5b82-3387-4fed-b751-81a011f3b96b" (UID: "1b6e5b82-3387-4fed-b751-81a011f3b96b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:08:06.962636 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.962608 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b6e5b82-3387-4fed-b751-81a011f3b96b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1b6e5b82-3387-4fed-b751-81a011f3b96b" (UID: "1b6e5b82-3387-4fed-b751-81a011f3b96b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:08:06.994560 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.994509 2574 generic.go:358] "Generic (PLEG): container finished" podID="1b6e5b82-3387-4fed-b751-81a011f3b96b" containerID="f5ae08a95bd2bb970eb29a38e8c9306302718ba5ddcb8c7540662a757f442a0d" exitCode=0
Apr 20 20:08:06.994988 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.994601 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" event={"ID":"1b6e5b82-3387-4fed-b751-81a011f3b96b","Type":"ContainerDied","Data":"f5ae08a95bd2bb970eb29a38e8c9306302718ba5ddcb8c7540662a757f442a0d"}
Apr 20 20:08:06.994988 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.994618 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d49c56f64-txz9l"
Apr 20 20:08:06.994988 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.994643 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d49c56f64-txz9l" event={"ID":"1b6e5b82-3387-4fed-b751-81a011f3b96b","Type":"ContainerDied","Data":"b562bcb50512a33907c7ed3cf93b555b38cd0d09b58ba67c85fddc840a587358"}
Apr 20 20:08:06.994988 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:06.994663 2574 scope.go:117] "RemoveContainer" containerID="f5ae08a95bd2bb970eb29a38e8c9306302718ba5ddcb8c7540662a757f442a0d"
Apr 20 20:08:07.004103 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:07.004086 2574 scope.go:117] "RemoveContainer" containerID="f5ae08a95bd2bb970eb29a38e8c9306302718ba5ddcb8c7540662a757f442a0d"
Apr 20 20:08:07.004458 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:08:07.004428 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ae08a95bd2bb970eb29a38e8c9306302718ba5ddcb8c7540662a757f442a0d\": container with ID starting with f5ae08a95bd2bb970eb29a38e8c9306302718ba5ddcb8c7540662a757f442a0d not found: ID does not exist" containerID="f5ae08a95bd2bb970eb29a38e8c9306302718ba5ddcb8c7540662a757f442a0d"
Apr 20 20:08:07.004558 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:07.004468 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ae08a95bd2bb970eb29a38e8c9306302718ba5ddcb8c7540662a757f442a0d"} err="failed to get container status \"f5ae08a95bd2bb970eb29a38e8c9306302718ba5ddcb8c7540662a757f442a0d\": rpc error: code = NotFound desc = could not find container \"f5ae08a95bd2bb970eb29a38e8c9306302718ba5ddcb8c7540662a757f442a0d\": container with ID starting with f5ae08a95bd2bb970eb29a38e8c9306302718ba5ddcb8c7540662a757f442a0d not found: ID does not exist"
Apr 20 20:08:07.018398 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:07.018375 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d49c56f64-txz9l"]
Apr 20 20:08:07.022450 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:07.022428 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5d49c56f64-txz9l"]
Apr 20 20:08:07.053201 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:07.053175 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b6e5b82-3387-4fed-b751-81a011f3b96b-trusted-ca\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\""
Apr 20 20:08:07.053327 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:07.053204 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b6e5b82-3387-4fed-b751-81a011f3b96b-installation-pull-secrets\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\""
Apr 20 20:08:07.053327 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:07.053220 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b6e5b82-3387-4fed-b751-81a011f3b96b-ca-trust-extracted\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\""
Apr 20 20:08:07.053327 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:07.053234 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-bound-sa-token\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\""
Apr 20 20:08:07.053327 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:07.053248 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pwxzs\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-kube-api-access-pwxzs\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\""
Apr 20 20:08:07.053327 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:07.053262 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\""
Apr 20 20:08:07.053327 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:07.053289 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b6e5b82-3387-4fed-b751-81a011f3b96b-registry-certificates\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\""
Apr 20 20:08:07.053327 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:07.053313 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b6e5b82-3387-4fed-b751-81a011f3b96b-image-registry-private-configuration\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\""
Apr 20 20:08:07.461001 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:07.460962 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8"
Apr 20 20:08:08.462635 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:08.462603 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mwhhz" Apr 20 20:08:08.463085 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:08.462955 2574 scope.go:117] "RemoveContainer" containerID="836f95ae732f816623ee8a4249735f0526a29e7d68ef992cad9da28273a2a9c4" Apr 20 20:08:08.465008 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:08.464984 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b6e5b82-3387-4fed-b751-81a011f3b96b" path="/var/lib/kubelet/pods/1b6e5b82-3387-4fed-b751-81a011f3b96b/volumes" Apr 20 20:08:08.465403 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:08.465377 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5cv42\"" Apr 20 20:08:08.473395 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:08.473373 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mwhhz" Apr 20 20:08:09.004453 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:09.004422 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mwhhz"] Apr 20 20:08:09.005175 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:09.005070 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log" Apr 20 20:08:09.005175 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:09.005156 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" event={"ID":"ee1fb92b-6a4c-4495-93a7-29206e6b8642","Type":"ContainerStarted","Data":"ba5e20ffb42eb7984a3a74631b29e7e70db237a501756fc52966a06aed5cf918"} Apr 20 20:08:09.005456 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:09.005421 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" Apr 20 20:08:09.006531 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:09.006488 2574 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-rxlpd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.134.0.11:8443/readyz\": dial tcp 10.134.0.11:8443: connect: connection refused" start-of-body= Apr 20 20:08:09.006620 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:09.006536 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" podUID="ee1fb92b-6a4c-4495-93a7-29206e6b8642" containerName="console-operator" probeResult="failure" output="Get \"https://10.134.0.11:8443/readyz\": dial tcp 10.134.0.11:8443: connect: connection refused" Apr 20 20:08:09.008241 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:08:09.008190 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda8e59f9_df8a_4e18_98ec_09373ec8bee1.slice/crio-850ab801699b6582fdbc677861be35c42466e5135eae391762282426ca36f80c WatchSource:0}: Error finding container 850ab801699b6582fdbc677861be35c42466e5135eae391762282426ca36f80c: Status 404 returned error can't find the container with id 850ab801699b6582fdbc677861be35c42466e5135eae391762282426ca36f80c Apr 20 20:08:09.022364 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:09.022325 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" podStartSLOduration=53.600267087 podStartE2EDuration="56.022310441s" podCreationTimestamp="2026-04-20 20:07:13 +0000 UTC" firstStartedPulling="2026-04-20 20:07:13.875461443 +0000 UTC m=+115.999154509" lastFinishedPulling="2026-04-20 20:07:16.297504794 +0000 UTC m=+118.421197863" observedRunningTime="2026-04-20 20:08:09.021400186 +0000 UTC m=+171.145093290" 
watchObservedRunningTime="2026-04-20 20:08:09.022310441 +0000 UTC m=+171.146003523" Apr 20 20:08:10.010023 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:10.009973 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mwhhz" event={"ID":"da8e59f9-df8a-4e18-98ec-09373ec8bee1","Type":"ContainerStarted","Data":"850ab801699b6582fdbc677861be35c42466e5135eae391762282426ca36f80c"} Apr 20 20:08:10.013482 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:10.013449 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"24a97961-ac31-4638-8f0a-e43cf467396b","Type":"ContainerStarted","Data":"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b"} Apr 20 20:08:10.013618 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:10.013491 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"24a97961-ac31-4638-8f0a-e43cf467396b","Type":"ContainerStarted","Data":"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7"} Apr 20 20:08:10.013618 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:10.013506 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"24a97961-ac31-4638-8f0a-e43cf467396b","Type":"ContainerStarted","Data":"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b"} Apr 20 20:08:10.013618 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:10.013522 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"24a97961-ac31-4638-8f0a-e43cf467396b","Type":"ContainerStarted","Data":"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e"} Apr 20 20:08:10.013618 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:10.013533 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"24a97961-ac31-4638-8f0a-e43cf467396b","Type":"ContainerStarted","Data":"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f"} Apr 20 20:08:10.013618 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:10.013544 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"24a97961-ac31-4638-8f0a-e43cf467396b","Type":"ContainerStarted","Data":"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d"} Apr 20 20:08:10.018743 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:10.018717 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-rxlpd" Apr 20 20:08:10.061058 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:10.060988 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.90534304 podStartE2EDuration="7.060971086s" podCreationTimestamp="2026-04-20 20:08:03 +0000 UTC" firstStartedPulling="2026-04-20 20:08:03.789324956 +0000 UTC m=+165.913018037" lastFinishedPulling="2026-04-20 20:08:08.944953014 +0000 UTC m=+171.068646083" observedRunningTime="2026-04-20 20:08:10.059813309 +0000 UTC m=+172.183506410" watchObservedRunningTime="2026-04-20 20:08:10.060971086 +0000 UTC m=+172.184664173" Apr 20 20:08:11.018547 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:11.018509 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mwhhz" event={"ID":"da8e59f9-df8a-4e18-98ec-09373ec8bee1","Type":"ContainerStarted","Data":"27013e214a8c05912876eca67f7c90f2af8905cf10910f991cf730ef22cd7373"} Apr 20 20:08:11.039745 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:11.039696 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mwhhz" podStartSLOduration=138.494692237 podStartE2EDuration="2m20.039681334s" podCreationTimestamp="2026-04-20 
20:05:51 +0000 UTC" firstStartedPulling="2026-04-20 20:08:09.010189198 +0000 UTC m=+171.133882268" lastFinishedPulling="2026-04-20 20:08:10.555178299 +0000 UTC m=+172.678871365" observedRunningTime="2026-04-20 20:08:11.038323995 +0000 UTC m=+173.162017087" watchObservedRunningTime="2026-04-20 20:08:11.039681334 +0000 UTC m=+173.163374478" Apr 20 20:08:12.002054 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:12.002005 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-b8bf87fb7-rhsfh" Apr 20 20:08:12.978322 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:12.978284 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bh6cg" Apr 20 20:08:13.585197 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:13.585160 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:21.732265 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:21.732229 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:21.732654 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:21.732274 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:22.058499 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:22.058465 2574 generic.go:358] "Generic (PLEG): container finished" podID="78447207-5d22-46b9-9ad3-a68cd998c91a" containerID="eeb3bc1d024f5bb628b154ccec760d817cd4bd73f35cbed28bb927dd33734da7" exitCode=0 Apr 20 20:08:22.058621 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:22.058531 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-74t9l" 
event={"ID":"78447207-5d22-46b9-9ad3-a68cd998c91a","Type":"ContainerDied","Data":"eeb3bc1d024f5bb628b154ccec760d817cd4bd73f35cbed28bb927dd33734da7"} Apr 20 20:08:22.058914 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:22.058896 2574 scope.go:117] "RemoveContainer" containerID="eeb3bc1d024f5bb628b154ccec760d817cd4bd73f35cbed28bb927dd33734da7" Apr 20 20:08:23.063617 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:23.063579 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-74t9l" event={"ID":"78447207-5d22-46b9-9ad3-a68cd998c91a","Type":"ContainerStarted","Data":"1530b25b7ed48f45cc6faad7ea87bc92b8a686fb6ee3fb2714966a62e81f2c67"} Apr 20 20:08:41.737644 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:41.737608 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:08:41.741515 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:08:41.741491 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-68c68f5db6-2x85q" Apr 20 20:09:03.585692 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:03.585659 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:03.605347 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:03.605323 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:04.196504 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:04.196478 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:21.641817 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:21.641781 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:09:21.642999 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:21.642937 2574 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="prometheus" containerID="cri-o://b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d" gracePeriod=600 Apr 20 20:09:21.643231 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:21.642972 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="config-reloader" containerID="cri-o://56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f" gracePeriod=600 Apr 20 20:09:21.643351 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:21.642969 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="kube-rbac-proxy" containerID="cri-o://e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7" gracePeriod=600 Apr 20 20:09:21.643351 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:21.642990 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="thanos-sidecar" containerID="cri-o://8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e" gracePeriod=600 Apr 20 20:09:21.643351 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:21.643012 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="kube-rbac-proxy-web" containerID="cri-o://0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b" gracePeriod=600 Apr 20 20:09:21.643505 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:21.643018 2574 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="kube-rbac-proxy-thanos" containerID="cri-o://60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b" gracePeriod=600 Apr 20 20:09:21.888427 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:21.888404 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.005764 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.005684 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/24a97961-ac31-4638-8f0a-e43cf467396b-config-out\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.005764 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.005715 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/24a97961-ac31-4638-8f0a-e43cf467396b-tls-assets\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.005764 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.005756 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.006050 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.005777 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-k8s-rulefiles-0\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 
20:09:22.006050 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.005794 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-tls\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.006050 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.005808 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-grpc-tls\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.006050 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.005834 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-metrics-client-ca\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.006050 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.005858 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-serving-certs-ca-bundle\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.006050 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.005891 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-web-config\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.006050 ip-10-0-131-234 kubenswrapper[2574]: I0420 
20:09:22.005917 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-kube-rbac-proxy\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.006050 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.005945 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-config\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.006050 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.005971 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jcgk\" (UniqueName: \"kubernetes.io/projected/24a97961-ac31-4638-8f0a-e43cf467396b-kube-api-access-9jcgk\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.006050 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.006001 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-thanos-prometheus-http-client-file\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.006724 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.006085 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.006724 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.006136 2574 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-kubelet-serving-ca-bundle\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.006724 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.006175 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-k8s-db\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.006724 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.006200 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-trusted-ca-bundle\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.006724 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.006232 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-metrics-client-certs\") pod \"24a97961-ac31-4638-8f0a-e43cf467396b\" (UID: \"24a97961-ac31-4638-8f0a-e43cf467396b\") " Apr 20 20:09:22.009512 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.007071 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:22.009512 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.007444 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:22.009512 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.007523 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:22.009512 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.008972 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:09:22.009512 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.009075 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a97961-ac31-4638-8f0a-e43cf467396b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:22.009512 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.009139 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:22.009512 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.009338 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:22.009512 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.009394 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:22.009512 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.009528 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a97961-ac31-4638-8f0a-e43cf467396b-config-out" (OuterVolumeSpecName: "config-out") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:09:22.010284 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.009650 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:22.010864 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.010828 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:22.010992 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.010902 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:22.011357 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.011324 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a97961-ac31-4638-8f0a-e43cf467396b-kube-api-access-9jcgk" (OuterVolumeSpecName: "kube-api-access-9jcgk") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "kube-api-access-9jcgk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:22.011541 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.011516 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:22.012054 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.012005 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:22.012520 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.012501 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-config" (OuterVolumeSpecName: "config") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:22.012587 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.012545 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:22.022383 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.022344 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-web-config" (OuterVolumeSpecName: "web-config") pod "24a97961-ac31-4638-8f0a-e43cf467396b" (UID: "24a97961-ac31-4638-8f0a-e43cf467396b"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:22.107620 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107587 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.107620 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107615 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.107806 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107630 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.107806 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107643 2574 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-grpc-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.107806 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107655 2574 reconciler_common.go:299] "Volume detached for volume 
\"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-metrics-client-ca\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.107806 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107668 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.107806 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107682 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-web-config\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.107806 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107694 2574 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-kube-rbac-proxy\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.107806 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107707 2574 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-config\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.107806 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107720 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9jcgk\" (UniqueName: \"kubernetes.io/projected/24a97961-ac31-4638-8f0a-e43cf467396b-kube-api-access-9jcgk\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.107806 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107732 2574 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-thanos-prometheus-http-client-file\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.107806 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107747 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.107806 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107762 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.107806 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107775 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-k8s-db\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.107806 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107790 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a97961-ac31-4638-8f0a-e43cf467396b-prometheus-trusted-ca-bundle\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.107806 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107803 2574 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/24a97961-ac31-4638-8f0a-e43cf467396b-secret-metrics-client-certs\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.108246 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107817 2574 reconciler_common.go:299] "Volume detached for 
volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/24a97961-ac31-4638-8f0a-e43cf467396b-config-out\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.108246 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.107830 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/24a97961-ac31-4638-8f0a-e43cf467396b-tls-assets\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:09:22.233088 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.233056 2574 generic.go:358] "Generic (PLEG): container finished" podID="24a97961-ac31-4638-8f0a-e43cf467396b" containerID="60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b" exitCode=0 Apr 20 20:09:22.233088 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.233093 2574 generic.go:358] "Generic (PLEG): container finished" podID="24a97961-ac31-4638-8f0a-e43cf467396b" containerID="e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7" exitCode=0 Apr 20 20:09:22.233253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.233100 2574 generic.go:358] "Generic (PLEG): container finished" podID="24a97961-ac31-4638-8f0a-e43cf467396b" containerID="0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b" exitCode=0 Apr 20 20:09:22.233253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.233106 2574 generic.go:358] "Generic (PLEG): container finished" podID="24a97961-ac31-4638-8f0a-e43cf467396b" containerID="8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e" exitCode=0 Apr 20 20:09:22.233253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.233111 2574 generic.go:358] "Generic (PLEG): container finished" podID="24a97961-ac31-4638-8f0a-e43cf467396b" containerID="56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f" exitCode=0 Apr 20 20:09:22.233253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.233116 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="24a97961-ac31-4638-8f0a-e43cf467396b" containerID="b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d" exitCode=0 Apr 20 20:09:22.233253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.233066 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"24a97961-ac31-4638-8f0a-e43cf467396b","Type":"ContainerDied","Data":"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b"} Apr 20 20:09:22.233253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.233204 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"24a97961-ac31-4638-8f0a-e43cf467396b","Type":"ContainerDied","Data":"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7"} Apr 20 20:09:22.233253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.233219 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"24a97961-ac31-4638-8f0a-e43cf467396b","Type":"ContainerDied","Data":"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b"} Apr 20 20:09:22.233253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.233229 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"24a97961-ac31-4638-8f0a-e43cf467396b","Type":"ContainerDied","Data":"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e"} Apr 20 20:09:22.233253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.233238 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"24a97961-ac31-4638-8f0a-e43cf467396b","Type":"ContainerDied","Data":"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f"} Apr 20 20:09:22.233253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.233168 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.233253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.233257 2574 scope.go:117] "RemoveContainer" containerID="60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b" Apr 20 20:09:22.233626 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.233247 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"24a97961-ac31-4638-8f0a-e43cf467396b","Type":"ContainerDied","Data":"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d"} Apr 20 20:09:22.233626 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.233345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"24a97961-ac31-4638-8f0a-e43cf467396b","Type":"ContainerDied","Data":"da0c06a91d81327c37990a66def23a411d80bd837defc04e4685e36d074cf6e9"} Apr 20 20:09:22.241079 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.240906 2574 scope.go:117] "RemoveContainer" containerID="e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7" Apr 20 20:09:22.247946 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.247930 2574 scope.go:117] "RemoveContainer" containerID="0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b" Apr 20 20:09:22.254754 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.254735 2574 scope.go:117] "RemoveContainer" containerID="8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e" Apr 20 20:09:22.256399 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.256353 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:09:22.260566 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.260543 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:09:22.262358 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.262341 2574 scope.go:117] 
"RemoveContainer" containerID="56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f" Apr 20 20:09:22.268561 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.268545 2574 scope.go:117] "RemoveContainer" containerID="b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d" Apr 20 20:09:22.275308 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.275288 2574 scope.go:117] "RemoveContainer" containerID="bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83" Apr 20 20:09:22.281558 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.281541 2574 scope.go:117] "RemoveContainer" containerID="60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b" Apr 20 20:09:22.281833 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:09:22.281814 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b\": container with ID starting with 60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b not found: ID does not exist" containerID="60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b" Apr 20 20:09:22.281883 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.281841 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b"} err="failed to get container status \"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b\": rpc error: code = NotFound desc = could not find container \"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b\": container with ID starting with 60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b not found: ID does not exist" Apr 20 20:09:22.281883 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.281859 2574 scope.go:117] "RemoveContainer" containerID="e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7" Apr 20 
20:09:22.282133 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:09:22.282115 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7\": container with ID starting with e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7 not found: ID does not exist" containerID="e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7" Apr 20 20:09:22.282179 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.282145 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7"} err="failed to get container status \"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7\": rpc error: code = NotFound desc = could not find container \"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7\": container with ID starting with e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7 not found: ID does not exist" Apr 20 20:09:22.282179 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.282164 2574 scope.go:117] "RemoveContainer" containerID="0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b" Apr 20 20:09:22.282412 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:09:22.282398 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b\": container with ID starting with 0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b not found: ID does not exist" containerID="0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b" Apr 20 20:09:22.282470 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.282417 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b"} err="failed to get container status \"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b\": rpc error: code = NotFound desc = could not find container \"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b\": container with ID starting with 0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b not found: ID does not exist" Apr 20 20:09:22.282470 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.282432 2574 scope.go:117] "RemoveContainer" containerID="8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e" Apr 20 20:09:22.282686 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:09:22.282667 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e\": container with ID starting with 8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e not found: ID does not exist" containerID="8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e" Apr 20 20:09:22.282727 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.282692 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e"} err="failed to get container status \"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e\": rpc error: code = NotFound desc = could not find container \"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e\": container with ID starting with 8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e not found: ID does not exist" Apr 20 20:09:22.282727 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.282708 2574 scope.go:117] "RemoveContainer" containerID="56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f" Apr 20 20:09:22.282972 ip-10-0-131-234 
kubenswrapper[2574]: E0420 20:09:22.282921 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f\": container with ID starting with 56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f not found: ID does not exist" containerID="56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f" Apr 20 20:09:22.282972 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.282942 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f"} err="failed to get container status \"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f\": rpc error: code = NotFound desc = could not find container \"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f\": container with ID starting with 56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f not found: ID does not exist" Apr 20 20:09:22.282972 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.282963 2574 scope.go:117] "RemoveContainer" containerID="b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d" Apr 20 20:09:22.283376 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:09:22.283311 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d\": container with ID starting with b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d not found: ID does not exist" containerID="b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d" Apr 20 20:09:22.283376 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.283334 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d"} 
err="failed to get container status \"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d\": rpc error: code = NotFound desc = could not find container \"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d\": container with ID starting with b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d not found: ID does not exist" Apr 20 20:09:22.283376 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.283357 2574 scope.go:117] "RemoveContainer" containerID="bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83" Apr 20 20:09:22.283610 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:09:22.283587 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83\": container with ID starting with bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83 not found: ID does not exist" containerID="bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83" Apr 20 20:09:22.283655 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.283619 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83"} err="failed to get container status \"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83\": rpc error: code = NotFound desc = could not find container \"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83\": container with ID starting with bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83 not found: ID does not exist" Apr 20 20:09:22.283655 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.283640 2574 scope.go:117] "RemoveContainer" containerID="60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b" Apr 20 20:09:22.283956 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.283933 2574 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b"} err="failed to get container status \"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b\": rpc error: code = NotFound desc = could not find container \"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b\": container with ID starting with 60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b not found: ID does not exist" Apr 20 20:09:22.283956 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.283958 2574 scope.go:117] "RemoveContainer" containerID="e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7" Apr 20 20:09:22.284187 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.284168 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7"} err="failed to get container status \"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7\": rpc error: code = NotFound desc = could not find container \"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7\": container with ID starting with e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7 not found: ID does not exist" Apr 20 20:09:22.284251 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.284188 2574 scope.go:117] "RemoveContainer" containerID="0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b" Apr 20 20:09:22.284385 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.284360 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b"} err="failed to get container status \"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b\": rpc error: code = NotFound desc = could not find container \"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b\": container with ID starting with 
0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b not found: ID does not exist" Apr 20 20:09:22.284385 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.284384 2574 scope.go:117] "RemoveContainer" containerID="8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e" Apr 20 20:09:22.284558 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.284543 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e"} err="failed to get container status \"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e\": rpc error: code = NotFound desc = could not find container \"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e\": container with ID starting with 8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e not found: ID does not exist" Apr 20 20:09:22.284638 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.284557 2574 scope.go:117] "RemoveContainer" containerID="56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f" Apr 20 20:09:22.284804 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.284785 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f"} err="failed to get container status \"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f\": rpc error: code = NotFound desc = could not find container \"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f\": container with ID starting with 56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f not found: ID does not exist" Apr 20 20:09:22.284863 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.284807 2574 scope.go:117] "RemoveContainer" containerID="b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d" Apr 20 20:09:22.285058 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285017 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d"} err="failed to get container status \"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d\": rpc error: code = NotFound desc = could not find container \"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d\": container with ID starting with b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d not found: ID does not exist" Apr 20 20:09:22.285131 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285060 2574 scope.go:117] "RemoveContainer" containerID="bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83" Apr 20 20:09:22.285245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285227 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:09:22.285299 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285263 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83"} err="failed to get container status \"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83\": rpc error: code = NotFound desc = could not find container \"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83\": container with ID starting with bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83 not found: ID does not exist" Apr 20 20:09:22.285299 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285277 2574 scope.go:117] "RemoveContainer" containerID="60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b" Apr 20 20:09:22.285484 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285470 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b"} err="failed to get container status 
\"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b\": rpc error: code = NotFound desc = could not find container \"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b\": container with ID starting with 60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b not found: ID does not exist" Apr 20 20:09:22.285542 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285484 2574 scope.go:117] "RemoveContainer" containerID="e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7" Apr 20 20:09:22.285600 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285582 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="init-config-reloader" Apr 20 20:09:22.285650 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285605 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="init-config-reloader" Apr 20 20:09:22.285650 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285620 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="kube-rbac-proxy" Apr 20 20:09:22.285650 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285630 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="kube-rbac-proxy" Apr 20 20:09:22.285650 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285644 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="prometheus" Apr 20 20:09:22.285650 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285643 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7"} err="failed to get container status 
\"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7\": rpc error: code = NotFound desc = could not find container \"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7\": container with ID starting with e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7 not found: ID does not exist" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285653 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="prometheus" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285656 2574 scope.go:117] "RemoveContainer" containerID="0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285662 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="kube-rbac-proxy-thanos" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285673 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="kube-rbac-proxy-thanos" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285692 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="config-reloader" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285701 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="config-reloader" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285709 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="thanos-sidecar" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285717 2574 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="thanos-sidecar" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285732 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b6e5b82-3387-4fed-b751-81a011f3b96b" containerName="registry" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285741 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b6e5b82-3387-4fed-b751-81a011f3b96b" containerName="registry" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285751 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="kube-rbac-proxy-web" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285760 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="kube-rbac-proxy-web" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285831 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="kube-rbac-proxy-thanos" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285846 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="thanos-sidecar" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285846 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b"} err="failed to get container status \"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b\": rpc error: code = NotFound desc = could not find container \"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b\": container with ID starting with 
0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b not found: ID does not exist" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285858 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="kube-rbac-proxy-web" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285863 2574 scope.go:117] "RemoveContainer" containerID="8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285869 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b6e5b82-3387-4fed-b751-81a011f3b96b" containerName="registry" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285879 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="config-reloader" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285890 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="kube-rbac-proxy" Apr 20 20:09:22.285892 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.285902 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" containerName="prometheus" Apr 20 20:09:22.286859 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.286023 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e"} err="failed to get container status \"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e\": rpc error: code = NotFound desc = could not find container \"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e\": container with ID starting with 8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e not found: ID does not 
exist" Apr 20 20:09:22.286859 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.286053 2574 scope.go:117] "RemoveContainer" containerID="56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f" Apr 20 20:09:22.286859 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.286249 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f"} err="failed to get container status \"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f\": rpc error: code = NotFound desc = could not find container \"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f\": container with ID starting with 56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f not found: ID does not exist" Apr 20 20:09:22.286859 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.286264 2574 scope.go:117] "RemoveContainer" containerID="b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d" Apr 20 20:09:22.286859 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.286437 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d"} err="failed to get container status \"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d\": rpc error: code = NotFound desc = could not find container \"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d\": container with ID starting with b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d not found: ID does not exist" Apr 20 20:09:22.286859 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.286451 2574 scope.go:117] "RemoveContainer" containerID="bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83" Apr 20 20:09:22.286859 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.286623 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83"} err="failed to get container status \"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83\": rpc error: code = NotFound desc = could not find container \"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83\": container with ID starting with bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83 not found: ID does not exist" Apr 20 20:09:22.286859 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.286637 2574 scope.go:117] "RemoveContainer" containerID="60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b" Apr 20 20:09:22.286859 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.286818 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b"} err="failed to get container status \"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b\": rpc error: code = NotFound desc = could not find container \"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b\": container with ID starting with 60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b not found: ID does not exist" Apr 20 20:09:22.286859 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.286833 2574 scope.go:117] "RemoveContainer" containerID="e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7" Apr 20 20:09:22.287361 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.287023 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7"} err="failed to get container status \"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7\": rpc error: code = NotFound desc = could not find container \"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7\": container with ID starting with 
e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7 not found: ID does not exist" Apr 20 20:09:22.287361 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.287052 2574 scope.go:117] "RemoveContainer" containerID="0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b" Apr 20 20:09:22.287361 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.287241 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b"} err="failed to get container status \"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b\": rpc error: code = NotFound desc = could not find container \"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b\": container with ID starting with 0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b not found: ID does not exist" Apr 20 20:09:22.287361 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.287257 2574 scope.go:117] "RemoveContainer" containerID="8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e" Apr 20 20:09:22.287580 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.287430 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e"} err="failed to get container status \"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e\": rpc error: code = NotFound desc = could not find container \"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e\": container with ID starting with 8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e not found: ID does not exist" Apr 20 20:09:22.287580 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.287443 2574 scope.go:117] "RemoveContainer" containerID="56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f" Apr 20 20:09:22.287644 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.287618 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f"} err="failed to get container status \"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f\": rpc error: code = NotFound desc = could not find container \"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f\": container with ID starting with 56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f not found: ID does not exist" Apr 20 20:09:22.287644 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.287634 2574 scope.go:117] "RemoveContainer" containerID="b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d" Apr 20 20:09:22.287840 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.287824 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d"} err="failed to get container status \"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d\": rpc error: code = NotFound desc = could not find container \"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d\": container with ID starting with b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d not found: ID does not exist" Apr 20 20:09:22.287840 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.287839 2574 scope.go:117] "RemoveContainer" containerID="bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83" Apr 20 20:09:22.288005 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.287991 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83"} err="failed to get container status \"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83\": rpc error: code = NotFound desc = could not find container 
\"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83\": container with ID starting with bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83 not found: ID does not exist" Apr 20 20:09:22.288068 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.288005 2574 scope.go:117] "RemoveContainer" containerID="60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b" Apr 20 20:09:22.288220 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.288200 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b"} err="failed to get container status \"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b\": rpc error: code = NotFound desc = could not find container \"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b\": container with ID starting with 60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b not found: ID does not exist" Apr 20 20:09:22.288278 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.288223 2574 scope.go:117] "RemoveContainer" containerID="e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7" Apr 20 20:09:22.288420 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.288403 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7"} err="failed to get container status \"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7\": rpc error: code = NotFound desc = could not find container \"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7\": container with ID starting with e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7 not found: ID does not exist" Apr 20 20:09:22.288478 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.288421 2574 scope.go:117] "RemoveContainer" 
containerID="0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b" Apr 20 20:09:22.288612 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.288595 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b"} err="failed to get container status \"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b\": rpc error: code = NotFound desc = could not find container \"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b\": container with ID starting with 0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b not found: ID does not exist" Apr 20 20:09:22.288652 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.288612 2574 scope.go:117] "RemoveContainer" containerID="8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e" Apr 20 20:09:22.288815 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.288797 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e"} err="failed to get container status \"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e\": rpc error: code = NotFound desc = could not find container \"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e\": container with ID starting with 8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e not found: ID does not exist" Apr 20 20:09:22.288940 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.288817 2574 scope.go:117] "RemoveContainer" containerID="56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f" Apr 20 20:09:22.289051 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.288988 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f"} err="failed to get container status 
\"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f\": rpc error: code = NotFound desc = could not find container \"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f\": container with ID starting with 56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f not found: ID does not exist" Apr 20 20:09:22.289051 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.289008 2574 scope.go:117] "RemoveContainer" containerID="b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d" Apr 20 20:09:22.289285 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.289263 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d"} err="failed to get container status \"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d\": rpc error: code = NotFound desc = could not find container \"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d\": container with ID starting with b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d not found: ID does not exist" Apr 20 20:09:22.289285 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.289285 2574 scope.go:117] "RemoveContainer" containerID="bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83" Apr 20 20:09:22.289527 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.289499 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83"} err="failed to get container status \"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83\": rpc error: code = NotFound desc = could not find container \"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83\": container with ID starting with bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83 not found: ID does not exist" Apr 20 20:09:22.289527 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:09:22.289526 2574 scope.go:117] "RemoveContainer" containerID="60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b" Apr 20 20:09:22.289789 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.289766 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b"} err="failed to get container status \"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b\": rpc error: code = NotFound desc = could not find container \"60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b\": container with ID starting with 60667920b61e0ee0110e9912e31ba948535e8d248c708909b80cba002192267b not found: ID does not exist" Apr 20 20:09:22.289865 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.289791 2574 scope.go:117] "RemoveContainer" containerID="e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7" Apr 20 20:09:22.290100 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.289991 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7"} err="failed to get container status \"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7\": rpc error: code = NotFound desc = could not find container \"e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7\": container with ID starting with e2f7669f38715560fcd73ee5f3e83fafdc58291c0a985f939f41e1ca6e5e8af7 not found: ID does not exist" Apr 20 20:09:22.290100 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.290017 2574 scope.go:117] "RemoveContainer" containerID="0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b" Apr 20 20:09:22.291928 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.291900 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b"} err="failed to get container status \"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b\": rpc error: code = NotFound desc = could not find container \"0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b\": container with ID starting with 0f84f53cb6eb141dc04a06f28686ebdf13df29df37b1a55f73b91dc333083d6b not found: ID does not exist" Apr 20 20:09:22.291928 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.291928 2574 scope.go:117] "RemoveContainer" containerID="8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e" Apr 20 20:09:22.292109 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.291982 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.292175 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.292149 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e"} err="failed to get container status \"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e\": rpc error: code = NotFound desc = could not find container \"8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e\": container with ID starting with 8b350f28684f54f947f60e7fdd10d5c7f6d23638f5a016e821b89158bfdc7f0e not found: ID does not exist" Apr 20 20:09:22.292232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.292176 2574 scope.go:117] "RemoveContainer" containerID="56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f" Apr 20 20:09:22.292488 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.292457 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f"} err="failed to get container status 
\"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f\": rpc error: code = NotFound desc = could not find container \"56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f\": container with ID starting with 56c49368e04f4a7513ed5801cd4341ed17ba38908e2621bfccbff012f5141f7f not found: ID does not exist" Apr 20 20:09:22.292488 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.292484 2574 scope.go:117] "RemoveContainer" containerID="b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d" Apr 20 20:09:22.292773 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.292746 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d"} err="failed to get container status \"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d\": rpc error: code = NotFound desc = could not find container \"b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d\": container with ID starting with b3e0ae59b6a2973f28c9a7c7dfc410e1a9482d38788aae2191940148dfe3849d not found: ID does not exist" Apr 20 20:09:22.292773 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.292773 2574 scope.go:117] "RemoveContainer" containerID="bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83" Apr 20 20:09:22.292996 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.292979 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83"} err="failed to get container status \"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83\": rpc error: code = NotFound desc = could not find container \"bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83\": container with ID starting with bd0e54223c09d90132829a840e0aa63dcc6fb1e21ec477b2cda1717ae2b57b83 not found: ID does not exist" Apr 20 20:09:22.294643 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:09:22.294621 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 20:09:22.294738 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.294662 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 20:09:22.294738 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.294690 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 20:09:22.294854 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.294764 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 20:09:22.294854 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.294823 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 20:09:22.295195 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.295176 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 20:09:22.295268 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.295192 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 20:09:22.295268 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.295197 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-ei13ab3vfddsj\"" Apr 20 20:09:22.295268 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.295200 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 20:09:22.295508 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.295494 2574 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 20:09:22.296493 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.296155 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 20:09:22.298472 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.298439 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-2qmfn\"" Apr 20 20:09:22.305380 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.305349 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 20:09:22.306336 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.306314 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:09:22.306813 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.306796 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 20:09:22.410490 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410459 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4vg2\" (UniqueName: \"kubernetes.io/projected/6fb70db0-02ba-449a-b860-74bc0fe90c9d-kube-api-access-k4vg2\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.410490 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410497 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.410681 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410517 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.410681 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410574 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fb70db0-02ba-449a-b860-74bc0fe90c9d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.410681 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410628 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.410681 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410649 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.410681 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410664 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fb70db0-02ba-449a-b860-74bc0fe90c9d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.410829 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410692 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fb70db0-02ba-449a-b860-74bc0fe90c9d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.410829 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.410829 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410740 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fb70db0-02ba-449a-b860-74bc0fe90c9d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.410829 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410772 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-config\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.410829 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410809 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fb70db0-02ba-449a-b860-74bc0fe90c9d-config-out\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.410829 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410824 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fb70db0-02ba-449a-b860-74bc0fe90c9d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.411003 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410874 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.411003 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410907 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-web-config\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.411003 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410925 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/6fb70db0-02ba-449a-b860-74bc0fe90c9d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.411003 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410966 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.411003 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.410997 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6fb70db0-02ba-449a-b860-74bc0fe90c9d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.464457 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.464429 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a97961-ac31-4638-8f0a-e43cf467396b" path="/var/lib/kubelet/pods/24a97961-ac31-4638-8f0a-e43cf467396b/volumes" Apr 20 20:09:22.511877 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.511804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fb70db0-02ba-449a-b860-74bc0fe90c9d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.511877 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.511853 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.512088 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.511904 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.512088 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.511930 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fb70db0-02ba-449a-b860-74bc0fe90c9d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.512088 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.511952 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fb70db0-02ba-449a-b860-74bc0fe90c9d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.512088 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.511971 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.512088 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.511988 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fb70db0-02ba-449a-b860-74bc0fe90c9d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.512088 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.512017 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-config\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.512699 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.512670 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fb70db0-02ba-449a-b860-74bc0fe90c9d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.513076 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.513055 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fb70db0-02ba-449a-b860-74bc0fe90c9d-config-out\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.513223 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.513203 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fb70db0-02ba-449a-b860-74bc0fe90c9d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.513351 ip-10-0-131-234 kubenswrapper[2574]: I0420 
20:09:22.513334 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.513463 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.513450 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-web-config\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.514079 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.513546 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6fb70db0-02ba-449a-b860-74bc0fe90c9d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.514079 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.513598 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.514079 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.513628 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6fb70db0-02ba-449a-b860-74bc0fe90c9d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.514079 
ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.513658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4vg2\" (UniqueName: \"kubernetes.io/projected/6fb70db0-02ba-449a-b860-74bc0fe90c9d-kube-api-access-k4vg2\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.514079 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.513687 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.514079 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.513714 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.514079 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.513977 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fb70db0-02ba-449a-b860-74bc0fe90c9d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.514463 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.514274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6fb70db0-02ba-449a-b860-74bc0fe90c9d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.514463 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.513084 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fb70db0-02ba-449a-b860-74bc0fe90c9d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.515488 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.515349 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-config\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.515488 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.515399 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.515488 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.515403 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fb70db0-02ba-449a-b860-74bc0fe90c9d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.515488 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.513063 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fb70db0-02ba-449a-b860-74bc0fe90c9d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.515488 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.515424 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fb70db0-02ba-449a-b860-74bc0fe90c9d-config-out\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.515488 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.515427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.515488 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.515453 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.516793 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.516771 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.517224 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.517200 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.517860 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.517838 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-web-config\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.518084 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.518067 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.518299 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.518277 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6fb70db0-02ba-449a-b860-74bc0fe90c9d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.518475 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.518461 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6fb70db0-02ba-449a-b860-74bc0fe90c9d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.523404 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.523385 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k4vg2\" (UniqueName: \"kubernetes.io/projected/6fb70db0-02ba-449a-b860-74bc0fe90c9d-kube-api-access-k4vg2\") pod \"prometheus-k8s-0\" (UID: \"6fb70db0-02ba-449a-b860-74bc0fe90c9d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.605972 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.605940 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:22.737534 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:22.737286 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:09:22.740057 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:09:22.740007 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fb70db0_02ba_449a_b860_74bc0fe90c9d.slice/crio-66e0c8f12ae24a99b182426709402b6db2c2e83808f7a82034583864bd4860b7 WatchSource:0}: Error finding container 66e0c8f12ae24a99b182426709402b6db2c2e83808f7a82034583864bd4860b7: Status 404 returned error can't find the container with id 66e0c8f12ae24a99b182426709402b6db2c2e83808f7a82034583864bd4860b7 Apr 20 20:09:23.237275 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:23.237189 2574 generic.go:358] "Generic (PLEG): container finished" podID="6fb70db0-02ba-449a-b860-74bc0fe90c9d" containerID="33cd59dd567dde364d0c6d540376dac83ecce5f7e4d32eb0fc4501ea9bcc9107" exitCode=0 Apr 20 20:09:23.237426 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:23.237278 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fb70db0-02ba-449a-b860-74bc0fe90c9d","Type":"ContainerDied","Data":"33cd59dd567dde364d0c6d540376dac83ecce5f7e4d32eb0fc4501ea9bcc9107"} Apr 20 20:09:23.237426 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:23.237314 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"6fb70db0-02ba-449a-b860-74bc0fe90c9d","Type":"ContainerStarted","Data":"66e0c8f12ae24a99b182426709402b6db2c2e83808f7a82034583864bd4860b7"} Apr 20 20:09:24.244605 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:24.244573 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fb70db0-02ba-449a-b860-74bc0fe90c9d","Type":"ContainerStarted","Data":"d8aad5afa88bb719b326da609ec272c6c0afe2e35ae43bbfba151f5211c70471"} Apr 20 20:09:24.244605 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:24.244610 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fb70db0-02ba-449a-b860-74bc0fe90c9d","Type":"ContainerStarted","Data":"7fc85cdc973d7f9d5cd4ac6a4ff86d1b8331adb9445086ccb4bf85f427ee9166"} Apr 20 20:09:24.244990 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:24.244620 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fb70db0-02ba-449a-b860-74bc0fe90c9d","Type":"ContainerStarted","Data":"f0269fd7217f34ab4f030ce59dfe4794e1d3d4c55678092491db9d6a86ce0ca6"} Apr 20 20:09:24.244990 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:24.244629 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fb70db0-02ba-449a-b860-74bc0fe90c9d","Type":"ContainerStarted","Data":"1c189e44e35b5d1662e92b86537548a028188895e9d847781b3d0a6776cfd439"} Apr 20 20:09:24.244990 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:24.244638 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fb70db0-02ba-449a-b860-74bc0fe90c9d","Type":"ContainerStarted","Data":"70131ace55d8c8795f3ffc0845f573ce5a9919762cfeaf9d3b53827d6f0622c6"} Apr 20 20:09:24.244990 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:24.244648 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"6fb70db0-02ba-449a-b860-74bc0fe90c9d","Type":"ContainerStarted","Data":"a46cc3cbfc9f2677d9bae7027ca8b0376dc77f4c7be70f7e3ebab7aeb8cb6019"} Apr 20 20:09:24.274836 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:24.274779 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.27476034 podStartE2EDuration="2.27476034s" podCreationTimestamp="2026-04-20 20:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:09:24.270659062 +0000 UTC m=+246.394352152" watchObservedRunningTime="2026-04-20 20:09:24.27476034 +0000 UTC m=+246.398453432" Apr 20 20:09:27.606489 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:27.606412 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:09:30.185870 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:30.185825 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs\") pod \"network-metrics-daemon-wktd8\" (UID: \"184c92c6-a188-47c2-acbf-e9fe477d6c13\") " pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:09:30.188315 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:30.188294 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184c92c6-a188-47c2-acbf-e9fe477d6c13-metrics-certs\") pod \"network-metrics-daemon-wktd8\" (UID: \"184c92c6-a188-47c2-acbf-e9fe477d6c13\") " pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:09:30.263724 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:30.263694 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jt5w7\"" Apr 20 20:09:30.271692 
ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:30.271664 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wktd8" Apr 20 20:09:30.393289 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:30.393262 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wktd8"] Apr 20 20:09:30.397010 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:09:30.396978 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod184c92c6_a188_47c2_acbf_e9fe477d6c13.slice/crio-04fb2a29aa83ba716246fa668230401b9703c4cbfd59b2853c1c524e3b01d42b WatchSource:0}: Error finding container 04fb2a29aa83ba716246fa668230401b9703c4cbfd59b2853c1c524e3b01d42b: Status 404 returned error can't find the container with id 04fb2a29aa83ba716246fa668230401b9703c4cbfd59b2853c1c524e3b01d42b Apr 20 20:09:31.267699 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:31.267651 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wktd8" event={"ID":"184c92c6-a188-47c2-acbf-e9fe477d6c13","Type":"ContainerStarted","Data":"04fb2a29aa83ba716246fa668230401b9703c4cbfd59b2853c1c524e3b01d42b"} Apr 20 20:09:32.272595 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:32.272550 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wktd8" event={"ID":"184c92c6-a188-47c2-acbf-e9fe477d6c13","Type":"ContainerStarted","Data":"abaeb9fc557705701e5b11ac21890a2d4f25bab6ef63ad787d0305065e59a516"} Apr 20 20:09:32.272595 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:32.272591 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wktd8" event={"ID":"184c92c6-a188-47c2-acbf-e9fe477d6c13","Type":"ContainerStarted","Data":"f0b4b686b2516cc66b7c93f851adc465fd4d216d762acd93c6203c86d6218e16"} Apr 20 20:09:41.493197 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:09:41.493139 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wktd8" podStartSLOduration=262.39446142 podStartE2EDuration="4m23.493121289s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:09:30.398758851 +0000 UTC m=+252.522451917" lastFinishedPulling="2026-04-20 20:09:31.497418719 +0000 UTC m=+253.621111786" observedRunningTime="2026-04-20 20:09:32.292239542 +0000 UTC m=+254.415932631" watchObservedRunningTime="2026-04-20 20:09:41.493121289 +0000 UTC m=+263.616814415" Apr 20 20:09:41.494542 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:41.494518 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-r5zhr"] Apr 20 20:09:41.498073 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:41.498025 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r5zhr" Apr 20 20:09:41.500316 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:41.500297 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 20:09:41.507450 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:41.507426 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-r5zhr"] Apr 20 20:09:41.568721 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:41.568684 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/abe21bdd-fc08-4152-a25c-837a2c251a36-kubelet-config\") pod \"global-pull-secret-syncer-r5zhr\" (UID: \"abe21bdd-fc08-4152-a25c-837a2c251a36\") " pod="kube-system/global-pull-secret-syncer-r5zhr" Apr 20 20:09:41.568882 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:41.568781 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/abe21bdd-fc08-4152-a25c-837a2c251a36-dbus\") pod \"global-pull-secret-syncer-r5zhr\" (UID: \"abe21bdd-fc08-4152-a25c-837a2c251a36\") " pod="kube-system/global-pull-secret-syncer-r5zhr" Apr 20 20:09:41.568882 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:41.568813 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/abe21bdd-fc08-4152-a25c-837a2c251a36-original-pull-secret\") pod \"global-pull-secret-syncer-r5zhr\" (UID: \"abe21bdd-fc08-4152-a25c-837a2c251a36\") " pod="kube-system/global-pull-secret-syncer-r5zhr" Apr 20 20:09:41.669354 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:41.669321 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/abe21bdd-fc08-4152-a25c-837a2c251a36-dbus\") pod \"global-pull-secret-syncer-r5zhr\" (UID: \"abe21bdd-fc08-4152-a25c-837a2c251a36\") " pod="kube-system/global-pull-secret-syncer-r5zhr" Apr 20 20:09:41.669525 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:41.669362 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/abe21bdd-fc08-4152-a25c-837a2c251a36-original-pull-secret\") pod \"global-pull-secret-syncer-r5zhr\" (UID: \"abe21bdd-fc08-4152-a25c-837a2c251a36\") " pod="kube-system/global-pull-secret-syncer-r5zhr" Apr 20 20:09:41.669525 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:41.669392 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/abe21bdd-fc08-4152-a25c-837a2c251a36-kubelet-config\") pod \"global-pull-secret-syncer-r5zhr\" (UID: \"abe21bdd-fc08-4152-a25c-837a2c251a36\") " pod="kube-system/global-pull-secret-syncer-r5zhr" Apr 20 20:09:41.669525 ip-10-0-131-234 kubenswrapper[2574]: I0420 
20:09:41.669457 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/abe21bdd-fc08-4152-a25c-837a2c251a36-kubelet-config\") pod \"global-pull-secret-syncer-r5zhr\" (UID: \"abe21bdd-fc08-4152-a25c-837a2c251a36\") " pod="kube-system/global-pull-secret-syncer-r5zhr" Apr 20 20:09:41.669648 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:41.669530 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/abe21bdd-fc08-4152-a25c-837a2c251a36-dbus\") pod \"global-pull-secret-syncer-r5zhr\" (UID: \"abe21bdd-fc08-4152-a25c-837a2c251a36\") " pod="kube-system/global-pull-secret-syncer-r5zhr" Apr 20 20:09:41.671781 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:41.671754 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/abe21bdd-fc08-4152-a25c-837a2c251a36-original-pull-secret\") pod \"global-pull-secret-syncer-r5zhr\" (UID: \"abe21bdd-fc08-4152-a25c-837a2c251a36\") " pod="kube-system/global-pull-secret-syncer-r5zhr" Apr 20 20:09:41.807446 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:41.807416 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-r5zhr" Apr 20 20:09:41.951491 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:41.951459 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-r5zhr"] Apr 20 20:09:41.954539 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:09:41.954514 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabe21bdd_fc08_4152_a25c_837a2c251a36.slice/crio-8a1d1e92358ae639b39b82f7511370048d8631652696b79a3f65fa7331587e7f WatchSource:0}: Error finding container 8a1d1e92358ae639b39b82f7511370048d8631652696b79a3f65fa7331587e7f: Status 404 returned error can't find the container with id 8a1d1e92358ae639b39b82f7511370048d8631652696b79a3f65fa7331587e7f Apr 20 20:09:42.303072 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:42.303015 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-r5zhr" event={"ID":"abe21bdd-fc08-4152-a25c-837a2c251a36","Type":"ContainerStarted","Data":"8a1d1e92358ae639b39b82f7511370048d8631652696b79a3f65fa7331587e7f"} Apr 20 20:09:46.316754 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:46.316717 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-r5zhr" event={"ID":"abe21bdd-fc08-4152-a25c-837a2c251a36","Type":"ContainerStarted","Data":"9b2fd59217cfaa425074e95a43afb613976a07a491398ec60c92622199760edc"} Apr 20 20:09:46.332388 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:09:46.332338 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-r5zhr" podStartSLOduration=1.459576311 podStartE2EDuration="5.33232466s" podCreationTimestamp="2026-04-20 20:09:41 +0000 UTC" firstStartedPulling="2026-04-20 20:09:41.956660517 +0000 UTC m=+264.080353592" lastFinishedPulling="2026-04-20 20:09:45.829408875 +0000 UTC m=+267.953101941" 
observedRunningTime="2026-04-20 20:09:46.330807279 +0000 UTC m=+268.454500370" watchObservedRunningTime="2026-04-20 20:09:46.33232466 +0000 UTC m=+268.456017747"
Apr 20 20:10:18.328768 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:10:18.328740 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log"
Apr 20 20:10:18.329303 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:10:18.329011 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log"
Apr 20 20:10:18.334383 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:10:18.334360 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log"
Apr 20 20:10:18.334930 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:10:18.334901 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log"
Apr 20 20:10:18.339945 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:10:18.339928 2574 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 20:10:22.606492 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:10:22.606457 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:10:22.622532 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:10:22.622503 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:10:23.443253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:10:23.443227 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:13:09.945985 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:09.945951 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-vsmvr"]
Apr 20 20:13:09.949019 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:09.949002 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-vsmvr"
Apr 20 20:13:09.951385 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:09.951359 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 20 20:13:09.951492 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:09.951360 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 20 20:13:09.951492 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:09.951359 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-8lht9\""
Apr 20 20:13:09.952132 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:09.952118 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 20 20:13:09.957703 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:09.957679 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-vsmvr"]
Apr 20 20:13:10.012571 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:10.012535 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j42qc\" (UniqueName: \"kubernetes.io/projected/821b92e8-06a5-445e-8eef-c473c8b4846d-kube-api-access-j42qc\") pod \"s3-init-vsmvr\" (UID: \"821b92e8-06a5-445e-8eef-c473c8b4846d\") " pod="kserve/s3-init-vsmvr"
Apr 20 20:13:10.113512 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:10.113480 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j42qc\" (UniqueName: \"kubernetes.io/projected/821b92e8-06a5-445e-8eef-c473c8b4846d-kube-api-access-j42qc\") pod \"s3-init-vsmvr\" (UID: \"821b92e8-06a5-445e-8eef-c473c8b4846d\") " pod="kserve/s3-init-vsmvr"
Apr 20 20:13:10.122694 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:10.122669 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j42qc\" (UniqueName: \"kubernetes.io/projected/821b92e8-06a5-445e-8eef-c473c8b4846d-kube-api-access-j42qc\") pod \"s3-init-vsmvr\" (UID: \"821b92e8-06a5-445e-8eef-c473c8b4846d\") " pod="kserve/s3-init-vsmvr"
Apr 20 20:13:10.267185 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:10.267154 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-vsmvr"
Apr 20 20:13:10.393020 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:10.392996 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-vsmvr"]
Apr 20 20:13:10.395677 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:13:10.395645 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod821b92e8_06a5_445e_8eef_c473c8b4846d.slice/crio-7e1b30fbb8f5e3843ac14f590e9331efb05d2235e4edd45d6041f940c6d2fdc8 WatchSource:0}: Error finding container 7e1b30fbb8f5e3843ac14f590e9331efb05d2235e4edd45d6041f940c6d2fdc8: Status 404 returned error can't find the container with id 7e1b30fbb8f5e3843ac14f590e9331efb05d2235e4edd45d6041f940c6d2fdc8
Apr 20 20:13:10.397392 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:10.397376 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:13:10.902350 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:10.902296 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-vsmvr" event={"ID":"821b92e8-06a5-445e-8eef-c473c8b4846d","Type":"ContainerStarted","Data":"7e1b30fbb8f5e3843ac14f590e9331efb05d2235e4edd45d6041f940c6d2fdc8"}
Apr 20 20:13:15.919801 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:15.919764 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-vsmvr" event={"ID":"821b92e8-06a5-445e-8eef-c473c8b4846d","Type":"ContainerStarted","Data":"e16c27fd4a9b7f733868e83a381e7c84dc4d072f720104752bb0c2f3e7765a4c"}
Apr 20 20:13:15.935533 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:15.935473 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-vsmvr" podStartSLOduration=2.208896095 podStartE2EDuration="6.935458335s" podCreationTimestamp="2026-04-20 20:13:09 +0000 UTC" firstStartedPulling="2026-04-20 20:13:10.397567102 +0000 UTC m=+472.521260171" lastFinishedPulling="2026-04-20 20:13:15.124129341 +0000 UTC m=+477.247822411" observedRunningTime="2026-04-20 20:13:15.934263327 +0000 UTC m=+478.057956416" watchObservedRunningTime="2026-04-20 20:13:15.935458335 +0000 UTC m=+478.059151424"
Apr 20 20:13:18.929720 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:18.929680 2574 generic.go:358] "Generic (PLEG): container finished" podID="821b92e8-06a5-445e-8eef-c473c8b4846d" containerID="e16c27fd4a9b7f733868e83a381e7c84dc4d072f720104752bb0c2f3e7765a4c" exitCode=0
Apr 20 20:13:18.930109 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:18.929748 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-vsmvr" event={"ID":"821b92e8-06a5-445e-8eef-c473c8b4846d","Type":"ContainerDied","Data":"e16c27fd4a9b7f733868e83a381e7c84dc4d072f720104752bb0c2f3e7765a4c"}
Apr 20 20:13:20.053901 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:20.053880 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-vsmvr"
Apr 20 20:13:20.095018 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:20.094990 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j42qc\" (UniqueName: \"kubernetes.io/projected/821b92e8-06a5-445e-8eef-c473c8b4846d-kube-api-access-j42qc\") pod \"821b92e8-06a5-445e-8eef-c473c8b4846d\" (UID: \"821b92e8-06a5-445e-8eef-c473c8b4846d\") "
Apr 20 20:13:20.097219 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:20.097190 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/821b92e8-06a5-445e-8eef-c473c8b4846d-kube-api-access-j42qc" (OuterVolumeSpecName: "kube-api-access-j42qc") pod "821b92e8-06a5-445e-8eef-c473c8b4846d" (UID: "821b92e8-06a5-445e-8eef-c473c8b4846d"). InnerVolumeSpecName "kube-api-access-j42qc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:13:20.196022 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:20.195951 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j42qc\" (UniqueName: \"kubernetes.io/projected/821b92e8-06a5-445e-8eef-c473c8b4846d-kube-api-access-j42qc\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\""
Apr 20 20:13:20.937347 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:20.937312 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-vsmvr" event={"ID":"821b92e8-06a5-445e-8eef-c473c8b4846d","Type":"ContainerDied","Data":"7e1b30fbb8f5e3843ac14f590e9331efb05d2235e4edd45d6041f940c6d2fdc8"}
Apr 20 20:13:20.937347 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:20.937347 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e1b30fbb8f5e3843ac14f590e9331efb05d2235e4edd45d6041f940c6d2fdc8"
Apr 20 20:13:20.937544 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:20.937345 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-vsmvr"
Apr 20 20:13:30.262778 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.262738 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"]
Apr 20 20:13:30.263424 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.263110 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="821b92e8-06a5-445e-8eef-c473c8b4846d" containerName="s3-init"
Apr 20 20:13:30.263424 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.263121 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="821b92e8-06a5-445e-8eef-c473c8b4846d" containerName="s3-init"
Apr 20 20:13:30.263424 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.263190 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="821b92e8-06a5-445e-8eef-c473c8b4846d" containerName="s3-init"
Apr 20 20:13:30.266255 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.266238 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:13:30.268463 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.268427 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-predictor-serving-cert\""
Apr 20 20:13:30.268599 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.268432 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-kube-rbac-proxy-sar-config\""
Apr 20 20:13:30.268668 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.268604 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lqk4h\""
Apr 20 20:13:30.268668 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.268603 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 20 20:13:30.269239 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.269218 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 20 20:13:30.276029 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.276005 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"]
Apr 20 20:13:30.376964 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.376936 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e927b850-b5b5-4cea-862e-4e4d1d93c247-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-52hdf\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:13:30.377143 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.376972 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt4kn\" (UniqueName: \"kubernetes.io/projected/e927b850-b5b5-4cea-862e-4e4d1d93c247-kube-api-access-pt4kn\") pod \"isvc-xgboost-graph-predictor-669d8d6456-52hdf\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:13:30.377143 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.377004 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e927b850-b5b5-4cea-862e-4e4d1d93c247-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-52hdf\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:13:30.377143 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.377074 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e927b850-b5b5-4cea-862e-4e4d1d93c247-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-52hdf\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:13:30.478028 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.477996 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e927b850-b5b5-4cea-862e-4e4d1d93c247-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-52hdf\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:13:30.478216 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.478085 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e927b850-b5b5-4cea-862e-4e4d1d93c247-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-52hdf\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:13:30.478216 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.478124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pt4kn\" (UniqueName: \"kubernetes.io/projected/e927b850-b5b5-4cea-862e-4e4d1d93c247-kube-api-access-pt4kn\") pod \"isvc-xgboost-graph-predictor-669d8d6456-52hdf\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:13:30.478216 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:13:30.478162 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-predictor-serving-cert: secret "isvc-xgboost-graph-predictor-serving-cert" not found
Apr 20 20:13:30.478379 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.478165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e927b850-b5b5-4cea-862e-4e4d1d93c247-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-52hdf\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:13:30.478379 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:13:30.478246 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e927b850-b5b5-4cea-862e-4e4d1d93c247-proxy-tls podName:e927b850-b5b5-4cea-862e-4e4d1d93c247 nodeName:}" failed. No retries permitted until 2026-04-20 20:13:30.978224222 +0000 UTC m=+493.101917287 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e927b850-b5b5-4cea-862e-4e4d1d93c247-proxy-tls") pod "isvc-xgboost-graph-predictor-669d8d6456-52hdf" (UID: "e927b850-b5b5-4cea-862e-4e4d1d93c247") : secret "isvc-xgboost-graph-predictor-serving-cert" not found
Apr 20 20:13:30.478743 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.478721 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e927b850-b5b5-4cea-862e-4e4d1d93c247-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-52hdf\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:13:30.478822 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.478804 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e927b850-b5b5-4cea-862e-4e4d1d93c247-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-52hdf\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:13:30.487085 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.487062 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt4kn\" (UniqueName: \"kubernetes.io/projected/e927b850-b5b5-4cea-862e-4e4d1d93c247-kube-api-access-pt4kn\") pod \"isvc-xgboost-graph-predictor-669d8d6456-52hdf\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:13:30.983904 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.983871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e927b850-b5b5-4cea-862e-4e4d1d93c247-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-52hdf\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:13:30.986351 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:30.986332 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e927b850-b5b5-4cea-862e-4e4d1d93c247-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-52hdf\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:13:31.179421 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:31.179386 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:13:31.305275 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:31.305248 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"]
Apr 20 20:13:31.307653 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:13:31.307620 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode927b850_b5b5_4cea_862e_4e4d1d93c247.slice/crio-3bfc8dd734acd010fcd9b462298b1a42c42d1c899a566bac084122572d494f13 WatchSource:0}: Error finding container 3bfc8dd734acd010fcd9b462298b1a42c42d1c899a566bac084122572d494f13: Status 404 returned error can't find the container with id 3bfc8dd734acd010fcd9b462298b1a42c42d1c899a566bac084122572d494f13
Apr 20 20:13:31.972007 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:31.971962 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" event={"ID":"e927b850-b5b5-4cea-862e-4e4d1d93c247","Type":"ContainerStarted","Data":"3bfc8dd734acd010fcd9b462298b1a42c42d1c899a566bac084122572d494f13"}
Apr 20 20:13:35.985749 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:35.985703 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" event={"ID":"e927b850-b5b5-4cea-862e-4e4d1d93c247","Type":"ContainerStarted","Data":"0f7f58564f91812118fe1231dd81380c40357171393df704d26c4bd25cfbe96e"}
Apr 20 20:13:39.998694 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:39.998658 2574 generic.go:358] "Generic (PLEG): container finished" podID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerID="0f7f58564f91812118fe1231dd81380c40357171393df704d26c4bd25cfbe96e" exitCode=0
Apr 20 20:13:39.999095 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:13:39.998717 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" event={"ID":"e927b850-b5b5-4cea-862e-4e4d1d93c247","Type":"ContainerDied","Data":"0f7f58564f91812118fe1231dd81380c40357171393df704d26c4bd25cfbe96e"}
Apr 20 20:14:07.086100 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:07.086030 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" event={"ID":"e927b850-b5b5-4cea-862e-4e4d1d93c247","Type":"ContainerStarted","Data":"95053ccab1c65877ce9e61d438738e569fa120a740ddc17867e2d3eeaaf2a22c"}
Apr 20 20:14:09.094310 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:09.094277 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" event={"ID":"e927b850-b5b5-4cea-862e-4e4d1d93c247","Type":"ContainerStarted","Data":"4b89cb8ddc30ead78576d6260867f50f6b52aa7f2a1574e327ae266d55cb3baf"}
Apr 20 20:14:09.094700 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:09.094460 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:14:09.094700 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:09.094594 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:14:09.095853 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:09.095821 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 20 20:14:09.113122 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:09.113073 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" podStartSLOduration=2.064066472 podStartE2EDuration="39.113056793s" podCreationTimestamp="2026-04-20 20:13:30 +0000 UTC" firstStartedPulling="2026-04-20 20:13:31.309522912 +0000 UTC m=+493.433215979" lastFinishedPulling="2026-04-20 20:14:08.358513235 +0000 UTC m=+530.482206300" observedRunningTime="2026-04-20 20:14:09.111729542 +0000 UTC m=+531.235422642" watchObservedRunningTime="2026-04-20 20:14:09.113056793 +0000 UTC m=+531.236749875"
Apr 20 20:14:10.097797 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:10.097755 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 20 20:14:15.101710 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:15.101680 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:14:15.102149 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:15.102118 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 20 20:14:25.102320 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:25.102276 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 20 20:14:35.102326 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:35.102285 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 20 20:14:45.102786 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:45.102747 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 20 20:14:49.973643 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:49.973606 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"]
Apr 20 20:14:50.007640 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:50.007608 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"]
Apr 20 20:14:50.007796 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:50.007729 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"
Apr 20 20:14:50.014832 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:50.014809 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-127f1-kube-rbac-proxy-sar-config\""
Apr 20 20:14:50.014947 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:50.014858 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-127f1-serving-cert\""
Apr 20 20:14:50.176524 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:50.176487 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5bf96b6-b305-4d4a-a71d-28594dd96238-openshift-service-ca-bundle\") pod \"switch-graph-127f1-5bd7f94589-flj29\" (UID: \"d5bf96b6-b305-4d4a-a71d-28594dd96238\") " pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"
Apr 20 20:14:50.176699 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:50.176548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5bf96b6-b305-4d4a-a71d-28594dd96238-proxy-tls\") pod \"switch-graph-127f1-5bd7f94589-flj29\" (UID: \"d5bf96b6-b305-4d4a-a71d-28594dd96238\") " pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"
Apr 20 20:14:50.277747 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:50.277697 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5bf96b6-b305-4d4a-a71d-28594dd96238-openshift-service-ca-bundle\") pod \"switch-graph-127f1-5bd7f94589-flj29\" (UID: \"d5bf96b6-b305-4d4a-a71d-28594dd96238\") " pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"
Apr 20 20:14:50.277909 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:50.277776 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5bf96b6-b305-4d4a-a71d-28594dd96238-proxy-tls\") pod \"switch-graph-127f1-5bd7f94589-flj29\" (UID: \"d5bf96b6-b305-4d4a-a71d-28594dd96238\") " pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"
Apr 20 20:14:50.278367 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:50.278346 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5bf96b6-b305-4d4a-a71d-28594dd96238-openshift-service-ca-bundle\") pod \"switch-graph-127f1-5bd7f94589-flj29\" (UID: \"d5bf96b6-b305-4d4a-a71d-28594dd96238\") " pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"
Apr 20 20:14:50.280296 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:50.280279 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5bf96b6-b305-4d4a-a71d-28594dd96238-proxy-tls\") pod \"switch-graph-127f1-5bd7f94589-flj29\" (UID: \"d5bf96b6-b305-4d4a-a71d-28594dd96238\") " pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"
Apr 20 20:14:50.318267 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:50.318242 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"
Apr 20 20:14:50.440412 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:50.440379 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"]
Apr 20 20:14:50.443880 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:14:50.443847 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5bf96b6_b305_4d4a_a71d_28594dd96238.slice/crio-38f3fa61d6d057a5477d7c01971b520c73bfad35f0b837153fa9fd8b759c8c7b WatchSource:0}: Error finding container 38f3fa61d6d057a5477d7c01971b520c73bfad35f0b837153fa9fd8b759c8c7b: Status 404 returned error can't find the container with id 38f3fa61d6d057a5477d7c01971b520c73bfad35f0b837153fa9fd8b759c8c7b
Apr 20 20:14:51.227482 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:51.227448 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29" event={"ID":"d5bf96b6-b305-4d4a-a71d-28594dd96238","Type":"ContainerStarted","Data":"38f3fa61d6d057a5477d7c01971b520c73bfad35f0b837153fa9fd8b759c8c7b"}
Apr 20 20:14:53.236482 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:53.236444 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29" event={"ID":"d5bf96b6-b305-4d4a-a71d-28594dd96238","Type":"ContainerStarted","Data":"e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c"}
Apr 20 20:14:53.236883 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:53.236592 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"
Apr 20 20:14:53.253803 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:53.253749 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29" podStartSLOduration=2.249889656 podStartE2EDuration="4.253736064s" podCreationTimestamp="2026-04-20 20:14:49 +0000 UTC" firstStartedPulling="2026-04-20 20:14:50.445772431 +0000 UTC m=+572.569465497" lastFinishedPulling="2026-04-20 20:14:52.449618825 +0000 UTC m=+574.573311905" observedRunningTime="2026-04-20 20:14:53.251831201 +0000 UTC m=+575.375524291" watchObservedRunningTime="2026-04-20 20:14:53.253736064 +0000 UTC m=+575.377429151"
Apr 20 20:14:55.102661 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:55.102614 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 20 20:14:59.245248 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:14:59.245220 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"
Apr 20 20:15:00.156647 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:00.156612 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"]
Apr 20 20:15:00.156902 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:00.156857 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29" podUID="d5bf96b6-b305-4d4a-a71d-28594dd96238" containerName="switch-graph-127f1" containerID="cri-o://e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c" gracePeriod=30
Apr 20 20:15:04.243906 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:04.243864 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29" podUID="d5bf96b6-b305-4d4a-a71d-28594dd96238" containerName="switch-graph-127f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:15:05.103211 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:05.103176 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"
Apr 20 20:15:09.243839 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:09.243803 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29" podUID="d5bf96b6-b305-4d4a-a71d-28594dd96238" containerName="switch-graph-127f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:15:14.244192 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:14.244146 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29" podUID="d5bf96b6-b305-4d4a-a71d-28594dd96238" containerName="switch-graph-127f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:15:14.244583 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:14.244255 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"
Apr 20 20:15:18.359258 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:18.359227 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log"
Apr 20 20:15:18.360451 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:18.360427 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log"
Apr 20 20:15:18.364964 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:18.364945 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log"
Apr 20 20:15:18.365784 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:18.365767 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log"
Apr 20 20:15:19.243783 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:19.243745 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29" podUID="d5bf96b6-b305-4d4a-a71d-28594dd96238" containerName="switch-graph-127f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:15:24.243568 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:24.243530 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29" podUID="d5bf96b6-b305-4d4a-a71d-28594dd96238" containerName="switch-graph-127f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:15:29.243578 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:29.243487 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29" podUID="d5bf96b6-b305-4d4a-a71d-28594dd96238" containerName="switch-graph-127f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:15:29.945883 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:29.945842 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-76487779f8-46tnx"]
Apr 20 20:15:29.947902 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:29.947885 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx"
Apr 20 20:15:29.950176 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:29.950148 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\""
Apr 20 20:15:29.950370 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:29.950358 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\""
Apr 20 20:15:29.957949 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:29.957919 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-76487779f8-46tnx"]
Apr 20 20:15:30.024426 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.024392 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddf372af-98f8-41fc-a69c-f432c159fdbe-openshift-service-ca-bundle\") pod \"model-chainer-76487779f8-46tnx\" (UID: \"ddf372af-98f8-41fc-a69c-f432c159fdbe\") " pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx"
Apr 20 20:15:30.024596 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.024455 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddf372af-98f8-41fc-a69c-f432c159fdbe-proxy-tls\") pod \"model-chainer-76487779f8-46tnx\" (UID: \"ddf372af-98f8-41fc-a69c-f432c159fdbe\") " pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx"
Apr 20 20:15:30.125272 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.125234 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddf372af-98f8-41fc-a69c-f432c159fdbe-openshift-service-ca-bundle\") pod \"model-chainer-76487779f8-46tnx\" (UID: \"ddf372af-98f8-41fc-a69c-f432c159fdbe\") " pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx"
Apr 20 20:15:30.125426 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.125303 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddf372af-98f8-41fc-a69c-f432c159fdbe-proxy-tls\") pod \"model-chainer-76487779f8-46tnx\" (UID: \"ddf372af-98f8-41fc-a69c-f432c159fdbe\") " pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx"
Apr 20 20:15:30.125480 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:15:30.125446 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-serving-cert: secret "model-chainer-serving-cert" not found
Apr 20 20:15:30.125545 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:15:30.125532 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddf372af-98f8-41fc-a69c-f432c159fdbe-proxy-tls podName:ddf372af-98f8-41fc-a69c-f432c159fdbe nodeName:}" failed. No retries permitted until 2026-04-20 20:15:30.625511611 +0000 UTC m=+612.749204684 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ddf372af-98f8-41fc-a69c-f432c159fdbe-proxy-tls") pod "model-chainer-76487779f8-46tnx" (UID: "ddf372af-98f8-41fc-a69c-f432c159fdbe") : secret "model-chainer-serving-cert" not found Apr 20 20:15:30.125961 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.125943 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddf372af-98f8-41fc-a69c-f432c159fdbe-openshift-service-ca-bundle\") pod \"model-chainer-76487779f8-46tnx\" (UID: \"ddf372af-98f8-41fc-a69c-f432c159fdbe\") " pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" Apr 20 20:15:30.188065 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:15:30.187857 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5bf96b6_b305_4d4a_a71d_28594dd96238.slice/crio-e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5bf96b6_b305_4d4a_a71d_28594dd96238.slice/crio-conmon-e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c.scope\": RecentStats: unable to find data in memory cache]" Apr 20 20:15:30.188065 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:15:30.187995 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5bf96b6_b305_4d4a_a71d_28594dd96238.slice/crio-conmon-e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c.scope\": RecentStats: unable to find data in memory cache]" Apr 20 20:15:30.301421 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.301398 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29" Apr 20 20:15:30.326310 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.326279 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5bf96b6-b305-4d4a-a71d-28594dd96238-openshift-service-ca-bundle\") pod \"d5bf96b6-b305-4d4a-a71d-28594dd96238\" (UID: \"d5bf96b6-b305-4d4a-a71d-28594dd96238\") " Apr 20 20:15:30.326456 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.326382 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5bf96b6-b305-4d4a-a71d-28594dd96238-proxy-tls\") pod \"d5bf96b6-b305-4d4a-a71d-28594dd96238\" (UID: \"d5bf96b6-b305-4d4a-a71d-28594dd96238\") " Apr 20 20:15:30.326640 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.326617 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5bf96b6-b305-4d4a-a71d-28594dd96238-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d5bf96b6-b305-4d4a-a71d-28594dd96238" (UID: "d5bf96b6-b305-4d4a-a71d-28594dd96238"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:15:30.328615 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.328594 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5bf96b6-b305-4d4a-a71d-28594dd96238-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d5bf96b6-b305-4d4a-a71d-28594dd96238" (UID: "d5bf96b6-b305-4d4a-a71d-28594dd96238"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:30.351375 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.351343 2574 generic.go:358] "Generic (PLEG): container finished" podID="d5bf96b6-b305-4d4a-a71d-28594dd96238" containerID="e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c" exitCode=0 Apr 20 20:15:30.351474 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.351408 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29" Apr 20 20:15:30.351474 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.351432 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29" event={"ID":"d5bf96b6-b305-4d4a-a71d-28594dd96238","Type":"ContainerDied","Data":"e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c"} Apr 20 20:15:30.351474 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.351468 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29" event={"ID":"d5bf96b6-b305-4d4a-a71d-28594dd96238","Type":"ContainerDied","Data":"38f3fa61d6d057a5477d7c01971b520c73bfad35f0b837153fa9fd8b759c8c7b"} Apr 20 20:15:30.351572 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.351483 2574 scope.go:117] "RemoveContainer" containerID="e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c" Apr 20 20:15:30.360226 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.360203 2574 scope.go:117] "RemoveContainer" containerID="e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c" Apr 20 20:15:30.360500 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:15:30.360481 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c\": container with ID starting with 
e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c not found: ID does not exist" containerID="e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c" Apr 20 20:15:30.360557 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.360509 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c"} err="failed to get container status \"e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c\": rpc error: code = NotFound desc = could not find container \"e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c\": container with ID starting with e519d58da72688376943eb02061324ab7812ce1559fb6ac705a0512b6872354c not found: ID does not exist" Apr 20 20:15:30.371772 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.371749 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"] Apr 20 20:15:30.374551 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.374529 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-127f1-5bd7f94589-flj29"] Apr 20 20:15:30.427467 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.427438 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5bf96b6-b305-4d4a-a71d-28594dd96238-proxy-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:15:30.427856 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.427473 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5bf96b6-b305-4d4a-a71d-28594dd96238-openshift-service-ca-bundle\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:15:30.465626 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.465590 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d5bf96b6-b305-4d4a-a71d-28594dd96238" path="/var/lib/kubelet/pods/d5bf96b6-b305-4d4a-a71d-28594dd96238/volumes" Apr 20 20:15:30.629917 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.629863 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddf372af-98f8-41fc-a69c-f432c159fdbe-proxy-tls\") pod \"model-chainer-76487779f8-46tnx\" (UID: \"ddf372af-98f8-41fc-a69c-f432c159fdbe\") " pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" Apr 20 20:15:30.632388 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.632360 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddf372af-98f8-41fc-a69c-f432c159fdbe-proxy-tls\") pod \"model-chainer-76487779f8-46tnx\" (UID: \"ddf372af-98f8-41fc-a69c-f432c159fdbe\") " pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" Apr 20 20:15:30.859027 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.858982 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" Apr 20 20:15:30.977993 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:30.977961 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-76487779f8-46tnx"] Apr 20 20:15:30.980646 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:15:30.980606 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddf372af_98f8_41fc_a69c_f432c159fdbe.slice/crio-45c4a7faf77ed803c41ae11ac0ec0ceeabc28aeb227f3fa15aaa705ea684ed7c WatchSource:0}: Error finding container 45c4a7faf77ed803c41ae11ac0ec0ceeabc28aeb227f3fa15aaa705ea684ed7c: Status 404 returned error can't find the container with id 45c4a7faf77ed803c41ae11ac0ec0ceeabc28aeb227f3fa15aaa705ea684ed7c Apr 20 20:15:31.357152 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:31.357116 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" event={"ID":"ddf372af-98f8-41fc-a69c-f432c159fdbe","Type":"ContainerStarted","Data":"632175ac53a98efb1cb42bae4ad207c58827eb2503ffa5402b7918168d44c82d"} Apr 20 20:15:31.357152 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:31.357156 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" event={"ID":"ddf372af-98f8-41fc-a69c-f432c159fdbe","Type":"ContainerStarted","Data":"45c4a7faf77ed803c41ae11ac0ec0ceeabc28aeb227f3fa15aaa705ea684ed7c"} Apr 20 20:15:31.357654 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:31.357252 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" Apr 20 20:15:31.374023 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:31.373971 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" podStartSLOduration=2.37395352 
podStartE2EDuration="2.37395352s" podCreationTimestamp="2026-04-20 20:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:15:31.372585028 +0000 UTC m=+613.496278116" watchObservedRunningTime="2026-04-20 20:15:31.37395352 +0000 UTC m=+613.497646610" Apr 20 20:15:37.365685 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:37.365653 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" Apr 20 20:15:40.137639 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:40.137606 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-76487779f8-46tnx"] Apr 20 20:15:40.138058 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:40.137833 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" podUID="ddf372af-98f8-41fc-a69c-f432c159fdbe" containerName="model-chainer" containerID="cri-o://632175ac53a98efb1cb42bae4ad207c58827eb2503ffa5402b7918168d44c82d" gracePeriod=30 Apr 20 20:15:40.596555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:40.596520 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"] Apr 20 20:15:40.596932 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:40.596903 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="kserve-container" containerID="cri-o://95053ccab1c65877ce9e61d438738e569fa120a740ddc17867e2d3eeaaf2a22c" gracePeriod=30 Apr 20 20:15:40.597079 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:40.596917 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" 
podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="kube-rbac-proxy" containerID="cri-o://4b89cb8ddc30ead78576d6260867f50f6b52aa7f2a1574e327ae266d55cb3baf" gracePeriod=30 Apr 20 20:15:41.388352 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:41.388316 2574 generic.go:358] "Generic (PLEG): container finished" podID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerID="4b89cb8ddc30ead78576d6260867f50f6b52aa7f2a1574e327ae266d55cb3baf" exitCode=2 Apr 20 20:15:41.388700 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:41.388361 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" event={"ID":"e927b850-b5b5-4cea-862e-4e4d1d93c247","Type":"ContainerDied","Data":"4b89cb8ddc30ead78576d6260867f50f6b52aa7f2a1574e327ae266d55cb3baf"} Apr 20 20:15:42.364774 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:42.364735 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" podUID="ddf372af-98f8-41fc-a69c-f432c159fdbe" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:15:44.136102 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.136074 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" Apr 20 20:15:44.227397 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.227320 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt4kn\" (UniqueName: \"kubernetes.io/projected/e927b850-b5b5-4cea-862e-4e4d1d93c247-kube-api-access-pt4kn\") pod \"e927b850-b5b5-4cea-862e-4e4d1d93c247\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " Apr 20 20:15:44.227397 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.227366 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e927b850-b5b5-4cea-862e-4e4d1d93c247-proxy-tls\") pod \"e927b850-b5b5-4cea-862e-4e4d1d93c247\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " Apr 20 20:15:44.227614 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.227398 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e927b850-b5b5-4cea-862e-4e4d1d93c247-kserve-provision-location\") pod \"e927b850-b5b5-4cea-862e-4e4d1d93c247\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " Apr 20 20:15:44.227614 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.227481 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e927b850-b5b5-4cea-862e-4e4d1d93c247-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"e927b850-b5b5-4cea-862e-4e4d1d93c247\" (UID: \"e927b850-b5b5-4cea-862e-4e4d1d93c247\") " Apr 20 20:15:44.227749 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.227699 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e927b850-b5b5-4cea-862e-4e4d1d93c247-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"e927b850-b5b5-4cea-862e-4e4d1d93c247" (UID: "e927b850-b5b5-4cea-862e-4e4d1d93c247"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:15:44.227887 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.227860 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e927b850-b5b5-4cea-862e-4e4d1d93c247-isvc-xgboost-graph-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-kube-rbac-proxy-sar-config") pod "e927b850-b5b5-4cea-862e-4e4d1d93c247" (UID: "e927b850-b5b5-4cea-862e-4e4d1d93c247"). InnerVolumeSpecName "isvc-xgboost-graph-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:15:44.229567 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.229536 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e927b850-b5b5-4cea-862e-4e4d1d93c247-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e927b850-b5b5-4cea-862e-4e4d1d93c247" (UID: "e927b850-b5b5-4cea-862e-4e4d1d93c247"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:44.229677 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.229595 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e927b850-b5b5-4cea-862e-4e4d1d93c247-kube-api-access-pt4kn" (OuterVolumeSpecName: "kube-api-access-pt4kn") pod "e927b850-b5b5-4cea-862e-4e4d1d93c247" (UID: "e927b850-b5b5-4cea-862e-4e4d1d93c247"). InnerVolumeSpecName "kube-api-access-pt4kn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:15:44.328796 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.328759 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e927b850-b5b5-4cea-862e-4e4d1d93c247-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:15:44.328796 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.328793 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pt4kn\" (UniqueName: \"kubernetes.io/projected/e927b850-b5b5-4cea-862e-4e4d1d93c247-kube-api-access-pt4kn\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:15:44.328972 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.328808 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e927b850-b5b5-4cea-862e-4e4d1d93c247-proxy-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:15:44.328972 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.328821 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e927b850-b5b5-4cea-862e-4e4d1d93c247-kserve-provision-location\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:15:44.399763 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.399724 2574 generic.go:358] "Generic (PLEG): container finished" podID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerID="95053ccab1c65877ce9e61d438738e569fa120a740ddc17867e2d3eeaaf2a22c" exitCode=0 Apr 20 20:15:44.399897 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.399828 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" 
event={"ID":"e927b850-b5b5-4cea-862e-4e4d1d93c247","Type":"ContainerDied","Data":"95053ccab1c65877ce9e61d438738e569fa120a740ddc17867e2d3eeaaf2a22c"} Apr 20 20:15:44.399897 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.399853 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" event={"ID":"e927b850-b5b5-4cea-862e-4e4d1d93c247","Type":"ContainerDied","Data":"3bfc8dd734acd010fcd9b462298b1a42c42d1c899a566bac084122572d494f13"} Apr 20 20:15:44.399897 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.399860 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf" Apr 20 20:15:44.399996 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.399868 2574 scope.go:117] "RemoveContainer" containerID="4b89cb8ddc30ead78576d6260867f50f6b52aa7f2a1574e327ae266d55cb3baf" Apr 20 20:15:44.408527 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.408506 2574 scope.go:117] "RemoveContainer" containerID="95053ccab1c65877ce9e61d438738e569fa120a740ddc17867e2d3eeaaf2a22c" Apr 20 20:15:44.415628 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.415611 2574 scope.go:117] "RemoveContainer" containerID="0f7f58564f91812118fe1231dd81380c40357171393df704d26c4bd25cfbe96e" Apr 20 20:15:44.422569 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.422552 2574 scope.go:117] "RemoveContainer" containerID="4b89cb8ddc30ead78576d6260867f50f6b52aa7f2a1574e327ae266d55cb3baf" Apr 20 20:15:44.422728 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.422709 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"] Apr 20 20:15:44.422828 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:15:44.422809 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4b89cb8ddc30ead78576d6260867f50f6b52aa7f2a1574e327ae266d55cb3baf\": container with ID starting with 4b89cb8ddc30ead78576d6260867f50f6b52aa7f2a1574e327ae266d55cb3baf not found: ID does not exist" containerID="4b89cb8ddc30ead78576d6260867f50f6b52aa7f2a1574e327ae266d55cb3baf" Apr 20 20:15:44.422887 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.422838 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b89cb8ddc30ead78576d6260867f50f6b52aa7f2a1574e327ae266d55cb3baf"} err="failed to get container status \"4b89cb8ddc30ead78576d6260867f50f6b52aa7f2a1574e327ae266d55cb3baf\": rpc error: code = NotFound desc = could not find container \"4b89cb8ddc30ead78576d6260867f50f6b52aa7f2a1574e327ae266d55cb3baf\": container with ID starting with 4b89cb8ddc30ead78576d6260867f50f6b52aa7f2a1574e327ae266d55cb3baf not found: ID does not exist" Apr 20 20:15:44.422887 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.422863 2574 scope.go:117] "RemoveContainer" containerID="95053ccab1c65877ce9e61d438738e569fa120a740ddc17867e2d3eeaaf2a22c" Apr 20 20:15:44.423114 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:15:44.423093 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95053ccab1c65877ce9e61d438738e569fa120a740ddc17867e2d3eeaaf2a22c\": container with ID starting with 95053ccab1c65877ce9e61d438738e569fa120a740ddc17867e2d3eeaaf2a22c not found: ID does not exist" containerID="95053ccab1c65877ce9e61d438738e569fa120a740ddc17867e2d3eeaaf2a22c" Apr 20 20:15:44.423179 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.423120 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95053ccab1c65877ce9e61d438738e569fa120a740ddc17867e2d3eeaaf2a22c"} err="failed to get container status \"95053ccab1c65877ce9e61d438738e569fa120a740ddc17867e2d3eeaaf2a22c\": rpc error: code = NotFound desc = could not find container 
\"95053ccab1c65877ce9e61d438738e569fa120a740ddc17867e2d3eeaaf2a22c\": container with ID starting with 95053ccab1c65877ce9e61d438738e569fa120a740ddc17867e2d3eeaaf2a22c not found: ID does not exist" Apr 20 20:15:44.423179 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.423135 2574 scope.go:117] "RemoveContainer" containerID="0f7f58564f91812118fe1231dd81380c40357171393df704d26c4bd25cfbe96e" Apr 20 20:15:44.423363 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:15:44.423343 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f7f58564f91812118fe1231dd81380c40357171393df704d26c4bd25cfbe96e\": container with ID starting with 0f7f58564f91812118fe1231dd81380c40357171393df704d26c4bd25cfbe96e not found: ID does not exist" containerID="0f7f58564f91812118fe1231dd81380c40357171393df704d26c4bd25cfbe96e" Apr 20 20:15:44.423404 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.423367 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7f58564f91812118fe1231dd81380c40357171393df704d26c4bd25cfbe96e"} err="failed to get container status \"0f7f58564f91812118fe1231dd81380c40357171393df704d26c4bd25cfbe96e\": rpc error: code = NotFound desc = could not find container \"0f7f58564f91812118fe1231dd81380c40357171393df704d26c4bd25cfbe96e\": container with ID starting with 0f7f58564f91812118fe1231dd81380c40357171393df704d26c4bd25cfbe96e not found: ID does not exist" Apr 20 20:15:44.426628 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.426611 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-52hdf"] Apr 20 20:15:44.465558 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:44.465523 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" path="/var/lib/kubelet/pods/e927b850-b5b5-4cea-862e-4e4d1d93c247/volumes" Apr 20 20:15:47.364977 
ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:47.364939 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" podUID="ddf372af-98f8-41fc-a69c-f432c159fdbe" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:15:52.364696 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:52.364647 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" podUID="ddf372af-98f8-41fc-a69c-f432c159fdbe" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:15:52.365255 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:52.364782 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" Apr 20 20:15:57.364271 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:15:57.364227 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" podUID="ddf372af-98f8-41fc-a69c-f432c159fdbe" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:16:00.406701 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.406666 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47"] Apr 20 20:16:00.407076 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.406999 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="kserve-container" Apr 20 20:16:00.407076 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.407011 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="kserve-container" Apr 20 20:16:00.407076 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.407022 2574 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="d5bf96b6-b305-4d4a-a71d-28594dd96238" containerName="switch-graph-127f1" Apr 20 20:16:00.407076 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.407028 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5bf96b6-b305-4d4a-a71d-28594dd96238" containerName="switch-graph-127f1" Apr 20 20:16:00.407076 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.407055 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="kube-rbac-proxy" Apr 20 20:16:00.407076 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.407061 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="kube-rbac-proxy" Apr 20 20:16:00.407267 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.407083 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="storage-initializer" Apr 20 20:16:00.407267 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.407089 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="storage-initializer" Apr 20 20:16:00.407267 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.407139 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5bf96b6-b305-4d4a-a71d-28594dd96238" containerName="switch-graph-127f1" Apr 20 20:16:00.407267 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.407148 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="kserve-container" Apr 20 20:16:00.407267 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.407156 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="e927b850-b5b5-4cea-862e-4e4d1d93c247" containerName="kube-rbac-proxy" Apr 20 20:16:00.409850 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.409834 2574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" Apr 20 20:16:00.412077 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.412051 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-9290e-serving-cert\"" Apr 20 20:16:00.412077 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.412055 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-9290e-kube-rbac-proxy-sar-config\"" Apr 20 20:16:00.418993 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.418965 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47"] Apr 20 20:16:00.564280 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.564246 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e06f30b-363e-4c4a-aa80-a89ac2540654-proxy-tls\") pod \"switch-graph-9290e-8cc455c84-hqz47\" (UID: \"7e06f30b-363e-4c4a-aa80-a89ac2540654\") " pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" Apr 20 20:16:00.564443 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.564314 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e06f30b-363e-4c4a-aa80-a89ac2540654-openshift-service-ca-bundle\") pod \"switch-graph-9290e-8cc455c84-hqz47\" (UID: \"7e06f30b-363e-4c4a-aa80-a89ac2540654\") " pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" Apr 20 20:16:00.665242 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.665153 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e06f30b-363e-4c4a-aa80-a89ac2540654-openshift-service-ca-bundle\") 
pod \"switch-graph-9290e-8cc455c84-hqz47\" (UID: \"7e06f30b-363e-4c4a-aa80-a89ac2540654\") " pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" Apr 20 20:16:00.665242 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.665220 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e06f30b-363e-4c4a-aa80-a89ac2540654-proxy-tls\") pod \"switch-graph-9290e-8cc455c84-hqz47\" (UID: \"7e06f30b-363e-4c4a-aa80-a89ac2540654\") " pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" Apr 20 20:16:00.665475 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:16:00.665307 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-9290e-serving-cert: secret "switch-graph-9290e-serving-cert" not found Apr 20 20:16:00.665475 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:16:00.665361 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e06f30b-363e-4c4a-aa80-a89ac2540654-proxy-tls podName:7e06f30b-363e-4c4a-aa80-a89ac2540654 nodeName:}" failed. No retries permitted until 2026-04-20 20:16:01.165345018 +0000 UTC m=+643.289038084 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7e06f30b-363e-4c4a-aa80-a89ac2540654-proxy-tls") pod "switch-graph-9290e-8cc455c84-hqz47" (UID: "7e06f30b-363e-4c4a-aa80-a89ac2540654") : secret "switch-graph-9290e-serving-cert" not found Apr 20 20:16:00.665886 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:00.665859 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e06f30b-363e-4c4a-aa80-a89ac2540654-openshift-service-ca-bundle\") pod \"switch-graph-9290e-8cc455c84-hqz47\" (UID: \"7e06f30b-363e-4c4a-aa80-a89ac2540654\") " pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" Apr 20 20:16:01.169465 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:01.169426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e06f30b-363e-4c4a-aa80-a89ac2540654-proxy-tls\") pod \"switch-graph-9290e-8cc455c84-hqz47\" (UID: \"7e06f30b-363e-4c4a-aa80-a89ac2540654\") " pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" Apr 20 20:16:01.171995 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:01.171966 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e06f30b-363e-4c4a-aa80-a89ac2540654-proxy-tls\") pod \"switch-graph-9290e-8cc455c84-hqz47\" (UID: \"7e06f30b-363e-4c4a-aa80-a89ac2540654\") " pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" Apr 20 20:16:01.321006 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:01.320965 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" Apr 20 20:16:01.445215 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:01.445187 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47"] Apr 20 20:16:01.447757 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:16:01.447729 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e06f30b_363e_4c4a_aa80_a89ac2540654.slice/crio-b084e03bf3ffb845c9898c339c56139565080ce13137fabb55a086238a7b7a60 WatchSource:0}: Error finding container b084e03bf3ffb845c9898c339c56139565080ce13137fabb55a086238a7b7a60: Status 404 returned error can't find the container with id b084e03bf3ffb845c9898c339c56139565080ce13137fabb55a086238a7b7a60 Apr 20 20:16:01.455783 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:01.455760 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" event={"ID":"7e06f30b-363e-4c4a-aa80-a89ac2540654","Type":"ContainerStarted","Data":"b084e03bf3ffb845c9898c339c56139565080ce13137fabb55a086238a7b7a60"} Apr 20 20:16:02.364477 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:02.364437 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" podUID="ddf372af-98f8-41fc-a69c-f432c159fdbe" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:16:02.459713 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:02.459676 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" event={"ID":"7e06f30b-363e-4c4a-aa80-a89ac2540654","Type":"ContainerStarted","Data":"3eb1ef3ffb7d197512277a39328900ceac98decb84bb5a3149544701bcc0546d"} Apr 20 20:16:02.460163 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:02.459791 2574 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" Apr 20 20:16:02.478813 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:02.478764 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" podStartSLOduration=2.4787510360000002 podStartE2EDuration="2.478751036s" podCreationTimestamp="2026-04-20 20:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:16:02.47692866 +0000 UTC m=+644.600621751" watchObservedRunningTime="2026-04-20 20:16:02.478751036 +0000 UTC m=+644.602444124" Apr 20 20:16:07.364294 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:07.364248 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" podUID="ddf372af-98f8-41fc-a69c-f432c159fdbe" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:16:08.468576 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:08.468550 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" Apr 20 20:16:10.275238 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.275215 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" Apr 20 20:16:10.443909 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.443831 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddf372af-98f8-41fc-a69c-f432c159fdbe-proxy-tls\") pod \"ddf372af-98f8-41fc-a69c-f432c159fdbe\" (UID: \"ddf372af-98f8-41fc-a69c-f432c159fdbe\") " Apr 20 20:16:10.443909 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.443878 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddf372af-98f8-41fc-a69c-f432c159fdbe-openshift-service-ca-bundle\") pod \"ddf372af-98f8-41fc-a69c-f432c159fdbe\" (UID: \"ddf372af-98f8-41fc-a69c-f432c159fdbe\") " Apr 20 20:16:10.444299 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.444269 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddf372af-98f8-41fc-a69c-f432c159fdbe-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "ddf372af-98f8-41fc-a69c-f432c159fdbe" (UID: "ddf372af-98f8-41fc-a69c-f432c159fdbe"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:16:10.446047 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.446011 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf372af-98f8-41fc-a69c-f432c159fdbe-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ddf372af-98f8-41fc-a69c-f432c159fdbe" (UID: "ddf372af-98f8-41fc-a69c-f432c159fdbe"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:16:10.482626 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.482596 2574 generic.go:358] "Generic (PLEG): container finished" podID="ddf372af-98f8-41fc-a69c-f432c159fdbe" containerID="632175ac53a98efb1cb42bae4ad207c58827eb2503ffa5402b7918168d44c82d" exitCode=0 Apr 20 20:16:10.482771 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.482663 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" Apr 20 20:16:10.482771 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.482680 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" event={"ID":"ddf372af-98f8-41fc-a69c-f432c159fdbe","Type":"ContainerDied","Data":"632175ac53a98efb1cb42bae4ad207c58827eb2503ffa5402b7918168d44c82d"} Apr 20 20:16:10.482771 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.482716 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-76487779f8-46tnx" event={"ID":"ddf372af-98f8-41fc-a69c-f432c159fdbe","Type":"ContainerDied","Data":"45c4a7faf77ed803c41ae11ac0ec0ceeabc28aeb227f3fa15aaa705ea684ed7c"} Apr 20 20:16:10.482771 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.482731 2574 scope.go:117] "RemoveContainer" containerID="632175ac53a98efb1cb42bae4ad207c58827eb2503ffa5402b7918168d44c82d" Apr 20 20:16:10.490930 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.490913 2574 scope.go:117] "RemoveContainer" containerID="632175ac53a98efb1cb42bae4ad207c58827eb2503ffa5402b7918168d44c82d" Apr 20 20:16:10.491238 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:16:10.491215 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632175ac53a98efb1cb42bae4ad207c58827eb2503ffa5402b7918168d44c82d\": container with ID starting with 
632175ac53a98efb1cb42bae4ad207c58827eb2503ffa5402b7918168d44c82d not found: ID does not exist" containerID="632175ac53a98efb1cb42bae4ad207c58827eb2503ffa5402b7918168d44c82d" Apr 20 20:16:10.491309 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.491249 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632175ac53a98efb1cb42bae4ad207c58827eb2503ffa5402b7918168d44c82d"} err="failed to get container status \"632175ac53a98efb1cb42bae4ad207c58827eb2503ffa5402b7918168d44c82d\": rpc error: code = NotFound desc = could not find container \"632175ac53a98efb1cb42bae4ad207c58827eb2503ffa5402b7918168d44c82d\": container with ID starting with 632175ac53a98efb1cb42bae4ad207c58827eb2503ffa5402b7918168d44c82d not found: ID does not exist" Apr 20 20:16:10.498014 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.497992 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-76487779f8-46tnx"] Apr 20 20:16:10.501173 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.501152 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-76487779f8-46tnx"] Apr 20 20:16:10.545049 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.545005 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ddf372af-98f8-41fc-a69c-f432c159fdbe-proxy-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:16:10.545147 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:10.545056 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddf372af-98f8-41fc-a69c-f432c159fdbe-openshift-service-ca-bundle\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:16:12.465019 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:12.464976 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ddf372af-98f8-41fc-a69c-f432c159fdbe" path="/var/lib/kubelet/pods/ddf372af-98f8-41fc-a69c-f432c159fdbe/volumes" Apr 20 20:16:40.335921 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.335883 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549"] Apr 20 20:16:40.336418 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.336402 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddf372af-98f8-41fc-a69c-f432c159fdbe" containerName="model-chainer" Apr 20 20:16:40.336461 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.336421 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf372af-98f8-41fc-a69c-f432c159fdbe" containerName="model-chainer" Apr 20 20:16:40.336503 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.336491 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ddf372af-98f8-41fc-a69c-f432c159fdbe" containerName="model-chainer" Apr 20 20:16:40.339910 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.339892 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" Apr 20 20:16:40.342139 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.342106 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-e6f34-serving-cert\"" Apr 20 20:16:40.342250 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.342180 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-e6f34-kube-rbac-proxy-sar-config\"" Apr 20 20:16:40.345549 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.345522 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549"] Apr 20 20:16:40.380758 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.380730 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ab0dd56-0bd3-4617-952f-bcd860eceee1-openshift-service-ca-bundle\") pod \"sequence-graph-e6f34-55f69b6976-9c549\" (UID: \"9ab0dd56-0bd3-4617-952f-bcd860eceee1\") " pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" Apr 20 20:16:40.380913 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.380796 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ab0dd56-0bd3-4617-952f-bcd860eceee1-proxy-tls\") pod \"sequence-graph-e6f34-55f69b6976-9c549\" (UID: \"9ab0dd56-0bd3-4617-952f-bcd860eceee1\") " pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" Apr 20 20:16:40.481339 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.481304 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ab0dd56-0bd3-4617-952f-bcd860eceee1-proxy-tls\") pod \"sequence-graph-e6f34-55f69b6976-9c549\" (UID: 
\"9ab0dd56-0bd3-4617-952f-bcd860eceee1\") " pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" Apr 20 20:16:40.481506 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.481419 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ab0dd56-0bd3-4617-952f-bcd860eceee1-openshift-service-ca-bundle\") pod \"sequence-graph-e6f34-55f69b6976-9c549\" (UID: \"9ab0dd56-0bd3-4617-952f-bcd860eceee1\") " pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" Apr 20 20:16:40.482007 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.481981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ab0dd56-0bd3-4617-952f-bcd860eceee1-openshift-service-ca-bundle\") pod \"sequence-graph-e6f34-55f69b6976-9c549\" (UID: \"9ab0dd56-0bd3-4617-952f-bcd860eceee1\") " pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" Apr 20 20:16:40.483707 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.483689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ab0dd56-0bd3-4617-952f-bcd860eceee1-proxy-tls\") pod \"sequence-graph-e6f34-55f69b6976-9c549\" (UID: \"9ab0dd56-0bd3-4617-952f-bcd860eceee1\") " pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" Apr 20 20:16:40.651264 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.651183 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" Apr 20 20:16:40.767723 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:40.767695 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549"] Apr 20 20:16:40.770282 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:16:40.770251 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ab0dd56_0bd3_4617_952f_bcd860eceee1.slice/crio-552c2659196ebd1b2061cda2e608145916cda29ba73a40b3d7199fa7d37e73c0 WatchSource:0}: Error finding container 552c2659196ebd1b2061cda2e608145916cda29ba73a40b3d7199fa7d37e73c0: Status 404 returned error can't find the container with id 552c2659196ebd1b2061cda2e608145916cda29ba73a40b3d7199fa7d37e73c0 Apr 20 20:16:41.577907 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:41.577863 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" event={"ID":"9ab0dd56-0bd3-4617-952f-bcd860eceee1","Type":"ContainerStarted","Data":"6554dbd2f74495720c95f6fb5366810c3eb332659e51c899dc2feb081e63a3b5"} Apr 20 20:16:41.577907 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:41.577904 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" event={"ID":"9ab0dd56-0bd3-4617-952f-bcd860eceee1","Type":"ContainerStarted","Data":"552c2659196ebd1b2061cda2e608145916cda29ba73a40b3d7199fa7d37e73c0"} Apr 20 20:16:41.578330 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:41.577927 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" Apr 20 20:16:41.593268 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:41.593125 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" 
podStartSLOduration=1.593107228 podStartE2EDuration="1.593107228s" podCreationTimestamp="2026-04-20 20:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:16:41.592440671 +0000 UTC m=+683.716133760" watchObservedRunningTime="2026-04-20 20:16:41.593107228 +0000 UTC m=+683.716800316" Apr 20 20:16:47.586589 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:16:47.586557 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" Apr 20 20:20:18.384978 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:20:18.384943 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log" Apr 20 20:20:18.386231 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:20:18.386201 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log" Apr 20 20:20:18.389670 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:20:18.389649 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log" Apr 20 20:20:18.390730 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:20:18.390711 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log" Apr 20 20:24:15.209810 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:15.209778 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47"] Apr 20 20:24:15.210321 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:15.210017 2574 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" podUID="7e06f30b-363e-4c4a-aa80-a89ac2540654" containerName="switch-graph-9290e" containerID="cri-o://3eb1ef3ffb7d197512277a39328900ceac98decb84bb5a3149544701bcc0546d" gracePeriod=30 Apr 20 20:24:18.466523 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:18.466489 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" podUID="7e06f30b-363e-4c4a-aa80-a89ac2540654" containerName="switch-graph-9290e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:24:23.468232 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:23.468192 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" podUID="7e06f30b-363e-4c4a-aa80-a89ac2540654" containerName="switch-graph-9290e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:24:28.466425 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:28.466346 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" podUID="7e06f30b-363e-4c4a-aa80-a89ac2540654" containerName="switch-graph-9290e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:24:28.466821 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:28.466453 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" Apr 20 20:24:33.466973 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:33.466926 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" podUID="7e06f30b-363e-4c4a-aa80-a89ac2540654" containerName="switch-graph-9290e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:24:38.467367 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:38.467329 2574 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" podUID="7e06f30b-363e-4c4a-aa80-a89ac2540654" containerName="switch-graph-9290e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:24:43.466280 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:43.466235 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" podUID="7e06f30b-363e-4c4a-aa80-a89ac2540654" containerName="switch-graph-9290e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:24:45.360763 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:45.360733 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" Apr 20 20:24:45.424658 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:45.424628 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e06f30b-363e-4c4a-aa80-a89ac2540654-proxy-tls\") pod \"7e06f30b-363e-4c4a-aa80-a89ac2540654\" (UID: \"7e06f30b-363e-4c4a-aa80-a89ac2540654\") " Apr 20 20:24:45.424826 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:45.424711 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e06f30b-363e-4c4a-aa80-a89ac2540654-openshift-service-ca-bundle\") pod \"7e06f30b-363e-4c4a-aa80-a89ac2540654\" (UID: \"7e06f30b-363e-4c4a-aa80-a89ac2540654\") " Apr 20 20:24:45.425053 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:45.425005 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e06f30b-363e-4c4a-aa80-a89ac2540654-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "7e06f30b-363e-4c4a-aa80-a89ac2540654" (UID: "7e06f30b-363e-4c4a-aa80-a89ac2540654"). 
InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:24:45.426766 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:45.426738 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e06f30b-363e-4c4a-aa80-a89ac2540654-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7e06f30b-363e-4c4a-aa80-a89ac2540654" (UID: "7e06f30b-363e-4c4a-aa80-a89ac2540654"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:24:45.525700 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:45.525674 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e06f30b-363e-4c4a-aa80-a89ac2540654-proxy-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:24:45.525700 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:45.525701 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e06f30b-363e-4c4a-aa80-a89ac2540654-openshift-service-ca-bundle\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:24:46.014369 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:46.014290 2574 generic.go:358] "Generic (PLEG): container finished" podID="7e06f30b-363e-4c4a-aa80-a89ac2540654" containerID="3eb1ef3ffb7d197512277a39328900ceac98decb84bb5a3149544701bcc0546d" exitCode=0 Apr 20 20:24:46.014369 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:46.014354 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" Apr 20 20:24:46.014637 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:46.014380 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" event={"ID":"7e06f30b-363e-4c4a-aa80-a89ac2540654","Type":"ContainerDied","Data":"3eb1ef3ffb7d197512277a39328900ceac98decb84bb5a3149544701bcc0546d"} Apr 20 20:24:46.014637 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:46.014419 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47" event={"ID":"7e06f30b-363e-4c4a-aa80-a89ac2540654","Type":"ContainerDied","Data":"b084e03bf3ffb845c9898c339c56139565080ce13137fabb55a086238a7b7a60"} Apr 20 20:24:46.014637 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:46.014435 2574 scope.go:117] "RemoveContainer" containerID="3eb1ef3ffb7d197512277a39328900ceac98decb84bb5a3149544701bcc0546d" Apr 20 20:24:46.023178 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:46.023162 2574 scope.go:117] "RemoveContainer" containerID="3eb1ef3ffb7d197512277a39328900ceac98decb84bb5a3149544701bcc0546d" Apr 20 20:24:46.023422 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:24:46.023408 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eb1ef3ffb7d197512277a39328900ceac98decb84bb5a3149544701bcc0546d\": container with ID starting with 3eb1ef3ffb7d197512277a39328900ceac98decb84bb5a3149544701bcc0546d not found: ID does not exist" containerID="3eb1ef3ffb7d197512277a39328900ceac98decb84bb5a3149544701bcc0546d" Apr 20 20:24:46.023460 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:46.023430 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb1ef3ffb7d197512277a39328900ceac98decb84bb5a3149544701bcc0546d"} err="failed to get container status 
\"3eb1ef3ffb7d197512277a39328900ceac98decb84bb5a3149544701bcc0546d\": rpc error: code = NotFound desc = could not find container \"3eb1ef3ffb7d197512277a39328900ceac98decb84bb5a3149544701bcc0546d\": container with ID starting with 3eb1ef3ffb7d197512277a39328900ceac98decb84bb5a3149544701bcc0546d not found: ID does not exist" Apr 20 20:24:46.034426 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:46.034406 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47"] Apr 20 20:24:46.037543 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:46.037523 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9290e-8cc455c84-hqz47"] Apr 20 20:24:46.465191 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:46.465159 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e06f30b-363e-4c4a-aa80-a89ac2540654" path="/var/lib/kubelet/pods/7e06f30b-363e-4c4a-aa80-a89ac2540654/volumes" Apr 20 20:24:54.990419 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:54.990386 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549"] Apr 20 20:24:54.990803 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:54.990613 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" podUID="9ab0dd56-0bd3-4617-952f-bcd860eceee1" containerName="sequence-graph-e6f34" containerID="cri-o://6554dbd2f74495720c95f6fb5366810c3eb332659e51c899dc2feb081e63a3b5" gracePeriod=30 Apr 20 20:24:57.585943 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:24:57.585899 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" podUID="9ab0dd56-0bd3-4617-952f-bcd860eceee1" containerName="sequence-graph-e6f34" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:25:02.584885 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:25:02.584843 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" podUID="9ab0dd56-0bd3-4617-952f-bcd860eceee1" containerName="sequence-graph-e6f34" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:25:07.585600 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:07.585554 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" podUID="9ab0dd56-0bd3-4617-952f-bcd860eceee1" containerName="sequence-graph-e6f34" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:25:07.586093 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:07.585661 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" Apr 20 20:25:12.585672 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:12.585636 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" podUID="9ab0dd56-0bd3-4617-952f-bcd860eceee1" containerName="sequence-graph-e6f34" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:25:15.446075 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:15.446019 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph"] Apr 20 20:25:15.446440 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:15.446409 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e06f30b-363e-4c4a-aa80-a89ac2540654" containerName="switch-graph-9290e" Apr 20 20:25:15.446440 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:15.446422 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e06f30b-363e-4c4a-aa80-a89ac2540654" containerName="switch-graph-9290e" Apr 20 20:25:15.446510 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:15.446489 2574 
memory_manager.go:356] "RemoveStaleState removing state" podUID="7e06f30b-363e-4c4a-aa80-a89ac2540654" containerName="switch-graph-9290e" Apr 20 20:25:15.449348 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:15.449331 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" Apr 20 20:25:15.451649 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:15.451628 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-7541f-serving-cert\"" Apr 20 20:25:15.451798 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:15.451683 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-7541f-kube-rbac-proxy-sar-config\"" Apr 20 20:25:15.459151 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:15.459125 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph"] Apr 20 20:25:15.475679 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:15.475649 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/381ce4b9-2e11-4621-b3fa-65b4ff7e895f-proxy-tls\") pod \"ensemble-graph-7541f-6b5b9965d4-lj2ph\" (UID: \"381ce4b9-2e11-4621-b3fa-65b4ff7e895f\") " pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" Apr 20 20:25:15.475817 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:15.475699 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/381ce4b9-2e11-4621-b3fa-65b4ff7e895f-openshift-service-ca-bundle\") pod \"ensemble-graph-7541f-6b5b9965d4-lj2ph\" (UID: \"381ce4b9-2e11-4621-b3fa-65b4ff7e895f\") " pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" Apr 20 20:25:15.577003 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:25:15.576971 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/381ce4b9-2e11-4621-b3fa-65b4ff7e895f-openshift-service-ca-bundle\") pod \"ensemble-graph-7541f-6b5b9965d4-lj2ph\" (UID: \"381ce4b9-2e11-4621-b3fa-65b4ff7e895f\") " pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" Apr 20 20:25:15.577202 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:15.577093 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/381ce4b9-2e11-4621-b3fa-65b4ff7e895f-proxy-tls\") pod \"ensemble-graph-7541f-6b5b9965d4-lj2ph\" (UID: \"381ce4b9-2e11-4621-b3fa-65b4ff7e895f\") " pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" Apr 20 20:25:15.577265 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:25:15.577248 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-7541f-serving-cert: secret "ensemble-graph-7541f-serving-cert" not found Apr 20 20:25:15.577333 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:25:15.577321 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/381ce4b9-2e11-4621-b3fa-65b4ff7e895f-proxy-tls podName:381ce4b9-2e11-4621-b3fa-65b4ff7e895f nodeName:}" failed. No retries permitted until 2026-04-20 20:25:16.07730502 +0000 UTC m=+1198.200998092 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/381ce4b9-2e11-4621-b3fa-65b4ff7e895f-proxy-tls") pod "ensemble-graph-7541f-6b5b9965d4-lj2ph" (UID: "381ce4b9-2e11-4621-b3fa-65b4ff7e895f") : secret "ensemble-graph-7541f-serving-cert" not found Apr 20 20:25:15.577648 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:15.577629 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/381ce4b9-2e11-4621-b3fa-65b4ff7e895f-openshift-service-ca-bundle\") pod \"ensemble-graph-7541f-6b5b9965d4-lj2ph\" (UID: \"381ce4b9-2e11-4621-b3fa-65b4ff7e895f\") " pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" Apr 20 20:25:16.080555 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:16.080507 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/381ce4b9-2e11-4621-b3fa-65b4ff7e895f-proxy-tls\") pod \"ensemble-graph-7541f-6b5b9965d4-lj2ph\" (UID: \"381ce4b9-2e11-4621-b3fa-65b4ff7e895f\") " pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" Apr 20 20:25:16.083014 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:16.082994 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/381ce4b9-2e11-4621-b3fa-65b4ff7e895f-proxy-tls\") pod \"ensemble-graph-7541f-6b5b9965d4-lj2ph\" (UID: \"381ce4b9-2e11-4621-b3fa-65b4ff7e895f\") " pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" Apr 20 20:25:16.360943 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:16.360848 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" Apr 20 20:25:16.480351 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:16.480329 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph"] Apr 20 20:25:16.483300 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:25:16.483270 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod381ce4b9_2e11_4621_b3fa_65b4ff7e895f.slice/crio-d43f13cb4020f1b07c81c40a2767199daed35962b58b79ac377cef66a449e0b6 WatchSource:0}: Error finding container d43f13cb4020f1b07c81c40a2767199daed35962b58b79ac377cef66a449e0b6: Status 404 returned error can't find the container with id d43f13cb4020f1b07c81c40a2767199daed35962b58b79ac377cef66a449e0b6 Apr 20 20:25:16.485367 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:16.485351 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:25:17.116333 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:17.116289 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" event={"ID":"381ce4b9-2e11-4621-b3fa-65b4ff7e895f","Type":"ContainerStarted","Data":"6ea99bd93f089f0571b05387e3afa5f3712400cf21dc2cff0816ed115184baa3"} Apr 20 20:25:17.116333 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:17.116327 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" event={"ID":"381ce4b9-2e11-4621-b3fa-65b4ff7e895f","Type":"ContainerStarted","Data":"d43f13cb4020f1b07c81c40a2767199daed35962b58b79ac377cef66a449e0b6"} Apr 20 20:25:17.116660 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:17.116363 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" Apr 20 20:25:17.133392 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:25:17.133344 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" podStartSLOduration=2.133332029 podStartE2EDuration="2.133332029s" podCreationTimestamp="2026-04-20 20:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:25:17.131589147 +0000 UTC m=+1199.255282229" watchObservedRunningTime="2026-04-20 20:25:17.133332029 +0000 UTC m=+1199.257025116" Apr 20 20:25:17.584995 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:17.584952 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" podUID="9ab0dd56-0bd3-4617-952f-bcd860eceee1" containerName="sequence-graph-e6f34" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:25:18.407448 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:18.407412 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log" Apr 20 20:25:18.409208 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:18.409186 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log" Apr 20 20:25:18.412160 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:18.412136 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log" Apr 20 20:25:18.413970 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:18.413943 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log" Apr 20 20:25:22.585434 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:25:22.585393 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" podUID="9ab0dd56-0bd3-4617-952f-bcd860eceee1" containerName="sequence-graph-e6f34" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:25:23.125948 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:23.125920 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" Apr 20 20:25:25.132364 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.132335 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" Apr 20 20:25:25.143260 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.143232 2574 generic.go:358] "Generic (PLEG): container finished" podID="9ab0dd56-0bd3-4617-952f-bcd860eceee1" containerID="6554dbd2f74495720c95f6fb5366810c3eb332659e51c899dc2feb081e63a3b5" exitCode=0 Apr 20 20:25:25.143386 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.143293 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" Apr 20 20:25:25.143386 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.143305 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" event={"ID":"9ab0dd56-0bd3-4617-952f-bcd860eceee1","Type":"ContainerDied","Data":"6554dbd2f74495720c95f6fb5366810c3eb332659e51c899dc2feb081e63a3b5"} Apr 20 20:25:25.143386 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.143347 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549" event={"ID":"9ab0dd56-0bd3-4617-952f-bcd860eceee1","Type":"ContainerDied","Data":"552c2659196ebd1b2061cda2e608145916cda29ba73a40b3d7199fa7d37e73c0"} Apr 20 20:25:25.143386 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.143364 2574 scope.go:117] "RemoveContainer" containerID="6554dbd2f74495720c95f6fb5366810c3eb332659e51c899dc2feb081e63a3b5" Apr 20 20:25:25.151814 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.151795 2574 scope.go:117] "RemoveContainer" containerID="6554dbd2f74495720c95f6fb5366810c3eb332659e51c899dc2feb081e63a3b5" Apr 20 20:25:25.152396 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:25:25.152369 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6554dbd2f74495720c95f6fb5366810c3eb332659e51c899dc2feb081e63a3b5\": container with ID starting with 6554dbd2f74495720c95f6fb5366810c3eb332659e51c899dc2feb081e63a3b5 not found: ID does not exist" containerID="6554dbd2f74495720c95f6fb5366810c3eb332659e51c899dc2feb081e63a3b5" Apr 20 20:25:25.152501 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.152401 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6554dbd2f74495720c95f6fb5366810c3eb332659e51c899dc2feb081e63a3b5"} err="failed to get container status 
\"6554dbd2f74495720c95f6fb5366810c3eb332659e51c899dc2feb081e63a3b5\": rpc error: code = NotFound desc = could not find container \"6554dbd2f74495720c95f6fb5366810c3eb332659e51c899dc2feb081e63a3b5\": container with ID starting with 6554dbd2f74495720c95f6fb5366810c3eb332659e51c899dc2feb081e63a3b5 not found: ID does not exist" Apr 20 20:25:25.154672 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.154651 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ab0dd56-0bd3-4617-952f-bcd860eceee1-proxy-tls\") pod \"9ab0dd56-0bd3-4617-952f-bcd860eceee1\" (UID: \"9ab0dd56-0bd3-4617-952f-bcd860eceee1\") " Apr 20 20:25:25.154764 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.154687 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ab0dd56-0bd3-4617-952f-bcd860eceee1-openshift-service-ca-bundle\") pod \"9ab0dd56-0bd3-4617-952f-bcd860eceee1\" (UID: \"9ab0dd56-0bd3-4617-952f-bcd860eceee1\") " Apr 20 20:25:25.155114 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.155094 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ab0dd56-0bd3-4617-952f-bcd860eceee1-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "9ab0dd56-0bd3-4617-952f-bcd860eceee1" (UID: "9ab0dd56-0bd3-4617-952f-bcd860eceee1"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:25:25.156837 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.156817 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab0dd56-0bd3-4617-952f-bcd860eceee1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9ab0dd56-0bd3-4617-952f-bcd860eceee1" (UID: "9ab0dd56-0bd3-4617-952f-bcd860eceee1"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:25:25.255562 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.255504 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ab0dd56-0bd3-4617-952f-bcd860eceee1-proxy-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:25:25.255562 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.255530 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ab0dd56-0bd3-4617-952f-bcd860eceee1-openshift-service-ca-bundle\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:25:25.464257 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.464222 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549"] Apr 20 20:25:25.468542 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.468512 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e6f34-55f69b6976-9c549"] Apr 20 20:25:25.543480 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.543449 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph"] Apr 20 20:25:25.543721 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:25.543700 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" podUID="381ce4b9-2e11-4621-b3fa-65b4ff7e895f" containerName="ensemble-graph-7541f" containerID="cri-o://6ea99bd93f089f0571b05387e3afa5f3712400cf21dc2cff0816ed115184baa3" gracePeriod=30 Apr 20 20:25:26.465253 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:26.465220 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ab0dd56-0bd3-4617-952f-bcd860eceee1" path="/var/lib/kubelet/pods/9ab0dd56-0bd3-4617-952f-bcd860eceee1/volumes" Apr 20 20:25:28.123595 
ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:28.123555 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" podUID="381ce4b9-2e11-4621-b3fa-65b4ff7e895f" containerName="ensemble-graph-7541f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:25:33.123733 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:33.123692 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" podUID="381ce4b9-2e11-4621-b3fa-65b4ff7e895f" containerName="ensemble-graph-7541f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:25:38.124316 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:38.124276 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" podUID="381ce4b9-2e11-4621-b3fa-65b4ff7e895f" containerName="ensemble-graph-7541f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:25:38.124682 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:38.124374 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" Apr 20 20:25:43.125369 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:43.125325 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" podUID="381ce4b9-2e11-4621-b3fa-65b4ff7e895f" containerName="ensemble-graph-7541f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:25:48.124398 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:48.124356 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" podUID="381ce4b9-2e11-4621-b3fa-65b4ff7e895f" containerName="ensemble-graph-7541f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:25:53.123903 
ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:53.123865 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" podUID="381ce4b9-2e11-4621-b3fa-65b4ff7e895f" containerName="ensemble-graph-7541f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:25:55.186896 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.186861 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2"] Apr 20 20:25:55.187365 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.187211 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ab0dd56-0bd3-4617-952f-bcd860eceee1" containerName="sequence-graph-e6f34" Apr 20 20:25:55.187365 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.187224 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab0dd56-0bd3-4617-952f-bcd860eceee1" containerName="sequence-graph-e6f34" Apr 20 20:25:55.187365 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.187286 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ab0dd56-0bd3-4617-952f-bcd860eceee1" containerName="sequence-graph-e6f34" Apr 20 20:25:55.190187 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.190170 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" Apr 20 20:25:55.192431 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.192409 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-63de8-serving-cert\"" Apr 20 20:25:55.192523 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.192474 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-63de8-kube-rbac-proxy-sar-config\"" Apr 20 20:25:55.197642 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.197266 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2"] Apr 20 20:25:55.317468 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.317430 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/526ea70f-30fa-4e11-9f6f-f13da9aef4e5-proxy-tls\") pod \"sequence-graph-63de8-85847f4f4d-lglk2\" (UID: \"526ea70f-30fa-4e11-9f6f-f13da9aef4e5\") " pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" Apr 20 20:25:55.317649 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.317486 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/526ea70f-30fa-4e11-9f6f-f13da9aef4e5-openshift-service-ca-bundle\") pod \"sequence-graph-63de8-85847f4f4d-lglk2\" (UID: \"526ea70f-30fa-4e11-9f6f-f13da9aef4e5\") " pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" Apr 20 20:25:55.418265 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.418225 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/526ea70f-30fa-4e11-9f6f-f13da9aef4e5-proxy-tls\") pod \"sequence-graph-63de8-85847f4f4d-lglk2\" (UID: 
\"526ea70f-30fa-4e11-9f6f-f13da9aef4e5\") " pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" Apr 20 20:25:55.418437 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.418312 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/526ea70f-30fa-4e11-9f6f-f13da9aef4e5-openshift-service-ca-bundle\") pod \"sequence-graph-63de8-85847f4f4d-lglk2\" (UID: \"526ea70f-30fa-4e11-9f6f-f13da9aef4e5\") " pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" Apr 20 20:25:55.418437 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:25:55.418398 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-63de8-serving-cert: secret "sequence-graph-63de8-serving-cert" not found Apr 20 20:25:55.418549 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:25:55.418461 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/526ea70f-30fa-4e11-9f6f-f13da9aef4e5-proxy-tls podName:526ea70f-30fa-4e11-9f6f-f13da9aef4e5 nodeName:}" failed. No retries permitted until 2026-04-20 20:25:55.918446416 +0000 UTC m=+1238.042139481 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/526ea70f-30fa-4e11-9f6f-f13da9aef4e5-proxy-tls") pod "sequence-graph-63de8-85847f4f4d-lglk2" (UID: "526ea70f-30fa-4e11-9f6f-f13da9aef4e5") : secret "sequence-graph-63de8-serving-cert" not found Apr 20 20:25:55.418904 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.418886 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/526ea70f-30fa-4e11-9f6f-f13da9aef4e5-openshift-service-ca-bundle\") pod \"sequence-graph-63de8-85847f4f4d-lglk2\" (UID: \"526ea70f-30fa-4e11-9f6f-f13da9aef4e5\") " pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" Apr 20 20:25:55.686794 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.686767 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" Apr 20 20:25:55.823394 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.822519 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/381ce4b9-2e11-4621-b3fa-65b4ff7e895f-proxy-tls\") pod \"381ce4b9-2e11-4621-b3fa-65b4ff7e895f\" (UID: \"381ce4b9-2e11-4621-b3fa-65b4ff7e895f\") " Apr 20 20:25:55.823394 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.822664 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/381ce4b9-2e11-4621-b3fa-65b4ff7e895f-openshift-service-ca-bundle\") pod \"381ce4b9-2e11-4621-b3fa-65b4ff7e895f\" (UID: \"381ce4b9-2e11-4621-b3fa-65b4ff7e895f\") " Apr 20 20:25:55.824073 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.823730 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/381ce4b9-2e11-4621-b3fa-65b4ff7e895f-openshift-service-ca-bundle" (OuterVolumeSpecName: 
"openshift-service-ca-bundle") pod "381ce4b9-2e11-4621-b3fa-65b4ff7e895f" (UID: "381ce4b9-2e11-4621-b3fa-65b4ff7e895f"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:25:55.827888 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.826528 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381ce4b9-2e11-4621-b3fa-65b4ff7e895f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "381ce4b9-2e11-4621-b3fa-65b4ff7e895f" (UID: "381ce4b9-2e11-4621-b3fa-65b4ff7e895f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:25:55.923767 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.923696 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/526ea70f-30fa-4e11-9f6f-f13da9aef4e5-proxy-tls\") pod \"sequence-graph-63de8-85847f4f4d-lglk2\" (UID: \"526ea70f-30fa-4e11-9f6f-f13da9aef4e5\") " pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" Apr 20 20:25:55.923925 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.923908 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/381ce4b9-2e11-4621-b3fa-65b4ff7e895f-proxy-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:25:55.923967 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.923928 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/381ce4b9-2e11-4621-b3fa-65b4ff7e895f-openshift-service-ca-bundle\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:25:55.926279 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:55.926253 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/526ea70f-30fa-4e11-9f6f-f13da9aef4e5-proxy-tls\") pod 
\"sequence-graph-63de8-85847f4f4d-lglk2\" (UID: \"526ea70f-30fa-4e11-9f6f-f13da9aef4e5\") " pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" Apr 20 20:25:56.101620 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:56.101546 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" Apr 20 20:25:56.221111 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:56.220925 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2"] Apr 20 20:25:56.223430 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:25:56.223404 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod526ea70f_30fa_4e11_9f6f_f13da9aef4e5.slice/crio-12681e905381863189d8a3b8eea9987d01af63aa299ea0d0c9e9af9b5679067e WatchSource:0}: Error finding container 12681e905381863189d8a3b8eea9987d01af63aa299ea0d0c9e9af9b5679067e: Status 404 returned error can't find the container with id 12681e905381863189d8a3b8eea9987d01af63aa299ea0d0c9e9af9b5679067e Apr 20 20:25:56.233196 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:56.233167 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" event={"ID":"526ea70f-30fa-4e11-9f6f-f13da9aef4e5","Type":"ContainerStarted","Data":"12681e905381863189d8a3b8eea9987d01af63aa299ea0d0c9e9af9b5679067e"} Apr 20 20:25:56.234212 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:56.234186 2574 generic.go:358] "Generic (PLEG): container finished" podID="381ce4b9-2e11-4621-b3fa-65b4ff7e895f" containerID="6ea99bd93f089f0571b05387e3afa5f3712400cf21dc2cff0816ed115184baa3" exitCode=0 Apr 20 20:25:56.234317 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:56.234217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" 
event={"ID":"381ce4b9-2e11-4621-b3fa-65b4ff7e895f","Type":"ContainerDied","Data":"6ea99bd93f089f0571b05387e3afa5f3712400cf21dc2cff0816ed115184baa3"} Apr 20 20:25:56.234317 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:56.234235 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" event={"ID":"381ce4b9-2e11-4621-b3fa-65b4ff7e895f","Type":"ContainerDied","Data":"d43f13cb4020f1b07c81c40a2767199daed35962b58b79ac377cef66a449e0b6"} Apr 20 20:25:56.234317 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:56.234249 2574 scope.go:117] "RemoveContainer" containerID="6ea99bd93f089f0571b05387e3afa5f3712400cf21dc2cff0816ed115184baa3" Apr 20 20:25:56.234317 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:56.234270 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph" Apr 20 20:25:56.244128 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:56.244105 2574 scope.go:117] "RemoveContainer" containerID="6ea99bd93f089f0571b05387e3afa5f3712400cf21dc2cff0816ed115184baa3" Apr 20 20:25:56.244430 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:25:56.244408 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea99bd93f089f0571b05387e3afa5f3712400cf21dc2cff0816ed115184baa3\": container with ID starting with 6ea99bd93f089f0571b05387e3afa5f3712400cf21dc2cff0816ed115184baa3 not found: ID does not exist" containerID="6ea99bd93f089f0571b05387e3afa5f3712400cf21dc2cff0816ed115184baa3" Apr 20 20:25:56.244504 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:56.244441 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea99bd93f089f0571b05387e3afa5f3712400cf21dc2cff0816ed115184baa3"} err="failed to get container status \"6ea99bd93f089f0571b05387e3afa5f3712400cf21dc2cff0816ed115184baa3\": rpc error: code = NotFound desc = 
could not find container \"6ea99bd93f089f0571b05387e3afa5f3712400cf21dc2cff0816ed115184baa3\": container with ID starting with 6ea99bd93f089f0571b05387e3afa5f3712400cf21dc2cff0816ed115184baa3 not found: ID does not exist" Apr 20 20:25:56.256198 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:56.256163 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph"] Apr 20 20:25:56.260480 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:56.260343 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-7541f-6b5b9965d4-lj2ph"] Apr 20 20:25:56.466132 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:56.466056 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="381ce4b9-2e11-4621-b3fa-65b4ff7e895f" path="/var/lib/kubelet/pods/381ce4b9-2e11-4621-b3fa-65b4ff7e895f/volumes" Apr 20 20:25:57.239947 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:57.239869 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" event={"ID":"526ea70f-30fa-4e11-9f6f-f13da9aef4e5","Type":"ContainerStarted","Data":"a742d688c6e4f2658af5d44fe0076b98733b29ac274482d99466898d17586d0b"} Apr 20 20:25:57.240330 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:57.239978 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" Apr 20 20:25:57.255268 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:25:57.255205 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" podStartSLOduration=2.255191315 podStartE2EDuration="2.255191315s" podCreationTimestamp="2026-04-20 20:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:25:57.254731403 +0000 UTC m=+1239.378424493" 
watchObservedRunningTime="2026-04-20 20:25:57.255191315 +0000 UTC m=+1239.378884404" Apr 20 20:26:03.249733 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:03.249702 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" Apr 20 20:26:05.277149 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:05.277115 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2"] Apr 20 20:26:05.277520 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:05.277329 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" podUID="526ea70f-30fa-4e11-9f6f-f13da9aef4e5" containerName="sequence-graph-63de8" containerID="cri-o://a742d688c6e4f2658af5d44fe0076b98733b29ac274482d99466898d17586d0b" gracePeriod=30 Apr 20 20:26:08.248175 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:08.248135 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" podUID="526ea70f-30fa-4e11-9f6f-f13da9aef4e5" containerName="sequence-graph-63de8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:26:13.247841 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:13.247795 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" podUID="526ea70f-30fa-4e11-9f6f-f13da9aef4e5" containerName="sequence-graph-63de8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:26:18.247196 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:18.247146 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" podUID="526ea70f-30fa-4e11-9f6f-f13da9aef4e5" containerName="sequence-graph-63de8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:26:18.247601 
ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:18.247262 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" Apr 20 20:26:23.248200 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:23.248161 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" podUID="526ea70f-30fa-4e11-9f6f-f13da9aef4e5" containerName="sequence-graph-63de8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:26:25.754976 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:25.754932 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv"] Apr 20 20:26:25.755333 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:25.755302 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="381ce4b9-2e11-4621-b3fa-65b4ff7e895f" containerName="ensemble-graph-7541f" Apr 20 20:26:25.755333 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:25.755316 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="381ce4b9-2e11-4621-b3fa-65b4ff7e895f" containerName="ensemble-graph-7541f" Apr 20 20:26:25.755420 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:25.755410 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="381ce4b9-2e11-4621-b3fa-65b4ff7e895f" containerName="ensemble-graph-7541f" Apr 20 20:26:25.758170 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:25.758151 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" Apr 20 20:26:25.760464 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:25.760435 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-38d71-serving-cert\"" Apr 20 20:26:25.760567 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:25.760448 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-38d71-kube-rbac-proxy-sar-config\"" Apr 20 20:26:25.766018 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:25.765993 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv"] Apr 20 20:26:25.858203 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:25.858170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b964d357-86f9-44af-b9b8-8b19415ff4c8-proxy-tls\") pod \"ensemble-graph-38d71-7dbc85bf6b-hh2kv\" (UID: \"b964d357-86f9-44af-b9b8-8b19415ff4c8\") " pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" Apr 20 20:26:25.858335 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:25.858217 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b964d357-86f9-44af-b9b8-8b19415ff4c8-openshift-service-ca-bundle\") pod \"ensemble-graph-38d71-7dbc85bf6b-hh2kv\" (UID: \"b964d357-86f9-44af-b9b8-8b19415ff4c8\") " pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" Apr 20 20:26:25.959729 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:25.959690 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b964d357-86f9-44af-b9b8-8b19415ff4c8-openshift-service-ca-bundle\") pod 
\"ensemble-graph-38d71-7dbc85bf6b-hh2kv\" (UID: \"b964d357-86f9-44af-b9b8-8b19415ff4c8\") " pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" Apr 20 20:26:25.959875 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:25.959805 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b964d357-86f9-44af-b9b8-8b19415ff4c8-proxy-tls\") pod \"ensemble-graph-38d71-7dbc85bf6b-hh2kv\" (UID: \"b964d357-86f9-44af-b9b8-8b19415ff4c8\") " pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" Apr 20 20:26:25.959932 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:26:25.959917 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-38d71-serving-cert: secret "ensemble-graph-38d71-serving-cert" not found Apr 20 20:26:25.959997 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:26:25.959987 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b964d357-86f9-44af-b9b8-8b19415ff4c8-proxy-tls podName:b964d357-86f9-44af-b9b8-8b19415ff4c8 nodeName:}" failed. No retries permitted until 2026-04-20 20:26:26.459966291 +0000 UTC m=+1268.583659363 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b964d357-86f9-44af-b9b8-8b19415ff4c8-proxy-tls") pod "ensemble-graph-38d71-7dbc85bf6b-hh2kv" (UID: "b964d357-86f9-44af-b9b8-8b19415ff4c8") : secret "ensemble-graph-38d71-serving-cert" not found Apr 20 20:26:25.960372 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:25.960354 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b964d357-86f9-44af-b9b8-8b19415ff4c8-openshift-service-ca-bundle\") pod \"ensemble-graph-38d71-7dbc85bf6b-hh2kv\" (UID: \"b964d357-86f9-44af-b9b8-8b19415ff4c8\") " pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" Apr 20 20:26:26.465059 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:26.465002 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b964d357-86f9-44af-b9b8-8b19415ff4c8-proxy-tls\") pod \"ensemble-graph-38d71-7dbc85bf6b-hh2kv\" (UID: \"b964d357-86f9-44af-b9b8-8b19415ff4c8\") " pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" Apr 20 20:26:26.465224 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:26:26.465165 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-38d71-serving-cert: secret "ensemble-graph-38d71-serving-cert" not found Apr 20 20:26:26.465273 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:26:26.465248 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b964d357-86f9-44af-b9b8-8b19415ff4c8-proxy-tls podName:b964d357-86f9-44af-b9b8-8b19415ff4c8 nodeName:}" failed. No retries permitted until 2026-04-20 20:26:27.465228407 +0000 UTC m=+1269.588921496 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b964d357-86f9-44af-b9b8-8b19415ff4c8-proxy-tls") pod "ensemble-graph-38d71-7dbc85bf6b-hh2kv" (UID: "b964d357-86f9-44af-b9b8-8b19415ff4c8") : secret "ensemble-graph-38d71-serving-cert" not found Apr 20 20:26:27.473314 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:27.473276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b964d357-86f9-44af-b9b8-8b19415ff4c8-proxy-tls\") pod \"ensemble-graph-38d71-7dbc85bf6b-hh2kv\" (UID: \"b964d357-86f9-44af-b9b8-8b19415ff4c8\") " pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" Apr 20 20:26:27.475795 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:27.475769 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b964d357-86f9-44af-b9b8-8b19415ff4c8-proxy-tls\") pod \"ensemble-graph-38d71-7dbc85bf6b-hh2kv\" (UID: \"b964d357-86f9-44af-b9b8-8b19415ff4c8\") " pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" Apr 20 20:26:27.568850 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:27.568811 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" Apr 20 20:26:27.692670 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:27.692644 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv"] Apr 20 20:26:28.247788 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:28.247752 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" podUID="526ea70f-30fa-4e11-9f6f-f13da9aef4e5" containerName="sequence-graph-63de8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:26:28.333988 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:28.333948 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" event={"ID":"b964d357-86f9-44af-b9b8-8b19415ff4c8","Type":"ContainerStarted","Data":"b123318ffe86dfa9d99fae214ca64a57c7370c3932b5f0939a81dadcddbe159d"} Apr 20 20:26:28.333988 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:28.333991 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" event={"ID":"b964d357-86f9-44af-b9b8-8b19415ff4c8","Type":"ContainerStarted","Data":"1a5d4e8c60de3a8b903d210485ba2023b4d844d6ab702bb64b4e08568811cdf0"} Apr 20 20:26:28.334225 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:28.334086 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" Apr 20 20:26:28.349250 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:28.349212 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" podStartSLOduration=3.34919949 podStartE2EDuration="3.34919949s" podCreationTimestamp="2026-04-20 20:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-04-20 20:26:28.347763434 +0000 UTC m=+1270.471456521" watchObservedRunningTime="2026-04-20 20:26:28.34919949 +0000 UTC m=+1270.472892580" Apr 20 20:26:33.247702 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:33.247658 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" podUID="526ea70f-30fa-4e11-9f6f-f13da9aef4e5" containerName="sequence-graph-63de8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:26:34.343927 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:34.343896 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" Apr 20 20:26:35.302528 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:26:35.302494 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod526ea70f_30fa_4e11_9f6f_f13da9aef4e5.slice/crio-conmon-a742d688c6e4f2658af5d44fe0076b98733b29ac274482d99466898d17586d0b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod526ea70f_30fa_4e11_9f6f_f13da9aef4e5.slice/crio-a742d688c6e4f2658af5d44fe0076b98733b29ac274482d99466898d17586d0b.scope\": RecentStats: unable to find data in memory cache]" Apr 20 20:26:35.302663 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:26:35.302554 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod526ea70f_30fa_4e11_9f6f_f13da9aef4e5.slice/crio-a742d688c6e4f2658af5d44fe0076b98733b29ac274482d99466898d17586d0b.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod526ea70f_30fa_4e11_9f6f_f13da9aef4e5.slice/crio-conmon-a742d688c6e4f2658af5d44fe0076b98733b29ac274482d99466898d17586d0b.scope\": RecentStats: unable to find data in memory cache]" Apr 20 20:26:35.357443 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:35.357414 2574 generic.go:358] "Generic (PLEG): container finished" podID="526ea70f-30fa-4e11-9f6f-f13da9aef4e5" containerID="a742d688c6e4f2658af5d44fe0076b98733b29ac274482d99466898d17586d0b" exitCode=0 Apr 20 20:26:35.357776 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:35.357486 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" event={"ID":"526ea70f-30fa-4e11-9f6f-f13da9aef4e5","Type":"ContainerDied","Data":"a742d688c6e4f2658af5d44fe0076b98733b29ac274482d99466898d17586d0b"} Apr 20 20:26:35.417171 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:35.417145 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" Apr 20 20:26:35.539418 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:35.539346 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/526ea70f-30fa-4e11-9f6f-f13da9aef4e5-proxy-tls\") pod \"526ea70f-30fa-4e11-9f6f-f13da9aef4e5\" (UID: \"526ea70f-30fa-4e11-9f6f-f13da9aef4e5\") " Apr 20 20:26:35.539418 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:35.539400 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/526ea70f-30fa-4e11-9f6f-f13da9aef4e5-openshift-service-ca-bundle\") pod \"526ea70f-30fa-4e11-9f6f-f13da9aef4e5\" (UID: \"526ea70f-30fa-4e11-9f6f-f13da9aef4e5\") " Apr 20 20:26:35.539765 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:35.539743 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/526ea70f-30fa-4e11-9f6f-f13da9aef4e5-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "526ea70f-30fa-4e11-9f6f-f13da9aef4e5" (UID: "526ea70f-30fa-4e11-9f6f-f13da9aef4e5"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:26:35.541522 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:35.541502 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/526ea70f-30fa-4e11-9f6f-f13da9aef4e5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "526ea70f-30fa-4e11-9f6f-f13da9aef4e5" (UID: "526ea70f-30fa-4e11-9f6f-f13da9aef4e5"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:26:35.640742 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:35.640707 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/526ea70f-30fa-4e11-9f6f-f13da9aef4e5-openshift-service-ca-bundle\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:26:35.640742 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:35.640737 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/526ea70f-30fa-4e11-9f6f-f13da9aef4e5-proxy-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:26:36.361731 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:36.361700 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" event={"ID":"526ea70f-30fa-4e11-9f6f-f13da9aef4e5","Type":"ContainerDied","Data":"12681e905381863189d8a3b8eea9987d01af63aa299ea0d0c9e9af9b5679067e"} Apr 20 20:26:36.362168 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:36.361706 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2" Apr 20 20:26:36.362168 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:36.361749 2574 scope.go:117] "RemoveContainer" containerID="a742d688c6e4f2658af5d44fe0076b98733b29ac274482d99466898d17586d0b" Apr 20 20:26:36.385713 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:36.385686 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2"] Apr 20 20:26:36.389009 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:36.388982 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-63de8-85847f4f4d-lglk2"] Apr 20 20:26:36.465289 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:26:36.465251 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="526ea70f-30fa-4e11-9f6f-f13da9aef4e5" path="/var/lib/kubelet/pods/526ea70f-30fa-4e11-9f6f-f13da9aef4e5/volumes" Apr 20 20:27:05.492123 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.492090 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq"] Apr 20 20:27:05.492600 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.492585 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="526ea70f-30fa-4e11-9f6f-f13da9aef4e5" containerName="sequence-graph-63de8" Apr 20 20:27:05.492644 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.492603 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="526ea70f-30fa-4e11-9f6f-f13da9aef4e5" containerName="sequence-graph-63de8" Apr 20 20:27:05.492711 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.492700 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="526ea70f-30fa-4e11-9f6f-f13da9aef4e5" containerName="sequence-graph-63de8" Apr 20 20:27:05.495874 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.495857 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" Apr 20 20:27:05.498272 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.498241 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-fe410-kube-rbac-proxy-sar-config\"" Apr 20 20:27:05.498272 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.498269 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-fe410-serving-cert\"" Apr 20 20:27:05.504928 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.504905 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq"] Apr 20 20:27:05.585860 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.585833 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a518fbed-cd34-47b8-8cfd-82217f9c49e2-proxy-tls\") pod \"sequence-graph-fe410-7759986577-6jbfq\" (UID: \"a518fbed-cd34-47b8-8cfd-82217f9c49e2\") " pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" Apr 20 20:27:05.585986 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.585868 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a518fbed-cd34-47b8-8cfd-82217f9c49e2-openshift-service-ca-bundle\") pod \"sequence-graph-fe410-7759986577-6jbfq\" (UID: \"a518fbed-cd34-47b8-8cfd-82217f9c49e2\") " pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" Apr 20 20:27:05.687016 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.686985 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a518fbed-cd34-47b8-8cfd-82217f9c49e2-proxy-tls\") pod \"sequence-graph-fe410-7759986577-6jbfq\" (UID: 
\"a518fbed-cd34-47b8-8cfd-82217f9c49e2\") " pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" Apr 20 20:27:05.687180 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.687071 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a518fbed-cd34-47b8-8cfd-82217f9c49e2-openshift-service-ca-bundle\") pod \"sequence-graph-fe410-7759986577-6jbfq\" (UID: \"a518fbed-cd34-47b8-8cfd-82217f9c49e2\") " pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" Apr 20 20:27:05.687745 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.687723 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a518fbed-cd34-47b8-8cfd-82217f9c49e2-openshift-service-ca-bundle\") pod \"sequence-graph-fe410-7759986577-6jbfq\" (UID: \"a518fbed-cd34-47b8-8cfd-82217f9c49e2\") " pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" Apr 20 20:27:05.689681 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.689660 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a518fbed-cd34-47b8-8cfd-82217f9c49e2-proxy-tls\") pod \"sequence-graph-fe410-7759986577-6jbfq\" (UID: \"a518fbed-cd34-47b8-8cfd-82217f9c49e2\") " pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" Apr 20 20:27:05.806356 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.806309 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" Apr 20 20:27:05.923562 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:05.923537 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq"] Apr 20 20:27:05.925657 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:27:05.925625 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda518fbed_cd34_47b8_8cfd_82217f9c49e2.slice/crio-9fee85c14dc73f0167700ebbc3ed325d503849b4e0894aec44f852d6b8a450f4 WatchSource:0}: Error finding container 9fee85c14dc73f0167700ebbc3ed325d503849b4e0894aec44f852d6b8a450f4: Status 404 returned error can't find the container with id 9fee85c14dc73f0167700ebbc3ed325d503849b4e0894aec44f852d6b8a450f4 Apr 20 20:27:06.452413 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:06.452378 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" event={"ID":"a518fbed-cd34-47b8-8cfd-82217f9c49e2","Type":"ContainerStarted","Data":"5bbea59b138f817ac2311bc718990246df1780f85c5eb7349fcbc17c70ee312d"} Apr 20 20:27:06.452413 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:06.452413 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" event={"ID":"a518fbed-cd34-47b8-8cfd-82217f9c49e2","Type":"ContainerStarted","Data":"9fee85c14dc73f0167700ebbc3ed325d503849b4e0894aec44f852d6b8a450f4"} Apr 20 20:27:06.452622 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:06.452438 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" Apr 20 20:27:06.468904 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:06.468852 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" 
podStartSLOduration=1.468836988 podStartE2EDuration="1.468836988s" podCreationTimestamp="2026-04-20 20:27:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:27:06.467058244 +0000 UTC m=+1308.590751322" watchObservedRunningTime="2026-04-20 20:27:06.468836988 +0000 UTC m=+1308.592530075" Apr 20 20:27:12.464712 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:27:12.464686 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" Apr 20 20:30:18.440346 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:30:18.440311 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log" Apr 20 20:30:18.445463 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:30:18.445438 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log" Apr 20 20:30:18.446337 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:30:18.446317 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log" Apr 20 20:30:18.450104 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:30:18.450086 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log" Apr 20 20:34:40.353298 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:34:40.353259 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv"] Apr 20 20:34:40.355621 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:34:40.353581 2574 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" podUID="b964d357-86f9-44af-b9b8-8b19415ff4c8" containerName="ensemble-graph-38d71" containerID="cri-o://b123318ffe86dfa9d99fae214ca64a57c7370c3932b5f0939a81dadcddbe159d" gracePeriod=30
Apr 20 20:34:44.341485 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:34:44.341445 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" podUID="b964d357-86f9-44af-b9b8-8b19415ff4c8" containerName="ensemble-graph-38d71" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:34:49.342551 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:34:49.342510 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" podUID="b964d357-86f9-44af-b9b8-8b19415ff4c8" containerName="ensemble-graph-38d71" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:34:54.341549 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:34:54.341508 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" podUID="b964d357-86f9-44af-b9b8-8b19415ff4c8" containerName="ensemble-graph-38d71" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:34:54.341933 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:34:54.341613 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv"
Apr 20 20:34:59.341559 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:34:59.341472 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" podUID="b964d357-86f9-44af-b9b8-8b19415ff4c8" containerName="ensemble-graph-38d71" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:35:04.341422 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:04.341377 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" podUID="b964d357-86f9-44af-b9b8-8b19415ff4c8" containerName="ensemble-graph-38d71" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:35:09.341698 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:09.341660 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" podUID="b964d357-86f9-44af-b9b8-8b19415ff4c8" containerName="ensemble-graph-38d71" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:35:10.493920 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.493896 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv"
Apr 20 20:35:10.678549 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.678456 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b964d357-86f9-44af-b9b8-8b19415ff4c8-openshift-service-ca-bundle\") pod \"b964d357-86f9-44af-b9b8-8b19415ff4c8\" (UID: \"b964d357-86f9-44af-b9b8-8b19415ff4c8\") "
Apr 20 20:35:10.678715 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.678591 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b964d357-86f9-44af-b9b8-8b19415ff4c8-proxy-tls\") pod \"b964d357-86f9-44af-b9b8-8b19415ff4c8\" (UID: \"b964d357-86f9-44af-b9b8-8b19415ff4c8\") "
Apr 20 20:35:10.678805 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.678780 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b964d357-86f9-44af-b9b8-8b19415ff4c8-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b964d357-86f9-44af-b9b8-8b19415ff4c8" (UID: "b964d357-86f9-44af-b9b8-8b19415ff4c8"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:35:10.680847 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.680823 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b964d357-86f9-44af-b9b8-8b19415ff4c8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b964d357-86f9-44af-b9b8-8b19415ff4c8" (UID: "b964d357-86f9-44af-b9b8-8b19415ff4c8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:35:10.779708 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.779672 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b964d357-86f9-44af-b9b8-8b19415ff4c8-proxy-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\""
Apr 20 20:35:10.779708 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.779706 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b964d357-86f9-44af-b9b8-8b19415ff4c8-openshift-service-ca-bundle\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\""
Apr 20 20:35:10.863842 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.863803 2574 generic.go:358] "Generic (PLEG): container finished" podID="b964d357-86f9-44af-b9b8-8b19415ff4c8" containerID="b123318ffe86dfa9d99fae214ca64a57c7370c3932b5f0939a81dadcddbe159d" exitCode=0
Apr 20 20:35:10.863982 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.863866 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv"
Apr 20 20:35:10.863982 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.863899 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" event={"ID":"b964d357-86f9-44af-b9b8-8b19415ff4c8","Type":"ContainerDied","Data":"b123318ffe86dfa9d99fae214ca64a57c7370c3932b5f0939a81dadcddbe159d"}
Apr 20 20:35:10.863982 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.863939 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv" event={"ID":"b964d357-86f9-44af-b9b8-8b19415ff4c8","Type":"ContainerDied","Data":"1a5d4e8c60de3a8b903d210485ba2023b4d844d6ab702bb64b4e08568811cdf0"}
Apr 20 20:35:10.863982 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.863956 2574 scope.go:117] "RemoveContainer" containerID="b123318ffe86dfa9d99fae214ca64a57c7370c3932b5f0939a81dadcddbe159d"
Apr 20 20:35:10.872169 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.872150 2574 scope.go:117] "RemoveContainer" containerID="b123318ffe86dfa9d99fae214ca64a57c7370c3932b5f0939a81dadcddbe159d"
Apr 20 20:35:10.872444 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:35:10.872416 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b123318ffe86dfa9d99fae214ca64a57c7370c3932b5f0939a81dadcddbe159d\": container with ID starting with b123318ffe86dfa9d99fae214ca64a57c7370c3932b5f0939a81dadcddbe159d not found: ID does not exist" containerID="b123318ffe86dfa9d99fae214ca64a57c7370c3932b5f0939a81dadcddbe159d"
Apr 20 20:35:10.872491 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.872457 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b123318ffe86dfa9d99fae214ca64a57c7370c3932b5f0939a81dadcddbe159d"} err="failed to get container status \"b123318ffe86dfa9d99fae214ca64a57c7370c3932b5f0939a81dadcddbe159d\": rpc error: code = NotFound desc = could not find container \"b123318ffe86dfa9d99fae214ca64a57c7370c3932b5f0939a81dadcddbe159d\": container with ID starting with b123318ffe86dfa9d99fae214ca64a57c7370c3932b5f0939a81dadcddbe159d not found: ID does not exist"
Apr 20 20:35:10.882975 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.882940 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv"]
Apr 20 20:35:10.888062 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:10.888020 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-38d71-7dbc85bf6b-hh2kv"]
Apr 20 20:35:12.465501 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:12.465467 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b964d357-86f9-44af-b9b8-8b19415ff4c8" path="/var/lib/kubelet/pods/b964d357-86f9-44af-b9b8-8b19415ff4c8/volumes"
Apr 20 20:35:18.464779 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:18.464747 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log"
Apr 20 20:35:18.469703 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:18.469680 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log"
Apr 20 20:35:18.470283 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:18.470264 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log"
Apr 20 20:35:18.476952 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:18.476934 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log"
Apr 20 20:35:20.193346 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:20.193309 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq"]
Apr 20 20:35:20.193699 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:20.193562 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" podUID="a518fbed-cd34-47b8-8cfd-82217f9c49e2" containerName="sequence-graph-fe410" containerID="cri-o://5bbea59b138f817ac2311bc718990246df1780f85c5eb7349fcbc17c70ee312d" gracePeriod=30
Apr 20 20:35:22.459171 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:22.459125 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" podUID="a518fbed-cd34-47b8-8cfd-82217f9c49e2" containerName="sequence-graph-fe410" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:35:27.459289 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:27.459244 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" podUID="a518fbed-cd34-47b8-8cfd-82217f9c49e2" containerName="sequence-graph-fe410" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:35:32.460022 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:32.459984 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" podUID="a518fbed-cd34-47b8-8cfd-82217f9c49e2" containerName="sequence-graph-fe410" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:35:32.460437 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:32.460113 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq"
Apr 20 20:35:37.459605 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:37.459560 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" podUID="a518fbed-cd34-47b8-8cfd-82217f9c49e2" containerName="sequence-graph-fe410" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:35:40.598641 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:40.598606 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"]
Apr 20 20:35:40.599201 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:40.599131 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b964d357-86f9-44af-b9b8-8b19415ff4c8" containerName="ensemble-graph-38d71"
Apr 20 20:35:40.599201 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:40.599153 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b964d357-86f9-44af-b9b8-8b19415ff4c8" containerName="ensemble-graph-38d71"
Apr 20 20:35:40.599382 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:40.599245 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b964d357-86f9-44af-b9b8-8b19415ff4c8" containerName="ensemble-graph-38d71"
Apr 20 20:35:40.602118 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:40.602097 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"
Apr 20 20:35:40.604245 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:40.604222 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-ab6b8-kube-rbac-proxy-sar-config\""
Apr 20 20:35:40.604377 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:40.604226 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-ab6b8-serving-cert\""
Apr 20 20:35:40.611146 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:40.611120 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"]
Apr 20 20:35:40.732079 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:40.732016 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e972aafe-9982-429a-8ec8-39b81b77fc55-openshift-service-ca-bundle\") pod \"splitter-graph-ab6b8-54bf769bd5-jnmp5\" (UID: \"e972aafe-9982-429a-8ec8-39b81b77fc55\") " pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"
Apr 20 20:35:40.732247 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:40.732216 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e972aafe-9982-429a-8ec8-39b81b77fc55-proxy-tls\") pod \"splitter-graph-ab6b8-54bf769bd5-jnmp5\" (UID: \"e972aafe-9982-429a-8ec8-39b81b77fc55\") " pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"
Apr 20 20:35:40.832796 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:40.832757 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e972aafe-9982-429a-8ec8-39b81b77fc55-proxy-tls\") pod \"splitter-graph-ab6b8-54bf769bd5-jnmp5\" (UID: \"e972aafe-9982-429a-8ec8-39b81b77fc55\") " pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"
Apr 20 20:35:40.832951 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:40.832819 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e972aafe-9982-429a-8ec8-39b81b77fc55-openshift-service-ca-bundle\") pod \"splitter-graph-ab6b8-54bf769bd5-jnmp5\" (UID: \"e972aafe-9982-429a-8ec8-39b81b77fc55\") " pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"
Apr 20 20:35:40.832951 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:35:40.832891 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-ab6b8-serving-cert: secret "splitter-graph-ab6b8-serving-cert" not found
Apr 20 20:35:40.833027 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:35:40.832965 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e972aafe-9982-429a-8ec8-39b81b77fc55-proxy-tls podName:e972aafe-9982-429a-8ec8-39b81b77fc55 nodeName:}" failed. No retries permitted until 2026-04-20 20:35:41.332948023 +0000 UTC m=+1823.456641089 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e972aafe-9982-429a-8ec8-39b81b77fc55-proxy-tls") pod "splitter-graph-ab6b8-54bf769bd5-jnmp5" (UID: "e972aafe-9982-429a-8ec8-39b81b77fc55") : secret "splitter-graph-ab6b8-serving-cert" not found
Apr 20 20:35:40.833454 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:40.833436 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e972aafe-9982-429a-8ec8-39b81b77fc55-openshift-service-ca-bundle\") pod \"splitter-graph-ab6b8-54bf769bd5-jnmp5\" (UID: \"e972aafe-9982-429a-8ec8-39b81b77fc55\") " pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"
Apr 20 20:35:41.336142 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:41.336088 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e972aafe-9982-429a-8ec8-39b81b77fc55-proxy-tls\") pod \"splitter-graph-ab6b8-54bf769bd5-jnmp5\" (UID: \"e972aafe-9982-429a-8ec8-39b81b77fc55\") " pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"
Apr 20 20:35:41.338784 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:41.338755 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e972aafe-9982-429a-8ec8-39b81b77fc55-proxy-tls\") pod \"splitter-graph-ab6b8-54bf769bd5-jnmp5\" (UID: \"e972aafe-9982-429a-8ec8-39b81b77fc55\") " pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"
Apr 20 20:35:41.513140 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:41.513102 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"
Apr 20 20:35:41.633998 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:41.633948 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"]
Apr 20 20:35:41.636809 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:35:41.636773 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode972aafe_9982_429a_8ec8_39b81b77fc55.slice/crio-471459171cd5731dd71eb9554d834810b022f855884f44d48d8660b34bead68c WatchSource:0}: Error finding container 471459171cd5731dd71eb9554d834810b022f855884f44d48d8660b34bead68c: Status 404 returned error can't find the container with id 471459171cd5731dd71eb9554d834810b022f855884f44d48d8660b34bead68c
Apr 20 20:35:41.638878 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:41.638859 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:35:41.952804 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:41.952729 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5" event={"ID":"e972aafe-9982-429a-8ec8-39b81b77fc55","Type":"ContainerStarted","Data":"7902d20a22adc5223258b919a3ea90d9164e76bd8d74c5409f6223c2ca8d44a3"}
Apr 20 20:35:41.952804 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:41.952762 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5" event={"ID":"e972aafe-9982-429a-8ec8-39b81b77fc55","Type":"ContainerStarted","Data":"471459171cd5731dd71eb9554d834810b022f855884f44d48d8660b34bead68c"}
Apr 20 20:35:41.952996 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:41.952908 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"
Apr 20 20:35:41.969837 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:41.969779 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5" podStartSLOduration=1.969762926 podStartE2EDuration="1.969762926s" podCreationTimestamp="2026-04-20 20:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:35:41.968270499 +0000 UTC m=+1824.091963588" watchObservedRunningTime="2026-04-20 20:35:41.969762926 +0000 UTC m=+1824.093456014"
Apr 20 20:35:42.459772 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:42.459735 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" podUID="a518fbed-cd34-47b8-8cfd-82217f9c49e2" containerName="sequence-graph-fe410" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:35:47.459575 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:47.459536 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" podUID="a518fbed-cd34-47b8-8cfd-82217f9c49e2" containerName="sequence-graph-fe410" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:35:47.961990 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:47.961959 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"
Apr 20 20:35:50.335592 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.335568 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq"
Apr 20 20:35:50.412300 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.412272 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a518fbed-cd34-47b8-8cfd-82217f9c49e2-proxy-tls\") pod \"a518fbed-cd34-47b8-8cfd-82217f9c49e2\" (UID: \"a518fbed-cd34-47b8-8cfd-82217f9c49e2\") "
Apr 20 20:35:50.412449 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.412317 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a518fbed-cd34-47b8-8cfd-82217f9c49e2-openshift-service-ca-bundle\") pod \"a518fbed-cd34-47b8-8cfd-82217f9c49e2\" (UID: \"a518fbed-cd34-47b8-8cfd-82217f9c49e2\") "
Apr 20 20:35:50.412696 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.412671 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a518fbed-cd34-47b8-8cfd-82217f9c49e2-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "a518fbed-cd34-47b8-8cfd-82217f9c49e2" (UID: "a518fbed-cd34-47b8-8cfd-82217f9c49e2"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:35:50.414499 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.414476 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a518fbed-cd34-47b8-8cfd-82217f9c49e2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a518fbed-cd34-47b8-8cfd-82217f9c49e2" (UID: "a518fbed-cd34-47b8-8cfd-82217f9c49e2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:35:50.513556 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.513529 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a518fbed-cd34-47b8-8cfd-82217f9c49e2-proxy-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\""
Apr 20 20:35:50.513556 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.513553 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a518fbed-cd34-47b8-8cfd-82217f9c49e2-openshift-service-ca-bundle\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\""
Apr 20 20:35:50.680054 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.680006 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"]
Apr 20 20:35:50.680301 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.680274 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5" podUID="e972aafe-9982-429a-8ec8-39b81b77fc55" containerName="splitter-graph-ab6b8" containerID="cri-o://7902d20a22adc5223258b919a3ea90d9164e76bd8d74c5409f6223c2ca8d44a3" gracePeriod=30
Apr 20 20:35:50.979071 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.979013 2574 generic.go:358] "Generic (PLEG): container finished" podID="a518fbed-cd34-47b8-8cfd-82217f9c49e2" containerID="5bbea59b138f817ac2311bc718990246df1780f85c5eb7349fcbc17c70ee312d" exitCode=0
Apr 20 20:35:50.979266 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.979085 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" event={"ID":"a518fbed-cd34-47b8-8cfd-82217f9c49e2","Type":"ContainerDied","Data":"5bbea59b138f817ac2311bc718990246df1780f85c5eb7349fcbc17c70ee312d"}
Apr 20 20:35:50.979266 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.979113 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq" event={"ID":"a518fbed-cd34-47b8-8cfd-82217f9c49e2","Type":"ContainerDied","Data":"9fee85c14dc73f0167700ebbc3ed325d503849b4e0894aec44f852d6b8a450f4"}
Apr 20 20:35:50.979266 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.979114 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq"
Apr 20 20:35:50.979266 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.979128 2574 scope.go:117] "RemoveContainer" containerID="5bbea59b138f817ac2311bc718990246df1780f85c5eb7349fcbc17c70ee312d"
Apr 20 20:35:50.987678 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.987657 2574 scope.go:117] "RemoveContainer" containerID="5bbea59b138f817ac2311bc718990246df1780f85c5eb7349fcbc17c70ee312d"
Apr 20 20:35:50.987943 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:35:50.987917 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bbea59b138f817ac2311bc718990246df1780f85c5eb7349fcbc17c70ee312d\": container with ID starting with 5bbea59b138f817ac2311bc718990246df1780f85c5eb7349fcbc17c70ee312d not found: ID does not exist" containerID="5bbea59b138f817ac2311bc718990246df1780f85c5eb7349fcbc17c70ee312d"
Apr 20 20:35:50.988058 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.987952 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bbea59b138f817ac2311bc718990246df1780f85c5eb7349fcbc17c70ee312d"} err="failed to get container status \"5bbea59b138f817ac2311bc718990246df1780f85c5eb7349fcbc17c70ee312d\": rpc error: code = NotFound desc = could not find container \"5bbea59b138f817ac2311bc718990246df1780f85c5eb7349fcbc17c70ee312d\": container with ID starting with 5bbea59b138f817ac2311bc718990246df1780f85c5eb7349fcbc17c70ee312d not found: ID does not exist"
Apr 20 20:35:50.997924 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:50.997900 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq"]
Apr 20 20:35:51.001331 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:51.001307 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-fe410-7759986577-6jbfq"]
Apr 20 20:35:52.464853 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:52.464817 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a518fbed-cd34-47b8-8cfd-82217f9c49e2" path="/var/lib/kubelet/pods/a518fbed-cd34-47b8-8cfd-82217f9c49e2/volumes"
Apr 20 20:35:52.960770 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:52.960731 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5" podUID="e972aafe-9982-429a-8ec8-39b81b77fc55" containerName="splitter-graph-ab6b8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:35:57.960074 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:35:57.960021 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5" podUID="e972aafe-9982-429a-8ec8-39b81b77fc55" containerName="splitter-graph-ab6b8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:36:02.960386 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:02.960344 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5" podUID="e972aafe-9982-429a-8ec8-39b81b77fc55" containerName="splitter-graph-ab6b8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:36:02.960749 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:02.960455 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"
Apr 20 20:36:07.965293 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:07.961228 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5" podUID="e972aafe-9982-429a-8ec8-39b81b77fc55" containerName="splitter-graph-ab6b8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:36:12.960673 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:12.960626 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5" podUID="e972aafe-9982-429a-8ec8-39b81b77fc55" containerName="splitter-graph-ab6b8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:36:17.960721 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:17.960679 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5" podUID="e972aafe-9982-429a-8ec8-39b81b77fc55" containerName="splitter-graph-ab6b8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:36:20.400893 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.400855 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4"]
Apr 20 20:36:20.401356 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.401340 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a518fbed-cd34-47b8-8cfd-82217f9c49e2" containerName="sequence-graph-fe410"
Apr 20 20:36:20.401398 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.401359 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a518fbed-cd34-47b8-8cfd-82217f9c49e2" containerName="sequence-graph-fe410"
Apr 20 20:36:20.401472 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.401461 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a518fbed-cd34-47b8-8cfd-82217f9c49e2" containerName="sequence-graph-fe410"
Apr 20 20:36:20.405673 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.405657 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4"
Apr 20 20:36:20.407745 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.407724 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-60a5c-serving-cert\""
Apr 20 20:36:20.407854 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.407728 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-60a5c-kube-rbac-proxy-sar-config\""
Apr 20 20:36:20.410663 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.410638 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4"]
Apr 20 20:36:20.463399 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.463369 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fe3de8b-9f73-4927-8756-d868d7db6842-proxy-tls\") pod \"switch-graph-60a5c-64b647654d-2knr4\" (UID: \"2fe3de8b-9f73-4927-8756-d868d7db6842\") " pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4"
Apr 20 20:36:20.463546 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.463408 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe3de8b-9f73-4927-8756-d868d7db6842-openshift-service-ca-bundle\") pod \"switch-graph-60a5c-64b647654d-2knr4\" (UID: \"2fe3de8b-9f73-4927-8756-d868d7db6842\") " pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4"
Apr 20 20:36:20.564119 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.564086 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fe3de8b-9f73-4927-8756-d868d7db6842-proxy-tls\") pod \"switch-graph-60a5c-64b647654d-2knr4\" (UID: \"2fe3de8b-9f73-4927-8756-d868d7db6842\") " pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4"
Apr 20 20:36:20.564274 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.564125 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe3de8b-9f73-4927-8756-d868d7db6842-openshift-service-ca-bundle\") pod \"switch-graph-60a5c-64b647654d-2knr4\" (UID: \"2fe3de8b-9f73-4927-8756-d868d7db6842\") " pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4"
Apr 20 20:36:20.564843 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.564819 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe3de8b-9f73-4927-8756-d868d7db6842-openshift-service-ca-bundle\") pod \"switch-graph-60a5c-64b647654d-2knr4\" (UID: \"2fe3de8b-9f73-4927-8756-d868d7db6842\") " pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4"
Apr 20 20:36:20.566613 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.566594 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fe3de8b-9f73-4927-8756-d868d7db6842-proxy-tls\") pod \"switch-graph-60a5c-64b647654d-2knr4\" (UID: \"2fe3de8b-9f73-4927-8756-d868d7db6842\") " pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4"
Apr 20 20:36:20.688600 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:36:20.688562 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode972aafe_9982_429a_8ec8_39b81b77fc55.slice/crio-471459171cd5731dd71eb9554d834810b022f855884f44d48d8660b34bead68c\": RecentStats: unable to find data in memory cache]"
Apr 20 20:36:20.716717 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.716690 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4"
Apr 20 20:36:20.838608 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.838586 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"
Apr 20 20:36:20.851837 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.851811 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4"]
Apr 20 20:36:20.854386 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:36:20.854353 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fe3de8b_9f73_4927_8756_d868d7db6842.slice/crio-7559a08e43a876e4c566b47e48240cf2161b861e523196de8c14a8a75145eee1 WatchSource:0}: Error finding container 7559a08e43a876e4c566b47e48240cf2161b861e523196de8c14a8a75145eee1: Status 404 returned error can't find the container with id 7559a08e43a876e4c566b47e48240cf2161b861e523196de8c14a8a75145eee1
Apr 20 20:36:20.866108 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.866084 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e972aafe-9982-429a-8ec8-39b81b77fc55-proxy-tls\") pod \"e972aafe-9982-429a-8ec8-39b81b77fc55\" (UID: \"e972aafe-9982-429a-8ec8-39b81b77fc55\") "
Apr 20 20:36:20.866215 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.866199 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e972aafe-9982-429a-8ec8-39b81b77fc55-openshift-service-ca-bundle\") pod \"e972aafe-9982-429a-8ec8-39b81b77fc55\" (UID: \"e972aafe-9982-429a-8ec8-39b81b77fc55\") "
Apr 20 20:36:20.866611 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.866574 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e972aafe-9982-429a-8ec8-39b81b77fc55-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "e972aafe-9982-429a-8ec8-39b81b77fc55" (UID: "e972aafe-9982-429a-8ec8-39b81b77fc55"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:36:20.868346 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.868324 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e972aafe-9982-429a-8ec8-39b81b77fc55-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e972aafe-9982-429a-8ec8-39b81b77fc55" (UID: "e972aafe-9982-429a-8ec8-39b81b77fc55"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:36:20.967358 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.967284 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e972aafe-9982-429a-8ec8-39b81b77fc55-proxy-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\""
Apr 20 20:36:20.967358 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:20.967313 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e972aafe-9982-429a-8ec8-39b81b77fc55-openshift-service-ca-bundle\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\""
Apr 20 20:36:21.068498 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:21.068463 2574 generic.go:358] "Generic (PLEG): container finished" podID="e972aafe-9982-429a-8ec8-39b81b77fc55" containerID="7902d20a22adc5223258b919a3ea90d9164e76bd8d74c5409f6223c2ca8d44a3" exitCode=0
Apr 20 20:36:21.068669 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:21.068531 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"
Apr 20 20:36:21.068669 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:21.068554 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5" event={"ID":"e972aafe-9982-429a-8ec8-39b81b77fc55","Type":"ContainerDied","Data":"7902d20a22adc5223258b919a3ea90d9164e76bd8d74c5409f6223c2ca8d44a3"}
Apr 20 20:36:21.068669 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:21.068599 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5" event={"ID":"e972aafe-9982-429a-8ec8-39b81b77fc55","Type":"ContainerDied","Data":"471459171cd5731dd71eb9554d834810b022f855884f44d48d8660b34bead68c"}
Apr 20 20:36:21.068669 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:21.068619 2574 scope.go:117] "RemoveContainer" containerID="7902d20a22adc5223258b919a3ea90d9164e76bd8d74c5409f6223c2ca8d44a3"
Apr 20 20:36:21.070121 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:21.070091 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" event={"ID":"2fe3de8b-9f73-4927-8756-d868d7db6842","Type":"ContainerStarted","Data":"f23717e77633a047e4b242eaa88c59ddce71712603e636bccd41eae393f58071"}
Apr 20 20:36:21.070240 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:21.070124 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" event={"ID":"2fe3de8b-9f73-4927-8756-d868d7db6842","Type":"ContainerStarted","Data":"7559a08e43a876e4c566b47e48240cf2161b861e523196de8c14a8a75145eee1"}
Apr 20 20:36:21.070301 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:21.070258 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4"
Apr 20 20:36:21.077197 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:21.077152 2574 
scope.go:117] "RemoveContainer" containerID="7902d20a22adc5223258b919a3ea90d9164e76bd8d74c5409f6223c2ca8d44a3" Apr 20 20:36:21.077397 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:36:21.077381 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7902d20a22adc5223258b919a3ea90d9164e76bd8d74c5409f6223c2ca8d44a3\": container with ID starting with 7902d20a22adc5223258b919a3ea90d9164e76bd8d74c5409f6223c2ca8d44a3 not found: ID does not exist" containerID="7902d20a22adc5223258b919a3ea90d9164e76bd8d74c5409f6223c2ca8d44a3" Apr 20 20:36:21.077441 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:21.077405 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7902d20a22adc5223258b919a3ea90d9164e76bd8d74c5409f6223c2ca8d44a3"} err="failed to get container status \"7902d20a22adc5223258b919a3ea90d9164e76bd8d74c5409f6223c2ca8d44a3\": rpc error: code = NotFound desc = could not find container \"7902d20a22adc5223258b919a3ea90d9164e76bd8d74c5409f6223c2ca8d44a3\": container with ID starting with 7902d20a22adc5223258b919a3ea90d9164e76bd8d74c5409f6223c2ca8d44a3 not found: ID does not exist" Apr 20 20:36:21.086684 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:21.086645 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" podStartSLOduration=1.086633421 podStartE2EDuration="1.086633421s" podCreationTimestamp="2026-04-20 20:36:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:36:21.085447213 +0000 UTC m=+1863.209140304" watchObservedRunningTime="2026-04-20 20:36:21.086633421 +0000 UTC m=+1863.210326547" Apr 20 20:36:21.096347 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:21.096325 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"] Apr 20 20:36:21.099756 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:21.099730 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ab6b8-54bf769bd5-jnmp5"] Apr 20 20:36:22.465296 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:22.465263 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e972aafe-9982-429a-8ec8-39b81b77fc55" path="/var/lib/kubelet/pods/e972aafe-9982-429a-8ec8-39b81b77fc55/volumes" Apr 20 20:36:27.079414 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:27.079341 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" Apr 20 20:36:50.900746 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:50.900712 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj"] Apr 20 20:36:50.901203 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:50.901098 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e972aafe-9982-429a-8ec8-39b81b77fc55" containerName="splitter-graph-ab6b8" Apr 20 20:36:50.901203 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:50.901112 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e972aafe-9982-429a-8ec8-39b81b77fc55" containerName="splitter-graph-ab6b8" Apr 20 20:36:50.901203 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:50.901186 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="e972aafe-9982-429a-8ec8-39b81b77fc55" containerName="splitter-graph-ab6b8" Apr 20 20:36:50.904104 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:50.904088 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" Apr 20 20:36:50.906396 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:50.906367 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-57557-kube-rbac-proxy-sar-config\"" Apr 20 20:36:50.906543 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:50.906411 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-57557-serving-cert\"" Apr 20 20:36:50.911633 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:50.911608 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj"] Apr 20 20:36:51.026729 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:51.026690 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/324cf3c9-ff6e-4742-bfe0-eb1d008344c2-openshift-service-ca-bundle\") pod \"splitter-graph-57557-86766cd968-2dzqj\" (UID: \"324cf3c9-ff6e-4742-bfe0-eb1d008344c2\") " pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" Apr 20 20:36:51.026914 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:51.026805 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/324cf3c9-ff6e-4742-bfe0-eb1d008344c2-proxy-tls\") pod \"splitter-graph-57557-86766cd968-2dzqj\" (UID: \"324cf3c9-ff6e-4742-bfe0-eb1d008344c2\") " pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" Apr 20 20:36:51.128126 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:51.128088 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/324cf3c9-ff6e-4742-bfe0-eb1d008344c2-proxy-tls\") pod \"splitter-graph-57557-86766cd968-2dzqj\" (UID: 
\"324cf3c9-ff6e-4742-bfe0-eb1d008344c2\") " pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" Apr 20 20:36:51.128308 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:51.128134 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/324cf3c9-ff6e-4742-bfe0-eb1d008344c2-openshift-service-ca-bundle\") pod \"splitter-graph-57557-86766cd968-2dzqj\" (UID: \"324cf3c9-ff6e-4742-bfe0-eb1d008344c2\") " pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" Apr 20 20:36:51.128308 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:36:51.128244 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-57557-serving-cert: secret "splitter-graph-57557-serving-cert" not found Apr 20 20:36:51.128419 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:36:51.128338 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/324cf3c9-ff6e-4742-bfe0-eb1d008344c2-proxy-tls podName:324cf3c9-ff6e-4742-bfe0-eb1d008344c2 nodeName:}" failed. No retries permitted until 2026-04-20 20:36:51.628321675 +0000 UTC m=+1893.752014741 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/324cf3c9-ff6e-4742-bfe0-eb1d008344c2-proxy-tls") pod "splitter-graph-57557-86766cd968-2dzqj" (UID: "324cf3c9-ff6e-4742-bfe0-eb1d008344c2") : secret "splitter-graph-57557-serving-cert" not found Apr 20 20:36:51.128731 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:51.128714 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/324cf3c9-ff6e-4742-bfe0-eb1d008344c2-openshift-service-ca-bundle\") pod \"splitter-graph-57557-86766cd968-2dzqj\" (UID: \"324cf3c9-ff6e-4742-bfe0-eb1d008344c2\") " pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" Apr 20 20:36:51.633270 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:51.633231 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/324cf3c9-ff6e-4742-bfe0-eb1d008344c2-proxy-tls\") pod \"splitter-graph-57557-86766cd968-2dzqj\" (UID: \"324cf3c9-ff6e-4742-bfe0-eb1d008344c2\") " pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" Apr 20 20:36:51.635644 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:51.635616 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/324cf3c9-ff6e-4742-bfe0-eb1d008344c2-proxy-tls\") pod \"splitter-graph-57557-86766cd968-2dzqj\" (UID: \"324cf3c9-ff6e-4742-bfe0-eb1d008344c2\") " pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" Apr 20 20:36:51.815442 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:51.815414 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" Apr 20 20:36:51.935987 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:51.935876 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj"] Apr 20 20:36:51.938710 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:36:51.938677 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod324cf3c9_ff6e_4742_bfe0_eb1d008344c2.slice/crio-27774e14392deacf6837a16bbf610c4094e616fcfc05b12b09aacc4721087e47 WatchSource:0}: Error finding container 27774e14392deacf6837a16bbf610c4094e616fcfc05b12b09aacc4721087e47: Status 404 returned error can't find the container with id 27774e14392deacf6837a16bbf610c4094e616fcfc05b12b09aacc4721087e47 Apr 20 20:36:52.166899 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:52.166815 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" event={"ID":"324cf3c9-ff6e-4742-bfe0-eb1d008344c2","Type":"ContainerStarted","Data":"e8e314bbc0b90f0a0e250e601ba2a4536143e5fa118c3c30f788e5e26d7223f5"} Apr 20 20:36:52.166899 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:52.166852 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" event={"ID":"324cf3c9-ff6e-4742-bfe0-eb1d008344c2","Type":"ContainerStarted","Data":"27774e14392deacf6837a16bbf610c4094e616fcfc05b12b09aacc4721087e47"} Apr 20 20:36:52.167110 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:52.166945 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" Apr 20 20:36:52.182000 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:52.181946 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" 
podStartSLOduration=2.181929488 podStartE2EDuration="2.181929488s" podCreationTimestamp="2026-04-20 20:36:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:36:52.181506553 +0000 UTC m=+1894.305199642" watchObservedRunningTime="2026-04-20 20:36:52.181929488 +0000 UTC m=+1894.305622568" Apr 20 20:36:58.175343 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:36:58.175316 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" Apr 20 20:40:18.490518 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:40:18.490485 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log" Apr 20 20:40:18.495457 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:40:18.495434 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log" Apr 20 20:40:18.495457 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:40:18.495444 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log" Apr 20 20:40:18.500393 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:40:18.500375 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log" Apr 20 20:45:05.473459 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:05.473424 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj"] Apr 20 20:45:05.475982 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:05.473645 2574 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" podUID="324cf3c9-ff6e-4742-bfe0-eb1d008344c2" containerName="splitter-graph-57557" containerID="cri-o://e8e314bbc0b90f0a0e250e601ba2a4536143e5fa118c3c30f788e5e26d7223f5" gracePeriod=30 Apr 20 20:45:08.173706 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:08.173669 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" podUID="324cf3c9-ff6e-4742-bfe0-eb1d008344c2" containerName="splitter-graph-57557" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:45:13.174450 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:13.174412 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" podUID="324cf3c9-ff6e-4742-bfe0-eb1d008344c2" containerName="splitter-graph-57557" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:45:18.174454 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:18.174411 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" podUID="324cf3c9-ff6e-4742-bfe0-eb1d008344c2" containerName="splitter-graph-57557" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:45:18.174846 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:18.174532 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" Apr 20 20:45:18.513226 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:18.513143 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log" Apr 20 20:45:18.518328 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:18.518306 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log" Apr 20 20:45:18.519792 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:18.519764 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log" Apr 20 20:45:18.524575 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:18.524553 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log" Apr 20 20:45:23.174553 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:23.174513 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" podUID="324cf3c9-ff6e-4742-bfe0-eb1d008344c2" containerName="splitter-graph-57557" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:45:28.174186 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:28.174104 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" podUID="324cf3c9-ff6e-4742-bfe0-eb1d008344c2" containerName="splitter-graph-57557" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:45:33.173939 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:33.173887 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" podUID="324cf3c9-ff6e-4742-bfe0-eb1d008344c2" containerName="splitter-graph-57557" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:45:35.609330 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.609307 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" Apr 20 20:45:35.672513 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.672477 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/324cf3c9-ff6e-4742-bfe0-eb1d008344c2-openshift-service-ca-bundle\") pod \"324cf3c9-ff6e-4742-bfe0-eb1d008344c2\" (UID: \"324cf3c9-ff6e-4742-bfe0-eb1d008344c2\") " Apr 20 20:45:35.672513 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.672514 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/324cf3c9-ff6e-4742-bfe0-eb1d008344c2-proxy-tls\") pod \"324cf3c9-ff6e-4742-bfe0-eb1d008344c2\" (UID: \"324cf3c9-ff6e-4742-bfe0-eb1d008344c2\") " Apr 20 20:45:35.672888 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.672864 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/324cf3c9-ff6e-4742-bfe0-eb1d008344c2-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "324cf3c9-ff6e-4742-bfe0-eb1d008344c2" (UID: "324cf3c9-ff6e-4742-bfe0-eb1d008344c2"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:45:35.674699 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.674673 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/324cf3c9-ff6e-4742-bfe0-eb1d008344c2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "324cf3c9-ff6e-4742-bfe0-eb1d008344c2" (UID: "324cf3c9-ff6e-4742-bfe0-eb1d008344c2"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:45:35.731221 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.731139 2574 generic.go:358] "Generic (PLEG): container finished" podID="324cf3c9-ff6e-4742-bfe0-eb1d008344c2" containerID="e8e314bbc0b90f0a0e250e601ba2a4536143e5fa118c3c30f788e5e26d7223f5" exitCode=0 Apr 20 20:45:35.731221 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.731187 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" event={"ID":"324cf3c9-ff6e-4742-bfe0-eb1d008344c2","Type":"ContainerDied","Data":"e8e314bbc0b90f0a0e250e601ba2a4536143e5fa118c3c30f788e5e26d7223f5"} Apr 20 20:45:35.731221 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.731212 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" Apr 20 20:45:35.731460 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.731241 2574 scope.go:117] "RemoveContainer" containerID="e8e314bbc0b90f0a0e250e601ba2a4536143e5fa118c3c30f788e5e26d7223f5" Apr 20 20:45:35.731460 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.731214 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj" event={"ID":"324cf3c9-ff6e-4742-bfe0-eb1d008344c2","Type":"ContainerDied","Data":"27774e14392deacf6837a16bbf610c4094e616fcfc05b12b09aacc4721087e47"} Apr 20 20:45:35.739955 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.739935 2574 scope.go:117] "RemoveContainer" containerID="e8e314bbc0b90f0a0e250e601ba2a4536143e5fa118c3c30f788e5e26d7223f5" Apr 20 20:45:35.740236 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:45:35.740217 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8e314bbc0b90f0a0e250e601ba2a4536143e5fa118c3c30f788e5e26d7223f5\": container with ID starting with 
e8e314bbc0b90f0a0e250e601ba2a4536143e5fa118c3c30f788e5e26d7223f5 not found: ID does not exist" containerID="e8e314bbc0b90f0a0e250e601ba2a4536143e5fa118c3c30f788e5e26d7223f5" Apr 20 20:45:35.740298 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.740246 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8e314bbc0b90f0a0e250e601ba2a4536143e5fa118c3c30f788e5e26d7223f5"} err="failed to get container status \"e8e314bbc0b90f0a0e250e601ba2a4536143e5fa118c3c30f788e5e26d7223f5\": rpc error: code = NotFound desc = could not find container \"e8e314bbc0b90f0a0e250e601ba2a4536143e5fa118c3c30f788e5e26d7223f5\": container with ID starting with e8e314bbc0b90f0a0e250e601ba2a4536143e5fa118c3c30f788e5e26d7223f5 not found: ID does not exist" Apr 20 20:45:35.751927 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.751899 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj"] Apr 20 20:45:35.765746 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.765713 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-57557-86766cd968-2dzqj"] Apr 20 20:45:35.773595 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.773575 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/324cf3c9-ff6e-4742-bfe0-eb1d008344c2-openshift-service-ca-bundle\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:45:35.773667 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:35.773599 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/324cf3c9-ff6e-4742-bfe0-eb1d008344c2-proxy-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:45:36.465523 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:45:36.465480 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="324cf3c9-ff6e-4742-bfe0-eb1d008344c2" path="/var/lib/kubelet/pods/324cf3c9-ff6e-4742-bfe0-eb1d008344c2/volumes" Apr 20 20:50:18.539855 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:50:18.539827 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log" Apr 20 20:50:18.544407 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:50:18.544384 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log" Apr 20 20:50:18.544784 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:50:18.544768 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log" Apr 20 20:50:18.549264 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:50:18.549248 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log" Apr 20 20:52:39.885290 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:39.885259 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4"] Apr 20 20:52:39.885781 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:39.885513 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" podUID="2fe3de8b-9f73-4927-8756-d868d7db6842" containerName="switch-graph-60a5c" containerID="cri-o://f23717e77633a047e4b242eaa88c59ddce71712603e636bccd41eae393f58071" gracePeriod=30 Apr 20 20:52:40.842540 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:40.842504 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k2gzf/must-gather-wm9lc"] Apr 20 20:52:40.842850 ip-10-0-131-234 
kubenswrapper[2574]: I0420 20:52:40.842838 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="324cf3c9-ff6e-4742-bfe0-eb1d008344c2" containerName="splitter-graph-57557" Apr 20 20:52:40.842902 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:40.842852 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="324cf3c9-ff6e-4742-bfe0-eb1d008344c2" containerName="splitter-graph-57557" Apr 20 20:52:40.842957 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:40.842915 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="324cf3c9-ff6e-4742-bfe0-eb1d008344c2" containerName="splitter-graph-57557" Apr 20 20:52:40.845996 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:40.845979 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k2gzf/must-gather-wm9lc" Apr 20 20:52:40.848515 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:40.848481 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k2gzf\"/\"kube-root-ca.crt\"" Apr 20 20:52:40.848515 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:40.848490 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k2gzf\"/\"openshift-service-ca.crt\"" Apr 20 20:52:40.848696 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:40.848494 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-k2gzf\"/\"default-dockercfg-jh8fb\"" Apr 20 20:52:40.863380 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:40.863351 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k2gzf/must-gather-wm9lc"] Apr 20 20:52:40.924094 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:40.924067 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28b69\" (UniqueName: 
\"kubernetes.io/projected/1259d5c2-b22d-41ee-a5fe-0f9f99248a97-kube-api-access-28b69\") pod \"must-gather-wm9lc\" (UID: \"1259d5c2-b22d-41ee-a5fe-0f9f99248a97\") " pod="openshift-must-gather-k2gzf/must-gather-wm9lc" Apr 20 20:52:40.924430 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:40.924113 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1259d5c2-b22d-41ee-a5fe-0f9f99248a97-must-gather-output\") pod \"must-gather-wm9lc\" (UID: \"1259d5c2-b22d-41ee-a5fe-0f9f99248a97\") " pod="openshift-must-gather-k2gzf/must-gather-wm9lc" Apr 20 20:52:41.025046 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:41.025011 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28b69\" (UniqueName: \"kubernetes.io/projected/1259d5c2-b22d-41ee-a5fe-0f9f99248a97-kube-api-access-28b69\") pod \"must-gather-wm9lc\" (UID: \"1259d5c2-b22d-41ee-a5fe-0f9f99248a97\") " pod="openshift-must-gather-k2gzf/must-gather-wm9lc" Apr 20 20:52:41.025179 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:41.025068 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1259d5c2-b22d-41ee-a5fe-0f9f99248a97-must-gather-output\") pod \"must-gather-wm9lc\" (UID: \"1259d5c2-b22d-41ee-a5fe-0f9f99248a97\") " pod="openshift-must-gather-k2gzf/must-gather-wm9lc" Apr 20 20:52:41.025332 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:41.025317 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1259d5c2-b22d-41ee-a5fe-0f9f99248a97-must-gather-output\") pod \"must-gather-wm9lc\" (UID: \"1259d5c2-b22d-41ee-a5fe-0f9f99248a97\") " pod="openshift-must-gather-k2gzf/must-gather-wm9lc" Apr 20 20:52:41.033416 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:41.033390 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-28b69\" (UniqueName: \"kubernetes.io/projected/1259d5c2-b22d-41ee-a5fe-0f9f99248a97-kube-api-access-28b69\") pod \"must-gather-wm9lc\" (UID: \"1259d5c2-b22d-41ee-a5fe-0f9f99248a97\") " pod="openshift-must-gather-k2gzf/must-gather-wm9lc" Apr 20 20:52:41.167771 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:41.167699 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k2gzf/must-gather-wm9lc" Apr 20 20:52:41.285488 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:41.285464 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k2gzf/must-gather-wm9lc"] Apr 20 20:52:41.288307 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:52:41.288266 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1259d5c2_b22d_41ee_a5fe_0f9f99248a97.slice/crio-963d296924c66d5291098e8ecf7d17740d11ceaeb3106aa2b76968bd186d9fe8 WatchSource:0}: Error finding container 963d296924c66d5291098e8ecf7d17740d11ceaeb3106aa2b76968bd186d9fe8: Status 404 returned error can't find the container with id 963d296924c66d5291098e8ecf7d17740d11ceaeb3106aa2b76968bd186d9fe8 Apr 20 20:52:41.290003 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:41.289985 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:52:41.989830 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:41.989787 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k2gzf/must-gather-wm9lc" event={"ID":"1259d5c2-b22d-41ee-a5fe-0f9f99248a97","Type":"ContainerStarted","Data":"963d296924c66d5291098e8ecf7d17740d11ceaeb3106aa2b76968bd186d9fe8"} Apr 20 20:52:42.079542 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:42.079497 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" 
podUID="2fe3de8b-9f73-4927-8756-d868d7db6842" containerName="switch-graph-60a5c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:52:46.007272 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:46.005322 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k2gzf/must-gather-wm9lc" event={"ID":"1259d5c2-b22d-41ee-a5fe-0f9f99248a97","Type":"ContainerStarted","Data":"2fde6546544e56006db061cbaa4ec2f0dc26655de6b4a67fa367a737f9f6c16e"} Apr 20 20:52:46.007272 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:46.005391 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k2gzf/must-gather-wm9lc" event={"ID":"1259d5c2-b22d-41ee-a5fe-0f9f99248a97","Type":"ContainerStarted","Data":"c625aeb98e48340143f118157c8065696937ef5ba7167ef929dcd2c5d705f641"} Apr 20 20:52:46.021963 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:46.021913 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k2gzf/must-gather-wm9lc" podStartSLOduration=1.93213935 podStartE2EDuration="6.02189697s" podCreationTimestamp="2026-04-20 20:52:40 +0000 UTC" firstStartedPulling="2026-04-20 20:52:41.290163829 +0000 UTC m=+2843.413856895" lastFinishedPulling="2026-04-20 20:52:45.379921446 +0000 UTC m=+2847.503614515" observedRunningTime="2026-04-20 20:52:46.019442985 +0000 UTC m=+2848.143136072" watchObservedRunningTime="2026-04-20 20:52:46.02189697 +0000 UTC m=+2848.145590059" Apr 20 20:52:47.078776 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:47.078738 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" podUID="2fe3de8b-9f73-4927-8756-d868d7db6842" containerName="switch-graph-60a5c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:52:52.078126 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:52.078079 2574 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" podUID="2fe3de8b-9f73-4927-8756-d868d7db6842" containerName="switch-graph-60a5c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:52:52.078513 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:52.078202 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" Apr 20 20:52:54.371989 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:54.371956 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-60a5c-64b647654d-2knr4_2fe3de8b-9f73-4927-8756-d868d7db6842/switch-graph-60a5c/0.log" Apr 20 20:52:55.169621 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:55.169589 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-60a5c-64b647654d-2knr4_2fe3de8b-9f73-4927-8756-d868d7db6842/switch-graph-60a5c/0.log" Apr 20 20:52:55.938805 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:55.938763 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-60a5c-64b647654d-2knr4_2fe3de8b-9f73-4927-8756-d868d7db6842/switch-graph-60a5c/0.log" Apr 20 20:52:56.701075 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:56.701047 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-60a5c-64b647654d-2knr4_2fe3de8b-9f73-4927-8756-d868d7db6842/switch-graph-60a5c/0.log" Apr 20 20:52:57.078124 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:57.078081 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" podUID="2fe3de8b-9f73-4927-8756-d868d7db6842" containerName="switch-graph-60a5c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:52:57.458199 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:57.458110 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_switch-graph-60a5c-64b647654d-2knr4_2fe3de8b-9f73-4927-8756-d868d7db6842/switch-graph-60a5c/0.log" Apr 20 20:52:58.198314 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:58.198277 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-60a5c-64b647654d-2knr4_2fe3de8b-9f73-4927-8756-d868d7db6842/switch-graph-60a5c/0.log" Apr 20 20:52:58.964350 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:58.964321 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-60a5c-64b647654d-2knr4_2fe3de8b-9f73-4927-8756-d868d7db6842/switch-graph-60a5c/0.log" Apr 20 20:52:59.751394 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:52:59.751353 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-60a5c-64b647654d-2knr4_2fe3de8b-9f73-4927-8756-d868d7db6842/switch-graph-60a5c/0.log" Apr 20 20:53:00.531901 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:00.531868 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-60a5c-64b647654d-2knr4_2fe3de8b-9f73-4927-8756-d868d7db6842/switch-graph-60a5c/0.log" Apr 20 20:53:01.306480 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:01.306449 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-60a5c-64b647654d-2knr4_2fe3de8b-9f73-4927-8756-d868d7db6842/switch-graph-60a5c/0.log" Apr 20 20:53:02.077962 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:02.077926 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" podUID="2fe3de8b-9f73-4927-8756-d868d7db6842" containerName="switch-graph-60a5c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:53:02.086413 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:02.086383 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_switch-graph-60a5c-64b647654d-2knr4_2fe3de8b-9f73-4927-8756-d868d7db6842/switch-graph-60a5c/0.log" Apr 20 20:53:02.934361 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:02.934332 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-60a5c-64b647654d-2knr4_2fe3de8b-9f73-4927-8756-d868d7db6842/switch-graph-60a5c/0.log" Apr 20 20:53:04.067389 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:04.067360 2574 generic.go:358] "Generic (PLEG): container finished" podID="1259d5c2-b22d-41ee-a5fe-0f9f99248a97" containerID="c625aeb98e48340143f118157c8065696937ef5ba7167ef929dcd2c5d705f641" exitCode=0 Apr 20 20:53:04.067890 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:04.067441 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k2gzf/must-gather-wm9lc" event={"ID":"1259d5c2-b22d-41ee-a5fe-0f9f99248a97","Type":"ContainerDied","Data":"c625aeb98e48340143f118157c8065696937ef5ba7167ef929dcd2c5d705f641"} Apr 20 20:53:04.067890 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:04.067817 2574 scope.go:117] "RemoveContainer" containerID="c625aeb98e48340143f118157c8065696937ef5ba7167ef929dcd2c5d705f641" Apr 20 20:53:04.658740 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:04.658711 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k2gzf_must-gather-wm9lc_1259d5c2-b22d-41ee-a5fe-0f9f99248a97/gather/0.log" Apr 20 20:53:07.078215 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:07.078161 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" podUID="2fe3de8b-9f73-4927-8756-d868d7db6842" containerName="switch-graph-60a5c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:53:07.965210 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:07.965180 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-r5zhr_abe21bdd-fc08-4152-a25c-837a2c251a36/global-pull-secret-syncer/0.log" Apr 20 20:53:08.035926 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:08.035891 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-q6qs6_09d0b535-704a-4945-9235-0ddeba8ad00c/konnectivity-agent/0.log" Apr 20 20:53:08.117545 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:08.117510 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-234.ec2.internal_f8094a2875efe3934931c2701094dd6e/haproxy/0.log" Apr 20 20:53:10.045551 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.045526 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" Apr 20 20:53:10.086750 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.086721 2574 generic.go:358] "Generic (PLEG): container finished" podID="2fe3de8b-9f73-4927-8756-d868d7db6842" containerID="f23717e77633a047e4b242eaa88c59ddce71712603e636bccd41eae393f58071" exitCode=137 Apr 20 20:53:10.086911 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.086781 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" Apr 20 20:53:10.086911 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.086805 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" event={"ID":"2fe3de8b-9f73-4927-8756-d868d7db6842","Type":"ContainerDied","Data":"f23717e77633a047e4b242eaa88c59ddce71712603e636bccd41eae393f58071"} Apr 20 20:53:10.086911 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.086852 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4" event={"ID":"2fe3de8b-9f73-4927-8756-d868d7db6842","Type":"ContainerDied","Data":"7559a08e43a876e4c566b47e48240cf2161b861e523196de8c14a8a75145eee1"} Apr 20 20:53:10.086911 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.086872 2574 scope.go:117] "RemoveContainer" containerID="f23717e77633a047e4b242eaa88c59ddce71712603e636bccd41eae393f58071" Apr 20 20:53:10.094962 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.094943 2574 scope.go:117] "RemoveContainer" containerID="f23717e77633a047e4b242eaa88c59ddce71712603e636bccd41eae393f58071" Apr 20 20:53:10.095253 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:53:10.095227 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f23717e77633a047e4b242eaa88c59ddce71712603e636bccd41eae393f58071\": container with ID starting with f23717e77633a047e4b242eaa88c59ddce71712603e636bccd41eae393f58071 not found: ID does not exist" containerID="f23717e77633a047e4b242eaa88c59ddce71712603e636bccd41eae393f58071" Apr 20 20:53:10.095351 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.095266 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23717e77633a047e4b242eaa88c59ddce71712603e636bccd41eae393f58071"} err="failed to get container status 
\"f23717e77633a047e4b242eaa88c59ddce71712603e636bccd41eae393f58071\": rpc error: code = NotFound desc = could not find container \"f23717e77633a047e4b242eaa88c59ddce71712603e636bccd41eae393f58071\": container with ID starting with f23717e77633a047e4b242eaa88c59ddce71712603e636bccd41eae393f58071 not found: ID does not exist" Apr 20 20:53:10.101822 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.101785 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k2gzf/must-gather-wm9lc"] Apr 20 20:53:10.102527 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.102061 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-k2gzf/must-gather-wm9lc" podUID="1259d5c2-b22d-41ee-a5fe-0f9f99248a97" containerName="copy" containerID="cri-o://2fde6546544e56006db061cbaa4ec2f0dc26655de6b4a67fa367a737f9f6c16e" gracePeriod=2 Apr 20 20:53:10.103882 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.103854 2574 status_manager.go:895] "Failed to get status for pod" podUID="1259d5c2-b22d-41ee-a5fe-0f9f99248a97" pod="openshift-must-gather-k2gzf/must-gather-wm9lc" err="pods \"must-gather-wm9lc\" is forbidden: User \"system:node:ip-10-0-131-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-k2gzf\": no relationship found between node 'ip-10-0-131-234.ec2.internal' and this object" Apr 20 20:53:10.104228 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.104207 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k2gzf/must-gather-wm9lc"] Apr 20 20:53:10.182875 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.182806 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe3de8b-9f73-4927-8756-d868d7db6842-openshift-service-ca-bundle\") pod \"2fe3de8b-9f73-4927-8756-d868d7db6842\" (UID: \"2fe3de8b-9f73-4927-8756-d868d7db6842\") " 
Apr 20 20:53:10.182999 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.182897 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fe3de8b-9f73-4927-8756-d868d7db6842-proxy-tls\") pod \"2fe3de8b-9f73-4927-8756-d868d7db6842\" (UID: \"2fe3de8b-9f73-4927-8756-d868d7db6842\") " Apr 20 20:53:10.183237 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.183213 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fe3de8b-9f73-4927-8756-d868d7db6842-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "2fe3de8b-9f73-4927-8756-d868d7db6842" (UID: "2fe3de8b-9f73-4927-8756-d868d7db6842"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:53:10.185009 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.184993 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe3de8b-9f73-4927-8756-d868d7db6842-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2fe3de8b-9f73-4927-8756-d868d7db6842" (UID: "2fe3de8b-9f73-4927-8756-d868d7db6842"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:53:10.283608 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.283578 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fe3de8b-9f73-4927-8756-d868d7db6842-proxy-tls\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:53:10.283608 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.283606 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe3de8b-9f73-4927-8756-d868d7db6842-openshift-service-ca-bundle\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:53:10.321105 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.321080 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k2gzf_must-gather-wm9lc_1259d5c2-b22d-41ee-a5fe-0f9f99248a97/copy/0.log" Apr 20 20:53:10.321493 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.321475 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k2gzf/must-gather-wm9lc" Apr 20 20:53:10.323487 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.323457 2574 status_manager.go:895] "Failed to get status for pod" podUID="1259d5c2-b22d-41ee-a5fe-0f9f99248a97" pod="openshift-must-gather-k2gzf/must-gather-wm9lc" err="pods \"must-gather-wm9lc\" is forbidden: User \"system:node:ip-10-0-131-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-k2gzf\": no relationship found between node 'ip-10-0-131-234.ec2.internal' and this object" Apr 20 20:53:10.396212 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.396183 2574 status_manager.go:895] "Failed to get status for pod" podUID="1259d5c2-b22d-41ee-a5fe-0f9f99248a97" pod="openshift-must-gather-k2gzf/must-gather-wm9lc" err="pods \"must-gather-wm9lc\" is forbidden: User \"system:node:ip-10-0-131-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-k2gzf\": no relationship found between node 'ip-10-0-131-234.ec2.internal' and this object" Apr 20 20:53:10.407365 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.407338 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4"] Apr 20 20:53:10.411075 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.411029 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-60a5c-64b647654d-2knr4"] Apr 20 20:53:10.471611 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.471539 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fe3de8b-9f73-4927-8756-d868d7db6842" path="/var/lib/kubelet/pods/2fe3de8b-9f73-4927-8756-d868d7db6842/volumes" Apr 20 20:53:10.485557 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.485520 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28b69\" (UniqueName: 
\"kubernetes.io/projected/1259d5c2-b22d-41ee-a5fe-0f9f99248a97-kube-api-access-28b69\") pod \"1259d5c2-b22d-41ee-a5fe-0f9f99248a97\" (UID: \"1259d5c2-b22d-41ee-a5fe-0f9f99248a97\") " Apr 20 20:53:10.485677 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.485570 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1259d5c2-b22d-41ee-a5fe-0f9f99248a97-must-gather-output\") pod \"1259d5c2-b22d-41ee-a5fe-0f9f99248a97\" (UID: \"1259d5c2-b22d-41ee-a5fe-0f9f99248a97\") " Apr 20 20:53:10.487053 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.487007 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1259d5c2-b22d-41ee-a5fe-0f9f99248a97-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1259d5c2-b22d-41ee-a5fe-0f9f99248a97" (UID: "1259d5c2-b22d-41ee-a5fe-0f9f99248a97"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:53:10.487804 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.487781 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1259d5c2-b22d-41ee-a5fe-0f9f99248a97-kube-api-access-28b69" (OuterVolumeSpecName: "kube-api-access-28b69") pod "1259d5c2-b22d-41ee-a5fe-0f9f99248a97" (UID: "1259d5c2-b22d-41ee-a5fe-0f9f99248a97"). InnerVolumeSpecName "kube-api-access-28b69". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:53:10.586462 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.586415 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-28b69\" (UniqueName: \"kubernetes.io/projected/1259d5c2-b22d-41ee-a5fe-0f9f99248a97-kube-api-access-28b69\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:53:10.586462 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:10.586461 2574 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1259d5c2-b22d-41ee-a5fe-0f9f99248a97-must-gather-output\") on node \"ip-10-0-131-234.ec2.internal\" DevicePath \"\"" Apr 20 20:53:11.092070 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:11.092024 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k2gzf_must-gather-wm9lc_1259d5c2-b22d-41ee-a5fe-0f9f99248a97/copy/0.log" Apr 20 20:53:11.092436 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:11.092370 2574 generic.go:358] "Generic (PLEG): container finished" podID="1259d5c2-b22d-41ee-a5fe-0f9f99248a97" containerID="2fde6546544e56006db061cbaa4ec2f0dc26655de6b4a67fa367a737f9f6c16e" exitCode=143 Apr 20 20:53:11.092504 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:11.092447 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k2gzf/must-gather-wm9lc" Apr 20 20:53:11.092504 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:11.092467 2574 scope.go:117] "RemoveContainer" containerID="2fde6546544e56006db061cbaa4ec2f0dc26655de6b4a67fa367a737f9f6c16e" Apr 20 20:53:11.100631 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:11.100614 2574 scope.go:117] "RemoveContainer" containerID="c625aeb98e48340143f118157c8065696937ef5ba7167ef929dcd2c5d705f641" Apr 20 20:53:11.112494 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:11.112474 2574 scope.go:117] "RemoveContainer" containerID="2fde6546544e56006db061cbaa4ec2f0dc26655de6b4a67fa367a737f9f6c16e" Apr 20 20:53:11.112761 ip-10-0-131-234 kubenswrapper[2574]: E0420 20:53:11.112742 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fde6546544e56006db061cbaa4ec2f0dc26655de6b4a67fa367a737f9f6c16e\": container with ID starting with 2fde6546544e56006db061cbaa4ec2f0dc26655de6b4a67fa367a737f9f6c16e not found: ID does not exist" containerID="2fde6546544e56006db061cbaa4ec2f0dc26655de6b4a67fa367a737f9f6c16e" Apr 20 20:53:11.112808 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:11.112771 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fde6546544e56006db061cbaa4ec2f0dc26655de6b4a67fa367a737f9f6c16e"} err="failed to get container status \"2fde6546544e56006db061cbaa4ec2f0dc26655de6b4a67fa367a737f9f6c16e\": rpc error: code = NotFound desc = could not find container \"2fde6546544e56006db061cbaa4ec2f0dc26655de6b4a67fa367a737f9f6c16e\": container with ID starting with 2fde6546544e56006db061cbaa4ec2f0dc26655de6b4a67fa367a737f9f6c16e not found: ID does not exist" Apr 20 20:53:11.112808 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:11.112789 2574 scope.go:117] "RemoveContainer" containerID="c625aeb98e48340143f118157c8065696937ef5ba7167ef929dcd2c5d705f641" Apr 20 20:53:11.113171 
ip-10-0-131-234 kubenswrapper[2574]: E0420 20:53:11.113145 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c625aeb98e48340143f118157c8065696937ef5ba7167ef929dcd2c5d705f641\": container with ID starting with c625aeb98e48340143f118157c8065696937ef5ba7167ef929dcd2c5d705f641 not found: ID does not exist" containerID="c625aeb98e48340143f118157c8065696937ef5ba7167ef929dcd2c5d705f641" Apr 20 20:53:11.113227 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:11.113180 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c625aeb98e48340143f118157c8065696937ef5ba7167ef929dcd2c5d705f641"} err="failed to get container status \"c625aeb98e48340143f118157c8065696937ef5ba7167ef929dcd2c5d705f641\": rpc error: code = NotFound desc = could not find container \"c625aeb98e48340143f118157c8065696937ef5ba7167ef929dcd2c5d705f641\": container with ID starting with c625aeb98e48340143f118157c8065696937ef5ba7167ef929dcd2c5d705f641 not found: ID does not exist" Apr 20 20:53:11.702784 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:11.702712 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-fkttm_5999adb7-d895-4660-8bf8-546e2d8dd27c/cluster-monitoring-operator/0.log" Apr 20 20:53:11.810008 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:11.809985 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-68c68f5db6-2x85q_53b5cab7-44b2-44ea-a7a8-e00157572b77/metrics-server/0.log" Apr 20 20:53:11.836750 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:11.836729 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-rbkcf_915408af-5ee3-4c92-a8f5-4cf9059a0be9/monitoring-plugin/0.log" Apr 20 20:53:12.004246 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.004175 2574 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_node-exporter-n59wp_c8d21f1e-73fa-43c3-aec2-17f03a870896/node-exporter/0.log" Apr 20 20:53:12.024275 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.024252 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n59wp_c8d21f1e-73fa-43c3-aec2-17f03a870896/kube-rbac-proxy/0.log" Apr 20 20:53:12.045469 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.045447 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n59wp_c8d21f1e-73fa-43c3-aec2-17f03a870896/init-textfile/0.log" Apr 20 20:53:12.157013 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.156986 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6fb70db0-02ba-449a-b860-74bc0fe90c9d/prometheus/0.log" Apr 20 20:53:12.175918 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.175891 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6fb70db0-02ba-449a-b860-74bc0fe90c9d/config-reloader/0.log" Apr 20 20:53:12.198169 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.198150 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6fb70db0-02ba-449a-b860-74bc0fe90c9d/thanos-sidecar/0.log" Apr 20 20:53:12.225182 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.225159 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6fb70db0-02ba-449a-b860-74bc0fe90c9d/kube-rbac-proxy-web/0.log" Apr 20 20:53:12.244614 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.244597 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6fb70db0-02ba-449a-b860-74bc0fe90c9d/kube-rbac-proxy/0.log" Apr 20 20:53:12.264794 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.264776 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6fb70db0-02ba-449a-b860-74bc0fe90c9d/kube-rbac-proxy-thanos/0.log" Apr 20 20:53:12.286570 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.286546 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6fb70db0-02ba-449a-b860-74bc0fe90c9d/init-config-reloader/0.log" Apr 20 20:53:12.313356 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.313333 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-zbp64_73a10b51-1fea-4aab-81d5-a8e232d4623b/prometheus-operator/0.log" Apr 20 20:53:12.331747 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.331728 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-zbp64_73a10b51-1fea-4aab-81d5-a8e232d4623b/kube-rbac-proxy/0.log" Apr 20 20:53:12.380360 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.380335 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f69c67d6d-ghrlk_c28d92ab-9264-413d-b83c-a088f773f9d1/telemeter-client/0.log" Apr 20 20:53:12.401894 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.401873 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f69c67d6d-ghrlk_c28d92ab-9264-413d-b83c-a088f773f9d1/reload/0.log" Apr 20 20:53:12.425283 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.425262 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f69c67d6d-ghrlk_c28d92ab-9264-413d-b83c-a088f773f9d1/kube-rbac-proxy/0.log" Apr 20 20:53:12.452718 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.452698 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b8bf87fb7-rhsfh_5371b507-255d-419d-a381-5da1d311fb71/thanos-query/0.log" Apr 20 20:53:12.465582 ip-10-0-131-234 kubenswrapper[2574]: I0420 
20:53:12.465557 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1259d5c2-b22d-41ee-a5fe-0f9f99248a97" path="/var/lib/kubelet/pods/1259d5c2-b22d-41ee-a5fe-0f9f99248a97/volumes"
Apr 20 20:53:12.476487 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.476472 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b8bf87fb7-rhsfh_5371b507-255d-419d-a381-5da1d311fb71/kube-rbac-proxy-web/0.log"
Apr 20 20:53:12.497916 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.497889 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b8bf87fb7-rhsfh_5371b507-255d-419d-a381-5da1d311fb71/kube-rbac-proxy/0.log"
Apr 20 20:53:12.517836 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.517779 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b8bf87fb7-rhsfh_5371b507-255d-419d-a381-5da1d311fb71/prom-label-proxy/0.log"
Apr 20 20:53:12.538844 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.538828 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b8bf87fb7-rhsfh_5371b507-255d-419d-a381-5da1d311fb71/kube-rbac-proxy-rules/0.log"
Apr 20 20:53:12.566342 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:12.566310 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b8bf87fb7-rhsfh_5371b507-255d-419d-a381-5da1d311fb71/kube-rbac-proxy-metrics/0.log"
Apr 20 20:53:14.126944 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:14.126903 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/2.log"
Apr 20 20:53:14.130583 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:14.130561 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rxlpd_ee1fb92b-6a4c-4495-93a7-29206e6b8642/console-operator/3.log"
Apr 20 20:53:14.871749 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:14.871719 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-56p98_b227ca2d-a313-4e30-ab0f-03bda0c2db1f/volume-data-source-validator/0.log"
Apr 20 20:53:14.995734 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:14.995698 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"]
Apr 20 20:53:14.996024 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:14.996012 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fe3de8b-9f73-4927-8756-d868d7db6842" containerName="switch-graph-60a5c"
Apr 20 20:53:14.996098 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:14.996025 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe3de8b-9f73-4927-8756-d868d7db6842" containerName="switch-graph-60a5c"
Apr 20 20:53:14.996098 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:14.996054 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1259d5c2-b22d-41ee-a5fe-0f9f99248a97" containerName="copy"
Apr 20 20:53:14.996098 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:14.996061 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1259d5c2-b22d-41ee-a5fe-0f9f99248a97" containerName="copy"
Apr 20 20:53:14.996098 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:14.996068 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1259d5c2-b22d-41ee-a5fe-0f9f99248a97" containerName="gather"
Apr 20 20:53:14.996098 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:14.996073 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1259d5c2-b22d-41ee-a5fe-0f9f99248a97" containerName="gather"
Apr 20 20:53:14.996251 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:14.996124 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fe3de8b-9f73-4927-8756-d868d7db6842" containerName="switch-graph-60a5c"
Apr 20 20:53:14.996251 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:14.996134 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1259d5c2-b22d-41ee-a5fe-0f9f99248a97" containerName="gather"
Apr 20 20:53:14.996251 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:14.996142 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1259d5c2-b22d-41ee-a5fe-0f9f99248a97" containerName="copy"
Apr 20 20:53:15.003611 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.003586 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.005931 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.005909 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2h9np\"/\"kube-root-ca.crt\""
Apr 20 20:53:15.006076 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.005909 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2h9np\"/\"openshift-service-ca.crt\""
Apr 20 20:53:15.006076 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.005913 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2h9np\"/\"default-dockercfg-9jbc7\""
Apr 20 20:53:15.010935 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.010904 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"]
Apr 20 20:53:15.120882 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.120849 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a-podres\") pod \"perf-node-gather-daemonset-xdwgx\" (UID: \"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.120882 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.120882 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a-sys\") pod \"perf-node-gather-daemonset-xdwgx\" (UID: \"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.121112 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.120914 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a-lib-modules\") pod \"perf-node-gather-daemonset-xdwgx\" (UID: \"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.121112 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.120986 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a-proc\") pod \"perf-node-gather-daemonset-xdwgx\" (UID: \"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.121112 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.121083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzjfs\" (UniqueName: \"kubernetes.io/projected/e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a-kube-api-access-nzjfs\") pod \"perf-node-gather-daemonset-xdwgx\" (UID: \"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.222199 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.222124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a-podres\") pod \"perf-node-gather-daemonset-xdwgx\" (UID: \"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.222199 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.222156 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a-sys\") pod \"perf-node-gather-daemonset-xdwgx\" (UID: \"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.222199 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.222191 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a-lib-modules\") pod \"perf-node-gather-daemonset-xdwgx\" (UID: \"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.222595 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.222214 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a-proc\") pod \"perf-node-gather-daemonset-xdwgx\" (UID: \"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.222595 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.222248 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzjfs\" (UniqueName: \"kubernetes.io/projected/e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a-kube-api-access-nzjfs\") pod \"perf-node-gather-daemonset-xdwgx\" (UID: \"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.222595 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.222256 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a-sys\") pod \"perf-node-gather-daemonset-xdwgx\" (UID: \"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.222595 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.222300 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a-podres\") pod \"perf-node-gather-daemonset-xdwgx\" (UID: \"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.222595 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.222321 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a-proc\") pod \"perf-node-gather-daemonset-xdwgx\" (UID: \"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.222595 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.222376 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a-lib-modules\") pod \"perf-node-gather-daemonset-xdwgx\" (UID: \"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.229446 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.229425 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzjfs\" (UniqueName: \"kubernetes.io/projected/e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a-kube-api-access-nzjfs\") pod \"perf-node-gather-daemonset-xdwgx\" (UID: \"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.314427 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.314398 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:15.435156 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.435124 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"]
Apr 20 20:53:15.437848 ip-10-0-131-234 kubenswrapper[2574]: W0420 20:53:15.437817 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode6d1fbed_0a3c_45ac_bcee_a0e8d4f09e7a.slice/crio-75a4440e2961af3e3ebcb4b7c89c500335efbb9b8d88c9f9602bc340f690c2e3 WatchSource:0}: Error finding container 75a4440e2961af3e3ebcb4b7c89c500335efbb9b8d88c9f9602bc340f690c2e3: Status 404 returned error can't find the container with id 75a4440e2961af3e3ebcb4b7c89c500335efbb9b8d88c9f9602bc340f690c2e3
Apr 20 20:53:15.543960 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.543938 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bh6cg_1c420267-955c-479f-93c5-f3be116a6270/dns/0.log"
Apr 20 20:53:15.564057 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.564023 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bh6cg_1c420267-955c-479f-93c5-f3be116a6270/kube-rbac-proxy/0.log"
Apr 20 20:53:15.671248 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:15.671225 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-d4mxm_ebb6ab8a-7ed0-48ce-b19d-ba7a095c2e28/dns-node-resolver/0.log"
Apr 20 20:53:16.110632 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:16.110600 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx" event={"ID":"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a","Type":"ContainerStarted","Data":"3eb86bb0ea9b16327ea2af1895c0de387620864ec8d317aabf088211f6cac8ba"}
Apr 20 20:53:16.110632 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:16.110634 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx" event={"ID":"e6d1fbed-0a3c-45ac-bcee-a0e8d4f09e7a","Type":"ContainerStarted","Data":"75a4440e2961af3e3ebcb4b7c89c500335efbb9b8d88c9f9602bc340f690c2e3"}
Apr 20 20:53:16.110855 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:16.110724 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:16.124480 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:16.124429 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx" podStartSLOduration=2.124415423 podStartE2EDuration="2.124415423s" podCreationTimestamp="2026-04-20 20:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:53:16.123715304 +0000 UTC m=+2878.247408419" watchObservedRunningTime="2026-04-20 20:53:16.124415423 +0000 UTC m=+2878.248108510"
Apr 20 20:53:16.154775 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:16.154752 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ssfnj_86459264-fd91-425e-8338-70b56d469a74/node-ca/0.log"
Apr 20 20:53:17.214900 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:17.214856 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mwhhz_da8e59f9-df8a-4e18-98ec-09373ec8bee1/serve-healthcheck-canary/0.log"
Apr 20 20:53:17.553203 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:17.553173 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-74t9l_78447207-5d22-46b9-9ad3-a68cd998c91a/insights-operator/0.log"
Apr 20 20:53:17.553549 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:17.553535 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-74t9l_78447207-5d22-46b9-9ad3-a68cd998c91a/insights-operator/1.log"
Apr 20 20:53:17.574637 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:17.574617 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jpmmm_5df7a791-739d-44ea-ba16-2093a320d5dd/kube-rbac-proxy/0.log"
Apr 20 20:53:17.594192 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:17.594169 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jpmmm_5df7a791-739d-44ea-ba16-2093a320d5dd/exporter/0.log"
Apr 20 20:53:17.614015 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:17.613993 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jpmmm_5df7a791-739d-44ea-ba16-2093a320d5dd/extractor/0.log"
Apr 20 20:53:19.928741 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:19.928713 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-vsmvr_821b92e8-06a5-445e-8eef-c473c8b4846d/s3-init/0.log"
Apr 20 20:53:22.123973 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:22.123948 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-xdwgx"
Apr 20 20:53:23.411379 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:23.411347 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-8d89v_89657514-cc9b-40ee-80ca-4a2b6be50dc3/migrator/0.log"
Apr 20 20:53:23.430266 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:23.430241 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-8d89v_89657514-cc9b-40ee-80ca-4a2b6be50dc3/graceful-termination/0.log"
Apr 20 20:53:24.544926 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:24.544899 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4wcrc_f679dfd5-8c86-42f8-823c-e2c7b58decdf/kube-multus/0.log"
Apr 20 20:53:24.797742 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:24.797717 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lcb2v_d62d535b-7b78-4f80-8451-fabdfce754d7/kube-multus-additional-cni-plugins/0.log"
Apr 20 20:53:24.822362 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:24.822340 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lcb2v_d62d535b-7b78-4f80-8451-fabdfce754d7/egress-router-binary-copy/0.log"
Apr 20 20:53:24.842984 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:24.842965 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lcb2v_d62d535b-7b78-4f80-8451-fabdfce754d7/cni-plugins/0.log"
Apr 20 20:53:24.863169 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:24.863144 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lcb2v_d62d535b-7b78-4f80-8451-fabdfce754d7/bond-cni-plugin/0.log"
Apr 20 20:53:24.889969 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:24.889946 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lcb2v_d62d535b-7b78-4f80-8451-fabdfce754d7/routeoverride-cni/0.log"
Apr 20 20:53:24.913867 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:24.913847 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lcb2v_d62d535b-7b78-4f80-8451-fabdfce754d7/whereabouts-cni-bincopy/0.log"
Apr 20 20:53:24.936213 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:24.936193 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lcb2v_d62d535b-7b78-4f80-8451-fabdfce754d7/whereabouts-cni/0.log"
Apr 20 20:53:25.223178 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:25.223154 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wktd8_184c92c6-a188-47c2-acbf-e9fe477d6c13/network-metrics-daemon/0.log"
Apr 20 20:53:25.249454 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:25.249432 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wktd8_184c92c6-a188-47c2-acbf-e9fe477d6c13/kube-rbac-proxy/0.log"
Apr 20 20:53:26.066274 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:26.066241 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-controller/0.log"
Apr 20 20:53:26.086045 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:26.086008 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/0.log"
Apr 20 20:53:26.097680 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:26.097653 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovn-acl-logging/1.log"
Apr 20 20:53:26.115106 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:26.115071 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/kube-rbac-proxy-node/0.log"
Apr 20 20:53:26.134644 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:26.134623 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 20:53:26.152066 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:26.152021 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/northd/0.log"
Apr 20 20:53:26.170315 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:26.170287 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/nbdb/0.log"
Apr 20 20:53:26.188492 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:26.188470 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/sbdb/0.log"
Apr 20 20:53:26.286614 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:26.286582 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6wvn_eff2e20b-0b3f-4623-b2e9-68404cf5689f/ovnkube-controller/0.log"
Apr 20 20:53:27.885409 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:27.885384 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-5kgnf_655b7db6-852f-4d19-9975-31ad69976609/network-check-target-container/0.log"
Apr 20 20:53:28.740395 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:28.740368 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-d4qvq_bf1e798d-74ef-4682-bd04-15da759fea59/iptables-alerter/0.log"
Apr 20 20:53:29.326801 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:29.326606 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-j65bj_0ce4f429-5df3-4576-bea7-50ab8358d9f7/tuned/0.log"
Apr 20 20:53:32.119375 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:32.119345 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-df6jv_86474c1c-4590-4d77-a856-e6d9ef5228d4/service-ca-controller/0.log"
Apr 20 20:53:32.462366 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:32.462293 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-c9rks_e19165a1-00ed-47ec-bfb7-f7b723ee12ac/csi-driver/0.log"
Apr 20 20:53:32.485325 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:32.485293 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-c9rks_e19165a1-00ed-47ec-bfb7-f7b723ee12ac/csi-node-driver-registrar/0.log"
Apr 20 20:53:32.509455 ip-10-0-131-234 kubenswrapper[2574]: I0420 20:53:32.509429 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-c9rks_e19165a1-00ed-47ec-bfb7-f7b723ee12ac/csi-liveness-probe/0.log"