Apr 24 14:23:56.666651 ip-10-0-131-216 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 14:23:57.106380 ip-10-0-131-216 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:23:57.106380 ip-10-0-131-216 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 14:23:57.106380 ip-10-0-131-216 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:23:57.106380 ip-10-0-131-216 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 14:23:57.106380 ip-10-0-131-216 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
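The deprecation warnings above point at the kubelet's config file rather than command-line flags. A minimal sketch of the equivalent KubeletConfiguration stanza follows; the field names are from the upstream Kubernetes KubeletConfiguration API, but every value shown here is an illustrative assumption, not taken from this node:

```yaml
# Sketch of /etc/kubernetes/kubelet.conf (the file passed via --config).
# Values are hypothetical examples, not this node's actual settings.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
# replaces --minimum-container-ttl-duration (use eviction thresholds instead)
evictionHard:
  memory.available: 100Mi
```

Note that --pod-infra-container-image has no config-file equivalent here; per the warning later in this log, the sandbox image should instead be set in the remote runtime (CRI-O in this case).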
Apr 24 14:23:57.109573 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.109452 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 14:23:57.112686 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112671 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:23:57.112686 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112686 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112690 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112693 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112696 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112700 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112702 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112706 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112709 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112712 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112714 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112717 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112720 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112722 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112725 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112729 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112732 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112734 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112737 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112739 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:23:57.112749 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112742 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112744 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112747 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112749 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112752 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112755 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112757 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112760 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112763 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112766 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112769 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112772 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112776 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112779 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112782 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112785 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112787 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112790 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112792 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112794 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:23:57.113200 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112797 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112799 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112802 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112804 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112808 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112811 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112814 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112817 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112820 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112823 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112825 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112828 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112831 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112833 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112837 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112840 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112843 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112845 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112848 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:23:57.113897 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112850 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112853 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112856 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112859 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112862 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112864 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112867 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112869 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112872 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112875 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112877 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112879 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112882 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112884 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112887 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112889 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112892 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112894 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112898 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112901 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:23:57.114490 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112904 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:23:57.114986 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112906 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:23:57.114986 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112909 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:23:57.114986 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112912 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:23:57.114986 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112915 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:23:57.114986 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112918 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:23:57.114986 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.112921 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:23:57.115756 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115737 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:23:57.115793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115757 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:23:57.115793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115762 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:23:57.115793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115767 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:23:57.115793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115771 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:23:57.115793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115776 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:23:57.115793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115781 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:23:57.115793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115785 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:23:57.115793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115790 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:23:57.115793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115794 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115799 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115803 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115807 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115811 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115816 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115819 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115823 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115828 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115832 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115836 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115840 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115844 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115849 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115852 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115856 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115861 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115865 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115869 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115873 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:23:57.116019 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115878 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115882 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115885 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115888 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115890 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115893 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115896 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115899 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115902 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115904 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115907 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115910 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115914 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115917 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115921 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115925 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115928 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115930 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115933 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115936 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:23:57.116524 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115938 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115941 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115944 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115946 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115949 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115952 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115954 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115957 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115959 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115962 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115964 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115967 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115970 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115972 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115975 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115978 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115982 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115986 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115989 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:23:57.117020 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115992 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115995 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.115998 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.116000 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.116003 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.116006 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.116008 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.116011 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.116014 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.116017 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.116020 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.116023 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.116027 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.116029 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.116032 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.116035 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.116038 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.116040 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117105 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117115 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 14:23:57.117549 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117122 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117126 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117131 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117135 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117139 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117144 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117147 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117151 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117154 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117157 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117160 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117163 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117166 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117169 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117173 2574 flags.go:64] FLAG: --cloud-config=""
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117176 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117179 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117184 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117186 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117190 2574 flags.go:64] FLAG: --config-dir=""
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117193 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117197 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117201 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117205 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 14:23:57.118048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117209 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117212 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117215 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117218 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117221 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117225 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117228 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117232 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117235 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117238 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117241 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117244 2574 flags.go:64] FLAG: --enable-server="true"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117247 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117251 2574 flags.go:64] FLAG: --event-burst="100"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117255 2574 flags.go:64] FLAG: --event-qps="50"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117258 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117261 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117264 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117268 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117271 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117273 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117277 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117280 2574
flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117283 2574 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117285 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 14:23:57.118643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117289 2574 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117292 2574 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117295 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117298 2574 flags.go:64] FLAG: --feature-gates="" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117302 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117306 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117309 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117312 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117315 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117318 2574 flags.go:64] FLAG: --help="false" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117321 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-131-216.ec2.internal" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117324 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 14:23:57.119241 
ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117327 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117330 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117334 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117337 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117340 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117343 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117346 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117348 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117351 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117354 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117357 2574 flags.go:64] FLAG: --kube-reserved="" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117360 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 14:23:57.119241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117363 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117366 2574 flags.go:64] 
FLAG: --kubelet-cgroups="" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117369 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117372 2574 flags.go:64] FLAG: --lock-file="" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117374 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117381 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117384 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117389 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117404 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117408 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117412 2574 flags.go:64] FLAG: --logging-format="text" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117415 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117418 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117421 2574 flags.go:64] FLAG: --manifest-url="" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117425 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117429 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 14:23:57.119874 ip-10-0-131-216 
kubenswrapper[2574]: I0424 14:23:57.117433 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117437 2574 flags.go:64] FLAG: --max-pods="110" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117440 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117443 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117446 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117449 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117452 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117455 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117458 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 14:23:57.119874 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117466 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117470 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117473 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117476 2574 flags.go:64] FLAG: --pod-cidr="" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117479 2574 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117484 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117487 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117490 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117493 2574 flags.go:64] FLAG: --port="10250" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117496 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117499 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b246135c25e71a19" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117502 2574 flags.go:64] FLAG: --qos-reserved="" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117505 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117510 2574 flags.go:64] FLAG: --register-node="true" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117513 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117516 2574 flags.go:64] FLAG: --register-with-taints="" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117519 2574 flags.go:64] FLAG: --registry-burst="10" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117522 2574 flags.go:64] FLAG: --registry-qps="5" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117525 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 24 
14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117528 2574 flags.go:64] FLAG: --reserved-memory="" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117532 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117534 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117537 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117541 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117544 2574 flags.go:64] FLAG: --runonce="false" Apr 24 14:23:57.120517 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117547 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117550 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117553 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117556 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117559 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117563 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117566 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117569 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 
14:23:57.117572 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117575 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117578 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117581 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117584 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117587 2574 flags.go:64] FLAG: --system-cgroups="" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117589 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117595 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117598 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117601 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117606 2574 flags.go:64] FLAG: --tls-min-version="" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117609 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117612 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117615 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117618 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 
14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117621 2574 flags.go:64] FLAG: --v="2" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117625 2574 flags.go:64] FLAG: --version="false" Apr 24 14:23:57.121130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117629 2574 flags.go:64] FLAG: --vmodule="" Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117634 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.117637 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117723 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117727 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117730 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117732 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117735 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117738 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117741 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117743 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 14:23:57.121747 ip-10-0-131-216 
kubenswrapper[2574]: W0424 14:23:57.117746 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117748 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117751 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117754 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117756 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117759 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117763 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117767 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 14:23:57.121747 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117770 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117772 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117776 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117779 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117782 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117785 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117788 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117790 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117793 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117796 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117798 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117800 2574 
feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117803 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117806 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117808 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117811 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117814 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117818 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117820 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117823 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 14:23:57.122219 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117826 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117828 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117831 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117833 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations 
Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117836 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117838 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117841 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117843 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117846 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117848 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117851 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117853 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117856 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117858 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117861 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117863 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117866 2574 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117868 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117871 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117874 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 14:23:57.122793 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117877 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117879 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117882 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117884 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117887 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117889 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117892 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117894 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117897 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 14:23:57.123304 ip-10-0-131-216 
kubenswrapper[2574]: W0424 14:23:57.117899 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117902 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117912 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117915 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117917 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117921 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117923 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117926 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117928 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117931 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117933 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 14:23:57.123304 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117936 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 14:23:57.123794 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117938 2574 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks
Apr 24 14:23:57.123794 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117941 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:23:57.123794 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117943 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:23:57.123794 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117945 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:23:57.123794 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117948 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:23:57.123794 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117950 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:23:57.123794 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117955 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:23:57.123794 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117958 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:23:57.123794 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.117960 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:23:57.123794 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.118996 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:23:57.126930 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.126904 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 14:23:57.126930 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.126929 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 14:23:57.127009 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.126993 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:23:57.127009 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.126999 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:23:57.127009 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127002 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:23:57.127009 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127005 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:23:57.127009 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127008 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:23:57.127009 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127011 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127014 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127017 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127020 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127022 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127025 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127027 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127030 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127033 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127035 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127038 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127041 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127044 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127046 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127049 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127051 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127054 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127057 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127059 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127061 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:23:57.127162 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127064 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127067 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127069 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127072 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127074 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127077 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127080 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127083 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127085 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127088 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127091 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127094 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127096 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127098 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127101 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127104 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127107 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127110 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127112 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127115 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:23:57.127679 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127118 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127120 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127123 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127125 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127128 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127131 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127133 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127136 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127138 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127142 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127144 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127147 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127149 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127152 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127155 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127158 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127161 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127163 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127166 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127169 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:23:57.128160 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127172 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127175 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127178 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127180 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127185 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127189 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127192 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127195 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127198 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127201 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127203 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127206 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127208 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127211 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127213 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127216 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127218 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127221 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127224 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:23:57.128671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127226 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:23:57.129135 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127230 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:23:57.129135 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.127235 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:23:57.129135 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127328 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:23:57.129135 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127331 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:23:57.129135 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127335 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:23:57.129135 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127338 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:23:57.129135 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127341 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:23:57.129135 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127344 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:23:57.129135 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127346 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:23:57.129135 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127349 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:23:57.129135 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127351 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:23:57.129135 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127354 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:23:57.129135 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127357 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:23:57.129135 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127360 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:23:57.129135 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127363 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127367 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127371 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127374 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127377 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127380 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127384 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127386 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127389 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127404 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127407 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127410 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127412 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127415 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127417 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127420 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127423 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127425 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127428 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:23:57.129551 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127431 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127433 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127436 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127439 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127441 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127444 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127447 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127449 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127452 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127454 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127457 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127459 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127463 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127465 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127468 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127471 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127474 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127477 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127479 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127482 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:23:57.130021 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127484 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127487 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127489 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127491 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127494 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127497 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127499 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127501 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127504 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127507 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127509 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127512 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127514 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127516 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127519 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127522 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127524 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127527 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127529 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127532 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:23:57.130530 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127534 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:23:57.131013 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127537 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:23:57.131013 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127539 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:23:57.131013 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127542 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:23:57.131013 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127544 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:23:57.131013 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127548 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:23:57.131013 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127551 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:23:57.131013 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127554 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:23:57.131013 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127557 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:23:57.131013 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127560 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:23:57.131013 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127562 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:23:57.131013 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127565 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:23:57.131013 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127567 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:23:57.131013 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127570 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:23:57.131013 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:57.127572 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:23:57.131013 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.127577 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:23:57.131407 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.128337 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 14:23:57.131607 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.131594 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 14:23:57.132546 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.132535 2574 server.go:1019] "Starting client certificate rotation"
Apr 24 14:23:57.132642 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.132626 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 14:23:57.132672 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.132665 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 14:23:57.157006 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.156986 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 14:23:57.159445 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.159425 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 14:23:57.173793 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.173771 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 24 14:23:57.181753 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.181737 2574 log.go:25] "Validated CRI v1 image API"
Apr 24 14:23:57.183581 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.183567 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 14:23:57.188431 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.188411 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 14:23:57.188623 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.188604 2574 fs.go:135] Filesystem UUIDs: map[1bbff9e8-63c9-4183-8cfb-6c856140dbb2:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c4a38924-15be-4b00-b1a5-8fd922a2d6c3:/dev/nvme0n1p3]
Apr 24 14:23:57.188658 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.188625 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 14:23:57.194567 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.194461 2574 manager.go:217] Machine: {Timestamp:2026-04-24 14:23:57.192694482 +0000 UTC m=+0.404545849 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098298 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec283e1c2e0374daf1f2d561afa15603 SystemUUID:ec283e1c-2e03-74da-f1f2-d561afa15603 BootID:3d807602-bd5c-45e9-bcab-e975f6234ab3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6a:8b:87:dd:ff Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6a:8b:87:dd:ff Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fa:68:0d:a9:0a:25 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 14:23:57.194567 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.194557 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 14:23:57.194694 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.194664 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 14:23:57.195700 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.195674 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 14:23:57.195838 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.195701 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-216.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 14:23:57.195879 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.195847 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 14:23:57.195879 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.195856 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 14:23:57.195879 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.195873 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 14:23:57.196540 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.196529 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 14:23:57.197295 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.197286 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 14:23:57.197558 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.197549 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 14:23:57.200769 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.200759 2574 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 14:23:57.200800 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.200773 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 14:23:57.200800 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.200786 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 14:23:57.200800 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.200795 2574 kubelet.go:397] "Adding apiserver pod source"
Apr 24 14:23:57.200885 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.200803 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 14:23:57.201930 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.201918 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 14:23:57.201978 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.201938 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 14:23:57.206060 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.206038 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 14:23:57.207229 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.207214 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 14:23:57.210681 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.210659 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 14:23:57.210681 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.210677 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 14:23:57.210681 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.210685 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 14:23:57.210863 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.210691 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 14:23:57.210863 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.210699 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 14:23:57.210863 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.210708 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 14:23:57.210863 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.210716 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 14:23:57.210863 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.210722 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 14:23:57.210863 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.210729 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 14:23:57.210863 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.210735 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 14:23:57.210863 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.210753 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 14:23:57.210863 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.210762 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 14:23:57.212438 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.212427 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 14:23:57.212438 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.212437 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 14:23:57.212618 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.212603 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8ngql"
Apr 24 14:23:57.213685 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.213657 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 14:23:57.213808 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.213725 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-216.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 14:23:57.215921 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.215907 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 14:23:57.216002 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.215941 2574 server.go:1295] "Started kubelet"
Apr 24 14:23:57.216071 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.216017 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 14:23:57.216103 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.216029 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 14:23:57.216131 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.216107 2574 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 14:23:57.216700 ip-10-0-131-216 systemd[1]: Started Kubernetes Kubelet.
Apr 24 14:23:57.218046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.217880 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 14:23:57.219158 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.219138 2574 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 14:23:57.219419 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.219384 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8ngql"
Apr 24 14:23:57.220794 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.220773 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-216.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 14:23:57.224465 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.224445 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 14:23:57.224553 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.224462 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 14:23:57.225052 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.225030 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 14:23:57.225143 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.225128 2574 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 14:23:57.225143 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.225144 2574 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 14:23:57.225277 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.225264 2574 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 14:23:57.225331 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.225278 2574 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 14:23:57.225371 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.225320 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found"
Apr 24 14:23:57.225529 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.225510 2574 factory.go:55] Registering systemd factory
Apr 24 14:23:57.225588 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.225579 2574 factory.go:223] Registration of the systemd container factory successfully
Apr 24 14:23:57.226551 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.226493 2574 factory.go:153] Registering CRI-O factory
Apr 24 14:23:57.226551 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.226521 2574 factory.go:223] Registration of the crio container factory successfully
Apr 24 14:23:57.226688 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.226619 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 14:23:57.226688 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.226647 2574 factory.go:103] Registering Raw factory
Apr 24 14:23:57.226688 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.226664 2574 manager.go:1196] Started watching for new ooms in manager
Apr 24 14:23:57.227097 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.227046 2574 manager.go:319] Starting recovery of all containers
Apr 24 14:23:57.228609 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.228556 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 14:23:57.232722 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.232681 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 14:23:57.234830 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.234814 2574 manager.go:324] Recovery completed
Apr 24 14:23:57.236365 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.236343 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:23:57.239838 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.239826 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:23:57.241121 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.241103 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-216.ec2.internal\" not found" node="ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.242026 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.242013 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 14:23:57.242087 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.242037 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 14:23:57.242087 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.242047 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientPID"
Apr 24 14:23:57.242523 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.242506 2574 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 14:23:57.242523 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.242518 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 14:23:57.242629 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.242533 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 14:23:57.244608 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.244596 2574 policy_none.go:49] "None policy: Start"
Apr 24 14:23:57.244608 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.244611 2574 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 14:23:57.244700 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.244620 2574 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 14:23:57.280915 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.280901 2574 manager.go:341] "Starting Device Plugin manager"
Apr 24 14:23:57.290927 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.280967 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 14:23:57.290927 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.280980 2574 server.go:85] "Starting device plugin registration server"
Apr 24 14:23:57.290927 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.281187 2574 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 14:23:57.290927 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.281197 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 14:23:57.290927 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.281282 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 14:23:57.290927 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.281351 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 14:23:57.290927 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.281357 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 14:23:57.290927 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.282838 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 14:23:57.290927 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.282887 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-216.ec2.internal\" not found"
Apr 24 14:23:57.353996 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.353973 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 14:23:57.354112 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.354004 2574 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 14:23:57.354112 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.354018 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 14:23:57.354112 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.354024 2574 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 14:23:57.354112 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.354061 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 14:23:57.356630 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.356582 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:23:57.381818 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.381794 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:23:57.382598 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.382583 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 14:23:57.382673 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.382609 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 14:23:57.382673 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.382620 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientPID"
Apr 24 14:23:57.382673 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.382643 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.391271 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.391255 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.391351 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.391280 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-216.ec2.internal\": node \"ip-10-0-131-216.ec2.internal\" not found"
Apr 24 14:23:57.409097 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.409075 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found"
Apr 24 14:23:57.454596 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.454574 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal"]
Apr 24 14:23:57.454678 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.454640 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:23:57.456032 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.456018 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 14:23:57.456118 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.456047 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 14:23:57.456118 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.456061 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientPID"
Apr 24 14:23:57.457064 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.457050 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:23:57.457191 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.457176 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.457240 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.457204 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:23:57.457692 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.457678 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 14:23:57.457769 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.457702 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 14:23:57.457769 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.457718 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientPID"
Apr 24 14:23:57.457769 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.457758 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 14:23:57.457923 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.457780 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 14:23:57.457923 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.457796 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientPID"
Apr 24 14:23:57.458812 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.458797 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.458891 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.458828 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:23:57.459570 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.459554 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 14:23:57.459665 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.459581 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 14:23:57.459665 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.459590 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeHasSufficientPID"
Apr 24 14:23:57.488012 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.487993 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-216.ec2.internal\" not found" node="ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.492469 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.492456 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-216.ec2.internal\" not found" node="ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.509609 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.509591 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found"
Apr 24 14:23:57.526891 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.526875 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d10d73caf323405e00defaf97a76c78f-config\") pod \"kube-apiserver-proxy-ip-10-0-131-216.ec2.internal\" (UID: \"d10d73caf323405e00defaf97a76c78f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.526959 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.526896 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/be5075edcaa05a69fffbb6ddcf4dd3b2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal\" (UID: \"be5075edcaa05a69fffbb6ddcf4dd3b2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.526959 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.526914 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be5075edcaa05a69fffbb6ddcf4dd3b2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal\" (UID: \"be5075edcaa05a69fffbb6ddcf4dd3b2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.610038 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.609967 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found"
Apr 24 14:23:57.627344 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.627330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d10d73caf323405e00defaf97a76c78f-config\") pod \"kube-apiserver-proxy-ip-10-0-131-216.ec2.internal\" (UID: \"d10d73caf323405e00defaf97a76c78f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.627435 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.627355 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/be5075edcaa05a69fffbb6ddcf4dd3b2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal\" (UID: \"be5075edcaa05a69fffbb6ddcf4dd3b2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.627435 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.627375 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be5075edcaa05a69fffbb6ddcf4dd3b2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal\" (UID: \"be5075edcaa05a69fffbb6ddcf4dd3b2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.627511 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.627433 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/be5075edcaa05a69fffbb6ddcf4dd3b2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal\" (UID: \"be5075edcaa05a69fffbb6ddcf4dd3b2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.627511 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.627491 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be5075edcaa05a69fffbb6ddcf4dd3b2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal\" (UID: \"be5075edcaa05a69fffbb6ddcf4dd3b2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.627580 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.627511 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d10d73caf323405e00defaf97a76c78f-config\") pod \"kube-apiserver-proxy-ip-10-0-131-216.ec2.internal\" (UID: \"d10d73caf323405e00defaf97a76c78f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.710724 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.710704 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found"
Apr 24 14:23:57.790270 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.790240 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.794870 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:57.794849 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal"
Apr 24 14:23:57.811297 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.811272 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found"
Apr 24 14:23:57.911828 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:57.911766 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found"
Apr 24 14:23:58.012305 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:58.012278 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found"
Apr 24 14:23:58.112798 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:58.112765 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found"
Apr 24 14:23:58.132283 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.132258 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 14:23:58.132432 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.132414 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 14:23:58.132478 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.132422 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 14:23:58.213895 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:58.213852 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found"
Apr 24 14:23:58.221175 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.221143 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 14:18:57 +0000 UTC" deadline="2027-11-02 20:51:31.811362787 +0000 UTC"
Apr 24 14:23:58.221175 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.221173 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13374h27m33.590192562s"
Apr 24 14:23:58.224952 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.224934 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 14:23:58.237322 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.237300 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 14:23:58.254924 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.254905 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-tm48d"
Apr 24 14:23:58.258865 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:58.258838 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd10d73caf323405e00defaf97a76c78f.slice/crio-d8d566b6ab47b5f27e41ff4a06df8d7681a76b71e22d01cae39430498fe60150 WatchSource:0}: Error finding container d8d566b6ab47b5f27e41ff4a06df8d7681a76b71e22d01cae39430498fe60150: Status 404 returned error can't find the container with id d8d566b6ab47b5f27e41ff4a06df8d7681a76b71e22d01cae39430498fe60150
Apr 24 14:23:58.259113 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:58.259095 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe5075edcaa05a69fffbb6ddcf4dd3b2.slice/crio-1b0c4a555ccb73469fce44ea95f63d977c6ee6d7a5b119284b0d3c43b25ce237 WatchSource:0}: Error finding container 1b0c4a555ccb73469fce44ea95f63d977c6ee6d7a5b119284b0d3c43b25ce237: Status 404 returned error can't find the container with id 1b0c4a555ccb73469fce44ea95f63d977c6ee6d7a5b119284b0d3c43b25ce237
Apr 24 14:23:58.262514 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.262487 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tm48d"
Apr 24 14:23:58.263002 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.262988 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 14:23:58.314250 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:58.314206 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found"
Apr 24 14:23:58.356369 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.356316 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal" event={"ID":"d10d73caf323405e00defaf97a76c78f","Type":"ContainerStarted","Data":"d8d566b6ab47b5f27e41ff4a06df8d7681a76b71e22d01cae39430498fe60150"}
Apr 24 14:23:58.357259 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.357239 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" event={"ID":"be5075edcaa05a69fffbb6ddcf4dd3b2","Type":"ContainerStarted","Data":"1b0c4a555ccb73469fce44ea95f63d977c6ee6d7a5b119284b0d3c43b25ce237"}
Apr 24 14:23:58.414359 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:58.414326 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found"
Apr 24 14:23:58.514793 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:58.514736 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-216.ec2.internal\" not found"
Apr 24 14:23:58.531882 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.531863 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:23:58.564255 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.564240 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:23:58.624974 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.624765 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal"
Apr 24 14:23:58.634865 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.634844 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 14:23:58.635832 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.635804 2574
kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal" Apr 24 14:23:58.642952 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:58.642879 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 14:23:59.201821 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.201786 2574 apiserver.go:52] "Watching apiserver" Apr 24 14:23:59.202458 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.202440 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:23:59.206667 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.206645 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 14:23:59.208069 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.208041 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-kdc8j","kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4","openshift-dns/node-resolver-p8qh5","openshift-image-registry/node-ca-95f8z","openshift-multus/multus-zxsvs","openshift-network-diagnostics/network-check-target-6cqpp","openshift-network-operator/iptables-alerter-7x8dx","openshift-cluster-node-tuning-operator/tuned-nqxtx","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal","openshift-multus/multus-additional-cni-plugins-8fqct","openshift-multus/network-metrics-daemon-n65kf","openshift-ovn-kubernetes/ovnkube-node-wbvmc"] Apr 24 14:23:59.209788 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.209730 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:23:59.210264 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:59.209881 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:23:59.211336 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.211314 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.212374 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.212358 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p8qh5" Apr 24 14:23:59.213090 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.213061 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 14:23:59.213182 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.213151 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 14:23:59.213261 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.213181 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 14:23:59.213323 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.213299 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-66tkf\"" Apr 24 14:23:59.213958 ip-10-0-131-216 kubenswrapper[2574]: I0424 
14:23:59.213930 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 14:23:59.214057 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.214038 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 14:23:59.214160 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.214144 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-hr775\"" Apr 24 14:23:59.215409 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.215372 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-95f8z" Apr 24 14:23:59.216657 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.216641 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.217076 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.217058 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 14:23:59.217164 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.217094 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 14:23:59.217164 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.217122 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-szvkl\"" Apr 24 14:23:59.217275 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.217095 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 14:23:59.217756 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.217739 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-kdc8j" Apr 24 14:23:59.218227 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.218210 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 14:23:59.218524 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.218505 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 14:23:59.218627 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.218594 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 14:23:59.219101 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.218798 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 14:23:59.219101 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.218916 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-7x8dx" Apr 24 14:23:59.219347 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.219330 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 14:23:59.219855 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.219832 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 14:23:59.219945 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.219843 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wmrp2\"" Apr 24 14:23:59.220012 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.219847 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-tjq9f\"" Apr 24 14:23:59.220599 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.220579 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:23:59.220688 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.220619 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 14:23:59.220765 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.220747 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 14:23:59.220825 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.220811 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kv8sp\"" Apr 24 14:23:59.221255 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.221239 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.221515 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.221496 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:23:59.222349 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.222331 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.223349 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.223330 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:23:59.223501 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.223484 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 14:23:59.223663 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.223642 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mbtvc\"" Apr 24 14:23:59.223753 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.223675 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:23:59.223810 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:59.223748 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:23:59.224179 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.223989 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 14:23:59.224179 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.224001 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 14:23:59.224179 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.224083 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mjlgd\"" Apr 24 14:23:59.225068 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.225049 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.226562 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.226449 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 14:23:59.226562 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.226524 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 14:23:59.226887 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.226871 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l5f74\"" Apr 24 14:23:59.227151 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.227135 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 14:23:59.227668 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.227648 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 14:23:59.227761 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.227730 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 14:23:59.227761 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.227748 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 14:23:59.227866 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.227764 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 14:23:59.237116 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237052 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.237205 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237132 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/52546bac-718f-4f97-8b34-9a2e8efca7e8-serviceca\") pod \"node-ca-95f8z\" (UID: \"52546bac-718f-4f97-8b34-9a2e8efca7e8\") " pod="openshift-image-registry/node-ca-95f8z" Apr 24 14:23:59.237205 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237167 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-multus-cni-dir\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 
14:23:59.237301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237205 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-var-lib-kubelet\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.237301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237235 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/51de9bc5-cce7-429e-881f-d12cdc08346f-host-slash\") pod \"iptables-alerter-7x8dx\" (UID: \"51de9bc5-cce7-429e-881f-d12cdc08346f\") " pod="openshift-network-operator/iptables-alerter-7x8dx" Apr 24 14:23:59.237301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237259 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-log-socket\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.237301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.238046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237322 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/a74b1a4d-a0a7-4742-a775-7a58e287b451-multus-daemon-config\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.238046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237353 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-cni-netd\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.238046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237381 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.238046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237428 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-registration-dir\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.238046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237454 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-sys-fs\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.238046 ip-10-0-131-216 
kubenswrapper[2574]: I0424 14:23:59.237482 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-modprobe-d\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.238046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237509 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-systemd\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.238046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237535 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-sys\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.238046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237564 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s6z4\" (UniqueName: \"kubernetes.io/projected/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-kube-api-access-8s6z4\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.238046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237590 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-cnibin\") pod \"multus-zxsvs\" (UID: 
\"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.238046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237627 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-etc-selinux\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.238046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237658 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-var-lib-kubelet\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.238046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237686 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psf7g\" (UniqueName: \"kubernetes.io/projected/52546bac-718f-4f97-8b34-9a2e8efca7e8-kube-api-access-psf7g\") pod \"node-ca-95f8z\" (UID: \"52546bac-718f-4f97-8b34-9a2e8efca7e8\") " pod="openshift-image-registry/node-ca-95f8z" Apr 24 14:23:59.238046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237713 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-var-lib-cni-bin\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.238046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237743 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-etc-openvswitch\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.238046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237767 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-sysctl-conf\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237874 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-os-release\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.237956 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-955mk\" (UniqueName: \"kubernetes.io/projected/7c03ae59-e276-4d40-960a-9f006b958f5e-kube-api-access-955mk\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx7gq\" (UniqueName: \"kubernetes.io/projected/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-kube-api-access-lx7gq\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 
14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238084 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-kubernetes\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx"
Apr 24 14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238140 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7c03ae59-e276-4d40-960a-9f006b958f5e-ovnkube-config\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238179 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-system-cni-dir\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct"
Apr 24 14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238326 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a74b1a4d-a0a7-4742-a775-7a58e287b451-cni-binary-copy\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238356 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-run-k8s-cni-cncf-io\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238409 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-systemd-units\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238439 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-run-openvswitch\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238467 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/44a82b31-abfc-4f70-a1e3-54ed41d48cf7-hosts-file\") pod \"node-resolver-p8qh5\" (UID: \"44a82b31-abfc-4f70-a1e3-54ed41d48cf7\") " pod="openshift-dns/node-resolver-p8qh5"
Apr 24 14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238495 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-sysctl-d\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx"
Apr 24 14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238553 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5f2m\" (UniqueName: \"kubernetes.io/projected/a74b1a4d-a0a7-4742-a775-7a58e287b451-kube-api-access-m5f2m\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238595 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg2jb\" (UniqueName: \"kubernetes.io/projected/a216968f-e7d3-4145-b877-dbf4cfe8277a-kube-api-access-tg2jb\") pod \"network-metrics-daemon-n65kf\" (UID: \"a216968f-e7d3-4145-b877-dbf4cfe8277a\") " pod="openshift-multus/network-metrics-daemon-n65kf"
Apr 24 14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238624 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7c03ae59-e276-4d40-960a-9f006b958f5e-ovn-node-metrics-cert\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.238848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238653 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52546bac-718f-4f97-8b34-9a2e8efca7e8-host\") pod \"node-ca-95f8z\" (UID: \"52546bac-718f-4f97-8b34-9a2e8efca7e8\") " pod="openshift-image-registry/node-ca-95f8z"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238681 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-run-multus-certs\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238771 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs\") pod \"network-metrics-daemon-n65kf\" (UID: \"a216968f-e7d3-4145-b877-dbf4cfe8277a\") " pod="openshift-multus/network-metrics-daemon-n65kf"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238834 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-run-netns\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.238932 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnxcx\" (UniqueName: \"kubernetes.io/projected/44a82b31-abfc-4f70-a1e3-54ed41d48cf7-kube-api-access-bnxcx\") pod \"node-resolver-p8qh5\" (UID: \"44a82b31-abfc-4f70-a1e3-54ed41d48cf7\") " pod="openshift-dns/node-resolver-p8qh5"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239015 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-run-netns\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239102 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-var-lib-cni-multus\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239136 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0693b486-a773-4145-885d-daf067f39c8c-konnectivity-ca\") pod \"konnectivity-agent-kdc8j\" (UID: \"0693b486-a773-4145-885d-daf067f39c8c\") " pod="kube-system/konnectivity-agent-kdc8j"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239168 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt596\" (UniqueName: \"kubernetes.io/projected/51de9bc5-cce7-429e-881f-d12cdc08346f-kube-api-access-qt596\") pod \"iptables-alerter-7x8dx\" (UID: \"51de9bc5-cce7-429e-881f-d12cdc08346f\") " pod="openshift-network-operator/iptables-alerter-7x8dx"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239209 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-slash\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239276 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7c03ae59-e276-4d40-960a-9f006b958f5e-env-overrides\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239315 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtlxj\" (UniqueName: \"kubernetes.io/projected/cd093142-e538-4326-bb16-c7b883e26fe2-kube-api-access-jtlxj\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239349 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-multus-conf-dir\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239385 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239448 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht87s\" (UniqueName: \"kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s\") pod \"network-check-target-6cqpp\" (UID: \"1de9757c-c280-4900-b19e-6918d88ee51e\") " pod="openshift-network-diagnostics/network-check-target-6cqpp"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239499 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-run-ovn-kubernetes\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.239602 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239541 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct"
Apr 24 14:23:59.240305 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239575 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-os-release\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.240305 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239607 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/51de9bc5-cce7-429e-881f-d12cdc08346f-iptables-alerter-script\") pod \"iptables-alerter-7x8dx\" (UID: \"51de9bc5-cce7-429e-881f-d12cdc08346f\") " pod="openshift-network-operator/iptables-alerter-7x8dx"
Apr 24 14:23:59.240305 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239633 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-cni-bin\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.240305 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.239681 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7c03ae59-e276-4d40-960a-9f006b958f5e-ovnkube-script-lib\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.240305 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240098 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-socket-dir\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4"
Apr 24 14:23:59.240305 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240141 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44a82b31-abfc-4f70-a1e3-54ed41d48cf7-tmp-dir\") pod \"node-resolver-p8qh5\" (UID: \"44a82b31-abfc-4f70-a1e3-54ed41d48cf7\") " pod="openshift-dns/node-resolver-p8qh5"
Apr 24 14:23:59.240305 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240166 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-system-cni-dir\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.240305 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240190 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-cnibin\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct"
Apr 24 14:23:59.240305 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240221 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-var-lib-openvswitch\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.240305 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240246 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-device-dir\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4"
Apr 24 14:23:59.240305 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-host\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx"
Apr 24 14:23:59.240305 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cd093142-e538-4326-bb16-c7b883e26fe2-etc-tuned\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx"
Apr 24 14:23:59.240880 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240312 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-hostroot\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.240880 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240384 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-node-log\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.240880 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240439 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-lib-modules\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx"
Apr 24 14:23:59.240880 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240463 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd093142-e538-4326-bb16-c7b883e26fe2-tmp\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx"
Apr 24 14:23:59.240880 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240499 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-cni-binary-copy\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct"
Apr 24 14:23:59.240880 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240557 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-multus-socket-dir-parent\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.240880 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240603 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0693b486-a773-4145-885d-daf067f39c8c-agent-certs\") pod \"konnectivity-agent-kdc8j\" (UID: \"0693b486-a773-4145-885d-daf067f39c8c\") " pod="kube-system/konnectivity-agent-kdc8j"
Apr 24 14:23:59.240880 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240628 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-kubelet\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.240880 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240664 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-run-ovn\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.240880 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240687 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-sysconfig\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx"
Apr 24 14:23:59.240880 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240709 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-etc-kubernetes\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.240880 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240732 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-run-systemd\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.240880 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.240756 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-run\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx"
Apr 24 14:23:59.263149 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.263122 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:18:58 +0000 UTC" deadline="2027-10-30 09:50:18.598793898 +0000 UTC"
Apr 24 14:23:59.263246 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.263148 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13291h26m19.335648736s"
Apr 24 14:23:59.341166 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341139 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-cni-binary-copy\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct"
Apr 24 14:23:59.341166 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341171 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-multus-socket-dir-parent\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.341380 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341187 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0693b486-a773-4145-885d-daf067f39c8c-agent-certs\") pod \"konnectivity-agent-kdc8j\" (UID: \"0693b486-a773-4145-885d-daf067f39c8c\") " pod="kube-system/konnectivity-agent-kdc8j"
Apr 24 14:23:59.341380 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341210 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-kubelet\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.341380 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341254 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-multus-socket-dir-parent\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.341380 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341254 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-kubelet\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.341380 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341312 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-run-ovn\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.341380 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341343 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-sysconfig\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx"
Apr 24 14:23:59.341380 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341369 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-etc-kubernetes\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.341718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341414 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-run-systemd\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.341718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341470 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-sysconfig\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx"
Apr 24 14:23:59.341718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341468 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-etc-kubernetes\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.341718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341504 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-run-ovn\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.341718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341518 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-run\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx"
Apr 24 14:23:59.341718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341547 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-run-systemd\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.341718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341552 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct"
Apr 24 14:23:59.341718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341583 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/52546bac-718f-4f97-8b34-9a2e8efca7e8-serviceca\") pod \"node-ca-95f8z\" (UID: \"52546bac-718f-4f97-8b34-9a2e8efca7e8\") " pod="openshift-image-registry/node-ca-95f8z"
Apr 24 14:23:59.341718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341600 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-run\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx"
Apr 24 14:23:59.341718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-multus-cni-dir\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.341718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341646 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 14:23:59.341718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341664 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-var-lib-kubelet\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.341718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341677 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-cni-binary-copy\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct"
Apr 24 14:23:59.341718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341690 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/51de9bc5-cce7-429e-881f-d12cdc08346f-host-slash\") pod \"iptables-alerter-7x8dx\" (UID: \"51de9bc5-cce7-429e-881f-d12cdc08346f\") " pod="openshift-network-operator/iptables-alerter-7x8dx"
Apr 24 14:23:59.341718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341717 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-log-socket\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341748 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341776 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a74b1a4d-a0a7-4742-a775-7a58e287b451-multus-daemon-config\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341785 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/51de9bc5-cce7-429e-881f-d12cdc08346f-host-slash\") pod \"iptables-alerter-7x8dx\" (UID: \"51de9bc5-cce7-429e-881f-d12cdc08346f\") " pod="openshift-network-operator/iptables-alerter-7x8dx"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341799 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-cni-netd\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341839 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-cni-netd\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341836 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341880 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-log-socket\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341886 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-registration-dir\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341949 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-multus-cni-dir\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341956 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-sys-fs\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.341995 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342023 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-sys-fs\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342052 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-modprobe-d\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342064 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/52546bac-718f-4f97-8b34-9a2e8efca7e8-serviceca\") pod \"node-ca-95f8z\" (UID: \"52546bac-718f-4f97-8b34-9a2e8efca7e8\") " pod="openshift-image-registry/node-ca-95f8z"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342080 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-systemd\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx"
Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342080 2574 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.342301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342079 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-var-lib-kubelet\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.342990 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342121 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-sys\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.342990 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342143 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.342990 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342149 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-registration-dir\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.342990 ip-10-0-131-216 kubenswrapper[2574]: I0424 
14:23:59.342154 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8s6z4\" (UniqueName: \"kubernetes.io/projected/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-kube-api-access-8s6z4\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.342990 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342186 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-cnibin\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.342990 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342198 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-sys\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.342990 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-etc-selinux\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.342990 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-var-lib-kubelet\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.342990 ip-10-0-131-216 
kubenswrapper[2574]: I0424 14:23:59.342270 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psf7g\" (UniqueName: \"kubernetes.io/projected/52546bac-718f-4f97-8b34-9a2e8efca7e8-kube-api-access-psf7g\") pod \"node-ca-95f8z\" (UID: \"52546bac-718f-4f97-8b34-9a2e8efca7e8\") " pod="openshift-image-registry/node-ca-95f8z" Apr 24 14:23:59.342990 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-var-lib-cni-bin\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.342990 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342296 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-etc-selinux\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.342990 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342243 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-cnibin\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.342990 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342327 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-etc-openvswitch\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.342990 ip-10-0-131-216 
kubenswrapper[2574]: I0424 14:23:59.342354 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-sysctl-conf\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.342990 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342376 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-var-lib-kubelet\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.342990 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342383 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-os-release\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.342990 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342438 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-var-lib-cni-bin\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342466 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-955mk\" (UniqueName: \"kubernetes.io/projected/7c03ae59-e276-4d40-960a-9f006b958f5e-kube-api-access-955mk\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 
14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342485 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-etc-openvswitch\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342495 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lx7gq\" (UniqueName: \"kubernetes.io/projected/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-kube-api-access-lx7gq\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342543 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-systemd\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342588 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-sysctl-conf\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342470 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-os-release\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " 
pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342633 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-modprobe-d\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342667 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-kubernetes\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342681 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a74b1a4d-a0a7-4742-a775-7a58e287b451-multus-daemon-config\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342693 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7c03ae59-e276-4d40-960a-9f006b958f5e-ovnkube-config\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342727 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-system-cni-dir\") pod \"multus-additional-cni-plugins-8fqct\" (UID: 
\"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342733 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-kubernetes\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342760 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a74b1a4d-a0a7-4742-a775-7a58e287b451-cni-binary-copy\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342779 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-system-cni-dir\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342786 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-run-k8s-cni-cncf-io\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342811 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-systemd-units\") pod 
\"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.343764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342839 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-run-openvswitch\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342866 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/44a82b31-abfc-4f70-a1e3-54ed41d48cf7-hosts-file\") pod \"node-resolver-p8qh5\" (UID: \"44a82b31-abfc-4f70-a1e3-54ed41d48cf7\") " pod="openshift-dns/node-resolver-p8qh5" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342879 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-run-k8s-cni-cncf-io\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342910 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-systemd-units\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342917 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-sysctl-d\") pod \"tuned-nqxtx\" (UID: 
\"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342943 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5f2m\" (UniqueName: \"kubernetes.io/projected/a74b1a4d-a0a7-4742-a775-7a58e287b451-kube-api-access-m5f2m\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342969 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tg2jb\" (UniqueName: \"kubernetes.io/projected/a216968f-e7d3-4145-b877-dbf4cfe8277a-kube-api-access-tg2jb\") pod \"network-metrics-daemon-n65kf\" (UID: \"a216968f-e7d3-4145-b877-dbf4cfe8277a\") " pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342972 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/44a82b31-abfc-4f70-a1e3-54ed41d48cf7-hosts-file\") pod \"node-resolver-p8qh5\" (UID: \"44a82b31-abfc-4f70-a1e3-54ed41d48cf7\") " pod="openshift-dns/node-resolver-p8qh5" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342996 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7c03ae59-e276-4d40-960a-9f006b958f5e-ovn-node-metrics-cert\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.343064 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-etc-sysctl-d\") pod 
\"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.342943 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-run-openvswitch\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.343251 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7c03ae59-e276-4d40-960a-9f006b958f5e-ovnkube-config\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.343492 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52546bac-718f-4f97-8b34-9a2e8efca7e8-host\") pod \"node-ca-95f8z\" (UID: \"52546bac-718f-4f97-8b34-9a2e8efca7e8\") " pod="openshift-image-registry/node-ca-95f8z" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.343667 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a74b1a4d-a0a7-4742-a775-7a58e287b451-cni-binary-copy\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.343697 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52546bac-718f-4f97-8b34-9a2e8efca7e8-host\") pod \"node-ca-95f8z\" (UID: \"52546bac-718f-4f97-8b34-9a2e8efca7e8\") " 
pod="openshift-image-registry/node-ca-95f8z" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.343744 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-run-multus-certs\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.343773 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs\") pod \"network-metrics-daemon-n65kf\" (UID: \"a216968f-e7d3-4145-b877-dbf4cfe8277a\") " pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.343833 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-run-multus-certs\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.344552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.343898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-run-netns\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.343927 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnxcx\" (UniqueName: \"kubernetes.io/projected/44a82b31-abfc-4f70-a1e3-54ed41d48cf7-kube-api-access-bnxcx\") pod \"node-resolver-p8qh5\" (UID: 
\"44a82b31-abfc-4f70-a1e3-54ed41d48cf7\") " pod="openshift-dns/node-resolver-p8qh5" Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.343953 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-run-netns\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.343976 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-var-lib-cni-multus\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.343999 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0693b486-a773-4145-885d-daf067f39c8c-konnectivity-ca\") pod \"konnectivity-agent-kdc8j\" (UID: \"0693b486-a773-4145-885d-daf067f39c8c\") " pod="kube-system/konnectivity-agent-kdc8j" Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:59.344016 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344023 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qt596\" (UniqueName: \"kubernetes.io/projected/51de9bc5-cce7-429e-881f-d12cdc08346f-kube-api-access-qt596\") pod \"iptables-alerter-7x8dx\" (UID: \"51de9bc5-cce7-429e-881f-d12cdc08346f\") " pod="openshift-network-operator/iptables-alerter-7x8dx" Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: I0424 
14:23:59.344046 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-slash\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344069 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7c03ae59-e276-4d40-960a-9f006b958f5e-env-overrides\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:59.344103 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs podName:a216968f-e7d3-4145-b877-dbf4cfe8277a nodeName:}" failed. No retries permitted until 2026-04-24 14:23:59.844072612 +0000 UTC m=+3.055923979 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs") pod "network-metrics-daemon-n65kf" (UID: "a216968f-e7d3-4145-b877-dbf4cfe8277a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344127 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-run-netns\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtlxj\" (UniqueName: \"kubernetes.io/projected/cd093142-e538-4326-bb16-c7b883e26fe2-kube-api-access-jtlxj\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344166 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-multus-conf-dir\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344192 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.345310 ip-10-0-131-216 
kubenswrapper[2574]: I0424 14:23:59.344220 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ht87s\" (UniqueName: \"kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s\") pod \"network-check-target-6cqpp\" (UID: \"1de9757c-c280-4900-b19e-6918d88ee51e\") " pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344245 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-run-ovn-kubernetes\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.345310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344294 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-os-release\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344318 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/51de9bc5-cce7-429e-881f-d12cdc08346f-iptables-alerter-script\") pod \"iptables-alerter-7x8dx\" (UID: \"51de9bc5-cce7-429e-881f-d12cdc08346f\") 
" pod="openshift-network-operator/iptables-alerter-7x8dx" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344341 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-cni-bin\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344368 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7c03ae59-e276-4d40-960a-9f006b958f5e-ovnkube-script-lib\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344390 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-run-ovn-kubernetes\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344411 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-socket-dir\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344444 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44a82b31-abfc-4f70-a1e3-54ed41d48cf7-tmp-dir\") pod \"node-resolver-p8qh5\" (UID: 
\"44a82b31-abfc-4f70-a1e3-54ed41d48cf7\") " pod="openshift-dns/node-resolver-p8qh5" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344461 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-system-cni-dir\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344476 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-cnibin\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344498 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-var-lib-openvswitch\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344512 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7c03ae59-e276-4d40-960a-9f006b958f5e-env-overrides\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344572 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-slash\") pod \"ovnkube-node-wbvmc\" (UID: 
\"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-device-dir\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.345763 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0693b486-a773-4145-885d-daf067f39c8c-konnectivity-ca\") pod \"konnectivity-agent-kdc8j\" (UID: \"0693b486-a773-4145-885d-daf067f39c8c\") " pod="kube-system/konnectivity-agent-kdc8j" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.345164 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-multus-conf-dir\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.345780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-host\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.345861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cd093142-e538-4326-bb16-c7b883e26fe2-etc-tuned\") pod \"tuned-nqxtx\" (UID: 
\"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.345897 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-hostroot\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.346091 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.346009 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-node-log\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.346775 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.346068 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-lib-modules\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.346775 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.346095 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd093142-e538-4326-bb16-c7b883e26fe2-tmp\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.346775 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.346138 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7c03ae59-e276-4d40-960a-9f006b958f5e-ovn-node-metrics-cert\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.346775 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.346179 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-var-lib-openvswitch\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.346775 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344524 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-socket-dir\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.346775 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.346181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44a82b31-abfc-4f70-a1e3-54ed41d48cf7-tmp-dir\") pod \"node-resolver-p8qh5\" (UID: \"44a82b31-abfc-4f70-a1e3-54ed41d48cf7\") " pod="openshift-dns/node-resolver-p8qh5" Apr 24 14:23:59.346775 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.344894 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-device-dir\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.346775 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.345674 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8fqct\" 
(UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.346775 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.346695 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0693b486-a773-4145-885d-daf067f39c8c-agent-certs\") pod \"konnectivity-agent-kdc8j\" (UID: \"0693b486-a773-4145-885d-daf067f39c8c\") " pod="kube-system/konnectivity-agent-kdc8j" Apr 24 14:23:59.348622 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.348587 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-system-cni-dir\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.348721 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.348630 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-lib-modules\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.348721 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.348645 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-os-release\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.348792 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.348707 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-cnibin\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " 
pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.349236 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.349193 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-node-log\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.349236 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.349197 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.349833 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.349237 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7c03ae59-e276-4d40-960a-9f006b958f5e-host-cni-bin\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.349833 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.349342 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-hostroot\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.349833 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.349372 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd093142-e538-4326-bb16-c7b883e26fe2-host\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 
24 14:23:59.349833 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.349429 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-run-netns\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.349833 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.349454 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a74b1a4d-a0a7-4742-a775-7a58e287b451-host-var-lib-cni-multus\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.350133 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.349988 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7c03ae59-e276-4d40-960a-9f006b958f5e-ovnkube-script-lib\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.350264 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.350248 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/51de9bc5-cce7-429e-881f-d12cdc08346f-iptables-alerter-script\") pod \"iptables-alerter-7x8dx\" (UID: \"51de9bc5-cce7-429e-881f-d12cdc08346f\") " pod="openshift-network-operator/iptables-alerter-7x8dx" Apr 24 14:23:59.351433 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.351370 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cd093142-e538-4326-bb16-c7b883e26fe2-etc-tuned\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.351737 
ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.351713 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd093142-e538-4326-bb16-c7b883e26fe2-tmp\") pod \"tuned-nqxtx\" (UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.353808 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.353785 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s6z4\" (UniqueName: \"kubernetes.io/projected/cce8ff4e-ca5b-4965-8469-359bef8e6cbe-kube-api-access-8s6z4\") pod \"multus-additional-cni-plugins-8fqct\" (UID: \"cce8ff4e-ca5b-4965-8469-359bef8e6cbe\") " pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.354202 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.354058 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg2jb\" (UniqueName: \"kubernetes.io/projected/a216968f-e7d3-4145-b877-dbf4cfe8277a-kube-api-access-tg2jb\") pod \"network-metrics-daemon-n65kf\" (UID: \"a216968f-e7d3-4145-b877-dbf4cfe8277a\") " pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:23:59.354329 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.354310 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psf7g\" (UniqueName: \"kubernetes.io/projected/52546bac-718f-4f97-8b34-9a2e8efca7e8-kube-api-access-psf7g\") pod \"node-ca-95f8z\" (UID: \"52546bac-718f-4f97-8b34-9a2e8efca7e8\") " pod="openshift-image-registry/node-ca-95f8z" Apr 24 14:23:59.354623 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.354571 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-955mk\" (UniqueName: \"kubernetes.io/projected/7c03ae59-e276-4d40-960a-9f006b958f5e-kube-api-access-955mk\") pod \"ovnkube-node-wbvmc\" (UID: \"7c03ae59-e276-4d40-960a-9f006b958f5e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.355097 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.355075 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx7gq\" (UniqueName: \"kubernetes.io/projected/73ca8cca-3e37-4e1d-a7a2-893ba4a811f4-kube-api-access-lx7gq\") pod \"aws-ebs-csi-driver-node-c8rg4\" (UID: \"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.360200 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.360182 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnxcx\" (UniqueName: \"kubernetes.io/projected/44a82b31-abfc-4f70-a1e3-54ed41d48cf7-kube-api-access-bnxcx\") pod \"node-resolver-p8qh5\" (UID: \"44a82b31-abfc-4f70-a1e3-54ed41d48cf7\") " pod="openshift-dns/node-resolver-p8qh5" Apr 24 14:23:59.360431 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.360409 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt596\" (UniqueName: \"kubernetes.io/projected/51de9bc5-cce7-429e-881f-d12cdc08346f-kube-api-access-qt596\") pod \"iptables-alerter-7x8dx\" (UID: \"51de9bc5-cce7-429e-881f-d12cdc08346f\") " pod="openshift-network-operator/iptables-alerter-7x8dx" Apr 24 14:23:59.361674 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.361656 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5f2m\" (UniqueName: \"kubernetes.io/projected/a74b1a4d-a0a7-4742-a775-7a58e287b451-kube-api-access-m5f2m\") pod \"multus-zxsvs\" (UID: \"a74b1a4d-a0a7-4742-a775-7a58e287b451\") " pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.364645 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.364628 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtlxj\" (UniqueName: \"kubernetes.io/projected/cd093142-e538-4326-bb16-c7b883e26fe2-kube-api-access-jtlxj\") pod \"tuned-nqxtx\" 
(UID: \"cd093142-e538-4326-bb16-c7b883e26fe2\") " pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.369246 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:59.369216 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:23:59.369246 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:59.369239 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:23:59.369414 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:59.369253 2574 projected.go:194] Error preparing data for projected volume kube-api-access-ht87s for pod openshift-network-diagnostics/network-check-target-6cqpp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:59.369414 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:59.369312 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s podName:1de9757c-c280-4900-b19e-6918d88ee51e nodeName:}" failed. No retries permitted until 2026-04-24 14:23:59.869295486 +0000 UTC m=+3.081146856 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ht87s" (UniqueName: "kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s") pod "network-check-target-6cqpp" (UID: "1de9757c-c280-4900-b19e-6918d88ee51e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:59.523207 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.523137 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" Apr 24 14:23:59.529582 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.529554 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p8qh5" Apr 24 14:23:59.540098 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.540076 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-95f8z" Apr 24 14:23:59.544649 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.544628 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zxsvs" Apr 24 14:23:59.551246 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.551228 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kdc8j" Apr 24 14:23:59.558762 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.558746 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7x8dx" Apr 24 14:23:59.565290 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.565263 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" Apr 24 14:23:59.570777 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.570756 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8fqct" Apr 24 14:23:59.575348 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.575326 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:23:59.816732 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:59.816528 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c03ae59_e276_4d40_960a_9f006b958f5e.slice/crio-195fea1ae85390caae952e015744ffcb09c5f898c31699ceb8ccf497f7022a93 WatchSource:0}: Error finding container 195fea1ae85390caae952e015744ffcb09c5f898c31699ceb8ccf497f7022a93: Status 404 returned error can't find the container with id 195fea1ae85390caae952e015744ffcb09c5f898c31699ceb8ccf497f7022a93 Apr 24 14:23:59.817598 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:59.817571 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51de9bc5_cce7_429e_881f_d12cdc08346f.slice/crio-5d61d926f4ceaa21a87cc721e9164ef48544bc82a2fb0f032a2e4e5e935d3fbd WatchSource:0}: Error finding container 5d61d926f4ceaa21a87cc721e9164ef48544bc82a2fb0f032a2e4e5e935d3fbd: Status 404 returned error can't find the container with id 5d61d926f4ceaa21a87cc721e9164ef48544bc82a2fb0f032a2e4e5e935d3fbd Apr 24 14:23:59.822484 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:59.822461 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd093142_e538_4326_bb16_c7b883e26fe2.slice/crio-57eb16d8c079f5263050ea0f3a8e21037bf6eb2a17a4fdb3d57804e7a690deb1 WatchSource:0}: Error finding container 57eb16d8c079f5263050ea0f3a8e21037bf6eb2a17a4fdb3d57804e7a690deb1: Status 404 returned error can't find the container with id 57eb16d8c079f5263050ea0f3a8e21037bf6eb2a17a4fdb3d57804e7a690deb1 Apr 24 14:23:59.823074 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:59.823046 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44a82b31_abfc_4f70_a1e3_54ed41d48cf7.slice/crio-6012b7bff211259b5d46e3fb109b997aea8b1e4ffd04b67a1a43511180ebae18 WatchSource:0}: Error finding container 6012b7bff211259b5d46e3fb109b997aea8b1e4ffd04b67a1a43511180ebae18: Status 404 returned error can't find the container with id 6012b7bff211259b5d46e3fb109b997aea8b1e4ffd04b67a1a43511180ebae18 Apr 24 14:23:59.824684 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:59.824461 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcce8ff4e_ca5b_4965_8469_359bef8e6cbe.slice/crio-e4121eb4c8dfaf43ff307f11fb20e3a2cf0bf0e4f8cffd692ddefbb8df7c22f2 WatchSource:0}: Error finding container e4121eb4c8dfaf43ff307f11fb20e3a2cf0bf0e4f8cffd692ddefbb8df7c22f2: Status 404 returned error can't find the container with id e4121eb4c8dfaf43ff307f11fb20e3a2cf0bf0e4f8cffd692ddefbb8df7c22f2 Apr 24 14:23:59.825237 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:59.825210 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda74b1a4d_a0a7_4742_a775_7a58e287b451.slice/crio-0323461fbad21b360a26cb45c6e1447346693fb55ab01ff85478b443402c949b WatchSource:0}: Error finding container 0323461fbad21b360a26cb45c6e1447346693fb55ab01ff85478b443402c949b: Status 404 returned error can't find the container with id 0323461fbad21b360a26cb45c6e1447346693fb55ab01ff85478b443402c949b Apr 24 14:23:59.826607 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:59.826146 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0693b486_a773_4145_885d_daf067f39c8c.slice/crio-8ab386e36c0cdcc4809417b31ab13e17a915029053051ede655d7314401de3e0 WatchSource:0}: Error finding container 8ab386e36c0cdcc4809417b31ab13e17a915029053051ede655d7314401de3e0: Status 404 returned error can't find 
the container with id 8ab386e36c0cdcc4809417b31ab13e17a915029053051ede655d7314401de3e0 Apr 24 14:23:59.827582 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:59.826808 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52546bac_718f_4f97_8b34_9a2e8efca7e8.slice/crio-fe521c4e6a4e40386d83e7977f674049b1597d4fc299dc605bb399dbd1c147c8 WatchSource:0}: Error finding container fe521c4e6a4e40386d83e7977f674049b1597d4fc299dc605bb399dbd1c147c8: Status 404 returned error can't find the container with id fe521c4e6a4e40386d83e7977f674049b1597d4fc299dc605bb399dbd1c147c8 Apr 24 14:23:59.828488 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:23:59.827816 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73ca8cca_3e37_4e1d_a7a2_893ba4a811f4.slice/crio-9b8132e18f6ec79f277197b2797cb1c752d556f5c74c085d8a671faa9eb20ad0 WatchSource:0}: Error finding container 9b8132e18f6ec79f277197b2797cb1c752d556f5c74c085d8a671faa9eb20ad0: Status 404 returned error can't find the container with id 9b8132e18f6ec79f277197b2797cb1c752d556f5c74c085d8a671faa9eb20ad0 Apr 24 14:23:59.849369 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.849349 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs\") pod \"network-metrics-daemon-n65kf\" (UID: \"a216968f-e7d3-4145-b877-dbf4cfe8277a\") " pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:23:59.849508 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:59.849492 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:59.849554 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:59.849546 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs podName:a216968f-e7d3-4145-b877-dbf4cfe8277a nodeName:}" failed. No retries permitted until 2026-04-24 14:24:00.849528984 +0000 UTC m=+4.061380345 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs") pod "network-metrics-daemon-n65kf" (UID: "a216968f-e7d3-4145-b877-dbf4cfe8277a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:59.950594 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:23:59.950569 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ht87s\" (UniqueName: \"kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s\") pod \"network-check-target-6cqpp\" (UID: \"1de9757c-c280-4900-b19e-6918d88ee51e\") " pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:23:59.950717 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:59.950675 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:23:59.950717 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:59.950687 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:23:59.950717 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:59.950696 2574 projected.go:194] Error preparing data for projected volume kube-api-access-ht87s for pod openshift-network-diagnostics/network-check-target-6cqpp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:59.950806 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:23:59.950734 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s podName:1de9757c-c280-4900-b19e-6918d88ee51e nodeName:}" failed. No retries permitted until 2026-04-24 14:24:00.950722383 +0000 UTC m=+4.162573736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ht87s" (UniqueName: "kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s") pod "network-check-target-6cqpp" (UID: "1de9757c-c280-4900-b19e-6918d88ee51e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:00.264274 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:00.264140 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:18:58 +0000 UTC" deadline="2027-10-01 19:35:49.881139924 +0000 UTC" Apr 24 14:24:00.264274 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:00.264178 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12605h11m49.616965691s" Apr 24 14:24:00.354297 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:00.354258 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:00.354480 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:00.354381 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:00.364837 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:00.364802 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal" event={"ID":"d10d73caf323405e00defaf97a76c78f","Type":"ContainerStarted","Data":"0f3823b9f0cc964306ad61da2d22d29d1449fc233f6322a6b03bbd2543d71ede"} Apr 24 14:24:00.369759 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:00.369720 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8fqct" event={"ID":"cce8ff4e-ca5b-4965-8469-359bef8e6cbe","Type":"ContainerStarted","Data":"e4121eb4c8dfaf43ff307f11fb20e3a2cf0bf0e4f8cffd692ddefbb8df7c22f2"} Apr 24 14:24:00.371938 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:00.371912 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7x8dx" event={"ID":"51de9bc5-cce7-429e-881f-d12cdc08346f","Type":"ContainerStarted","Data":"5d61d926f4ceaa21a87cc721e9164ef48544bc82a2fb0f032a2e4e5e935d3fbd"} Apr 24 14:24:00.379843 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:00.379818 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" event={"ID":"7c03ae59-e276-4d40-960a-9f006b958f5e","Type":"ContainerStarted","Data":"195fea1ae85390caae952e015744ffcb09c5f898c31699ceb8ccf497f7022a93"} Apr 24 14:24:00.392469 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:00.392444 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p8qh5" event={"ID":"44a82b31-abfc-4f70-a1e3-54ed41d48cf7","Type":"ContainerStarted","Data":"6012b7bff211259b5d46e3fb109b997aea8b1e4ffd04b67a1a43511180ebae18"} Apr 24 14:24:00.396536 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:00.396511 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" event={"ID":"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4","Type":"ContainerStarted","Data":"9b8132e18f6ec79f277197b2797cb1c752d556f5c74c085d8a671faa9eb20ad0"} Apr 24 14:24:00.399820 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:00.399760 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-95f8z" event={"ID":"52546bac-718f-4f97-8b34-9a2e8efca7e8","Type":"ContainerStarted","Data":"fe521c4e6a4e40386d83e7977f674049b1597d4fc299dc605bb399dbd1c147c8"} Apr 24 14:24:00.413331 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:00.413283 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kdc8j" event={"ID":"0693b486-a773-4145-885d-daf067f39c8c","Type":"ContainerStarted","Data":"8ab386e36c0cdcc4809417b31ab13e17a915029053051ede655d7314401de3e0"} Apr 24 14:24:00.420073 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:00.420048 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zxsvs" event={"ID":"a74b1a4d-a0a7-4742-a775-7a58e287b451","Type":"ContainerStarted","Data":"0323461fbad21b360a26cb45c6e1447346693fb55ab01ff85478b443402c949b"} Apr 24 14:24:00.422190 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:00.422164 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" event={"ID":"cd093142-e538-4326-bb16-c7b883e26fe2","Type":"ContainerStarted","Data":"57eb16d8c079f5263050ea0f3a8e21037bf6eb2a17a4fdb3d57804e7a690deb1"} Apr 24 14:24:00.858918 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:00.858847 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs\") pod \"network-metrics-daemon-n65kf\" (UID: \"a216968f-e7d3-4145-b877-dbf4cfe8277a\") " pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:00.859030 
ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:00.858962 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:00.859030 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:00.859020 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs podName:a216968f-e7d3-4145-b877-dbf4cfe8277a nodeName:}" failed. No retries permitted until 2026-04-24 14:24:02.859006041 +0000 UTC m=+6.070857398 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs") pod "network-metrics-daemon-n65kf" (UID: "a216968f-e7d3-4145-b877-dbf4cfe8277a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:00.959444 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:00.959412 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ht87s\" (UniqueName: \"kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s\") pod \"network-check-target-6cqpp\" (UID: \"1de9757c-c280-4900-b19e-6918d88ee51e\") " pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:00.959619 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:00.959519 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:00.959619 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:00.959539 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:00.959619 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:00.959553 2574 projected.go:194] Error preparing data for projected volume 
kube-api-access-ht87s for pod openshift-network-diagnostics/network-check-target-6cqpp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:00.959619 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:00.959609 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s podName:1de9757c-c280-4900-b19e-6918d88ee51e nodeName:}" failed. No retries permitted until 2026-04-24 14:24:02.959590526 +0000 UTC m=+6.171441902 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ht87s" (UniqueName: "kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s") pod "network-check-target-6cqpp" (UID: "1de9757c-c280-4900-b19e-6918d88ee51e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:01.358128 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:01.357432 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:01.358128 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:01.357569 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:01.432477 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:01.430984 2574 generic.go:358] "Generic (PLEG): container finished" podID="be5075edcaa05a69fffbb6ddcf4dd3b2" containerID="b00d4c6897c9890be446e94111b715c06e28ceee0a86216261ffabc9b01b443b" exitCode=0 Apr 24 14:24:01.432477 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:01.432380 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" event={"ID":"be5075edcaa05a69fffbb6ddcf4dd3b2","Type":"ContainerDied","Data":"b00d4c6897c9890be446e94111b715c06e28ceee0a86216261ffabc9b01b443b"} Apr 24 14:24:01.446242 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:01.445547 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-216.ec2.internal" podStartSLOduration=3.445529769 podStartE2EDuration="3.445529769s" podCreationTimestamp="2026-04-24 14:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:24:00.375249428 +0000 UTC m=+3.587100805" watchObservedRunningTime="2026-04-24 14:24:01.445529769 +0000 UTC m=+4.657381149" Apr 24 14:24:02.355154 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:02.355075 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:02.355321 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:02.355203 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:02.439126 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:02.439061 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" event={"ID":"be5075edcaa05a69fffbb6ddcf4dd3b2","Type":"ContainerStarted","Data":"67910c4e71bb173c0423b47badc15e5fd425ad4fb53eacf32a871102a113dee9"} Apr 24 14:24:02.875512 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:02.874880 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs\") pod \"network-metrics-daemon-n65kf\" (UID: \"a216968f-e7d3-4145-b877-dbf4cfe8277a\") " pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:02.875512 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:02.875060 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:02.875512 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:02.875128 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs podName:a216968f-e7d3-4145-b877-dbf4cfe8277a nodeName:}" failed. No retries permitted until 2026-04-24 14:24:06.875110077 +0000 UTC m=+10.086961433 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs") pod "network-metrics-daemon-n65kf" (UID: "a216968f-e7d3-4145-b877-dbf4cfe8277a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:02.976176 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:02.976137 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ht87s\" (UniqueName: \"kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s\") pod \"network-check-target-6cqpp\" (UID: \"1de9757c-c280-4900-b19e-6918d88ee51e\") " pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:02.976342 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:02.976303 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:02.976342 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:02.976321 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:02.976342 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:02.976334 2574 projected.go:194] Error preparing data for projected volume kube-api-access-ht87s for pod openshift-network-diagnostics/network-check-target-6cqpp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:02.976528 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:02.976390 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s podName:1de9757c-c280-4900-b19e-6918d88ee51e nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:06.976372159 +0000 UTC m=+10.188223525 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ht87s" (UniqueName: "kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s") pod "network-check-target-6cqpp" (UID: "1de9757c-c280-4900-b19e-6918d88ee51e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:03.355976 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:03.355443 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:03.355976 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:03.355594 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:04.354548 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:04.354450 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:04.355007 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:04.354592 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:05.354664 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:05.354545 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:05.355184 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:05.354707 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:06.355173 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:06.354664 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:06.355173 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:06.354793 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:06.910765 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:06.910723 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs\") pod \"network-metrics-daemon-n65kf\" (UID: \"a216968f-e7d3-4145-b877-dbf4cfe8277a\") " pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:06.910948 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:06.910928 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:06.911017 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:06.910989 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs podName:a216968f-e7d3-4145-b877-dbf4cfe8277a nodeName:}" failed. No retries permitted until 2026-04-24 14:24:14.910971401 +0000 UTC m=+18.122822771 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs") pod "network-metrics-daemon-n65kf" (UID: "a216968f-e7d3-4145-b877-dbf4cfe8277a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:07.012690 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:07.012297 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ht87s\" (UniqueName: \"kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s\") pod \"network-check-target-6cqpp\" (UID: \"1de9757c-c280-4900-b19e-6918d88ee51e\") " pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:07.012690 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:07.012483 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:07.012690 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:07.012505 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:07.012690 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:07.012517 2574 projected.go:194] Error preparing data for projected volume kube-api-access-ht87s for pod openshift-network-diagnostics/network-check-target-6cqpp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:07.012690 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:07.012570 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s podName:1de9757c-c280-4900-b19e-6918d88ee51e nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:15.012553395 +0000 UTC m=+18.224404751 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ht87s" (UniqueName: "kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s") pod "network-check-target-6cqpp" (UID: "1de9757c-c280-4900-b19e-6918d88ee51e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:07.355682 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:07.355172 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:07.355682 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:07.355292 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:08.354994 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:08.354899 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:08.355163 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:08.355037 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:09.355194 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:09.355139 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:09.355656 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:09.355301 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:10.354633 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:10.354597 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:10.354825 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:10.354717 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:11.355350 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:11.355313 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:11.355786 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:11.355472 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:12.354679 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:12.354644 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:12.354839 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:12.354759 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:13.355157 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:13.355126 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:13.355613 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:13.355234 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:14.354893 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:14.354859 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:14.355087 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:14.354968 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:14.971234 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:14.971186 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs\") pod \"network-metrics-daemon-n65kf\" (UID: \"a216968f-e7d3-4145-b877-dbf4cfe8277a\") " pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:14.971658 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:14.971353 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:14.971658 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:14.971446 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs podName:a216968f-e7d3-4145-b877-dbf4cfe8277a nodeName:}" failed. No retries permitted until 2026-04-24 14:24:30.97142431 +0000 UTC m=+34.183275681 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs") pod "network-metrics-daemon-n65kf" (UID: "a216968f-e7d3-4145-b877-dbf4cfe8277a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:15.072330 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:15.072290 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ht87s\" (UniqueName: \"kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s\") pod \"network-check-target-6cqpp\" (UID: \"1de9757c-c280-4900-b19e-6918d88ee51e\") " pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:15.072510 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:15.072453 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:15.072510 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:15.072475 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:15.072510 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:15.072487 2574 projected.go:194] Error preparing data for projected volume kube-api-access-ht87s for pod openshift-network-diagnostics/network-check-target-6cqpp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:15.072730 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:15.072545 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s podName:1de9757c-c280-4900-b19e-6918d88ee51e nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:31.072528593 +0000 UTC m=+34.284379946 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ht87s" (UniqueName: "kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s") pod "network-check-target-6cqpp" (UID: "1de9757c-c280-4900-b19e-6918d88ee51e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:15.355167 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:15.355094 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:15.355299 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:15.355238 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:16.354798 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:16.354760 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:16.355182 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:16.354884 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:17.355470 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.355177 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:17.356184 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:17.355538 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:17.464111 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.464073 2574 generic.go:358] "Generic (PLEG): container finished" podID="cce8ff4e-ca5b-4965-8469-359bef8e6cbe" containerID="8d352443008aa023da075cd9693f9a160d67480de35dc5f2bfe4427180ca201a" exitCode=0 Apr 24 14:24:17.464254 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.464128 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8fqct" event={"ID":"cce8ff4e-ca5b-4965-8469-359bef8e6cbe","Type":"ContainerDied","Data":"8d352443008aa023da075cd9693f9a160d67480de35dc5f2bfe4427180ca201a"} Apr 24 14:24:17.466617 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.466590 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" event={"ID":"7c03ae59-e276-4d40-960a-9f006b958f5e","Type":"ContainerStarted","Data":"666fea9b9930284fa6f88ed149ef5fa0459769de28b4ca3798c1489e12ed4df7"} Apr 24 14:24:17.466751 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.466621 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" 
event={"ID":"7c03ae59-e276-4d40-960a-9f006b958f5e","Type":"ContainerStarted","Data":"56ed816363b60a583c26da9711f8063f9fac6e7b72c8a4d6d971686325ad6960"} Apr 24 14:24:17.466751 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.466651 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" event={"ID":"7c03ae59-e276-4d40-960a-9f006b958f5e","Type":"ContainerStarted","Data":"238bc16687de1e6f1c9eecab901786152d12e7f95aae9d0255d904967cf06417"} Apr 24 14:24:17.466751 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.466660 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" event={"ID":"7c03ae59-e276-4d40-960a-9f006b958f5e","Type":"ContainerStarted","Data":"5dfa930f45d73fe2a2b7876f4cf289404cbf6eaa7d7102499bc56cf48f36bc18"} Apr 24 14:24:17.468064 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.467947 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p8qh5" event={"ID":"44a82b31-abfc-4f70-a1e3-54ed41d48cf7","Type":"ContainerStarted","Data":"c49de764f9795d09aa93230b14d68b4f32dd0a142c2c68c9c34d41df468d62ab"} Apr 24 14:24:17.469428 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.469373 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" event={"ID":"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4","Type":"ContainerStarted","Data":"aa81f9a563588eaa05aeff62c93cd11f2708e3b822a3f4e533aa9052f7f4ea49"} Apr 24 14:24:17.472814 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.472791 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-95f8z" event={"ID":"52546bac-718f-4f97-8b34-9a2e8efca7e8","Type":"ContainerStarted","Data":"91edbc39659d9eb55ed0bd98eb561a7740cc660ffa791cb9e87f40869fc32c5b"} Apr 24 14:24:17.474048 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.474029 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/konnectivity-agent-kdc8j" event={"ID":"0693b486-a773-4145-885d-daf067f39c8c","Type":"ContainerStarted","Data":"095b4945a806cac5ab900497f64bfc7f4c9ba005130bc32fbec90ac097f1fdf3"} Apr 24 14:24:17.475488 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.475459 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zxsvs" event={"ID":"a74b1a4d-a0a7-4742-a775-7a58e287b451","Type":"ContainerStarted","Data":"49aa62815535326f1741dac7c108440d6e34d5bbff21e0a255fa11f20a3937c7"} Apr 24 14:24:17.476669 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.476644 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" event={"ID":"cd093142-e538-4326-bb16-c7b883e26fe2","Type":"ContainerStarted","Data":"f7d26326b58439b0a3eed9175f5a477d46e7809820edf106abcda3cf657c3442"} Apr 24 14:24:17.482825 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.482790 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-216.ec2.internal" podStartSLOduration=19.482779346 podStartE2EDuration="19.482779346s" podCreationTimestamp="2026-04-24 14:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:24:02.450181798 +0000 UTC m=+5.662033175" watchObservedRunningTime="2026-04-24 14:24:17.482779346 +0000 UTC m=+20.694630721" Apr 24 14:24:17.495281 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.495237 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-kdc8j" podStartSLOduration=11.577536602 podStartE2EDuration="20.495227894s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:23:59.828694337 +0000 UTC m=+3.040545690" lastFinishedPulling="2026-04-24 14:24:08.746385616 +0000 UTC m=+11.958236982" 
observedRunningTime="2026-04-24 14:24:17.494481963 +0000 UTC m=+20.706333339" watchObservedRunningTime="2026-04-24 14:24:17.495227894 +0000 UTC m=+20.707079269" Apr 24 14:24:17.509151 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.509114 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-nqxtx" podStartSLOduration=3.651257545 podStartE2EDuration="20.509099617s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:23:59.824470032 +0000 UTC m=+3.036321384" lastFinishedPulling="2026-04-24 14:24:16.682311905 +0000 UTC m=+19.894163456" observedRunningTime="2026-04-24 14:24:17.508550147 +0000 UTC m=+20.720401523" watchObservedRunningTime="2026-04-24 14:24:17.509099617 +0000 UTC m=+20.720950993" Apr 24 14:24:17.520425 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.520376 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-p8qh5" podStartSLOduration=3.696534681 podStartE2EDuration="20.520366972s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:23:59.824807842 +0000 UTC m=+3.036659196" lastFinishedPulling="2026-04-24 14:24:16.648640124 +0000 UTC m=+19.860491487" observedRunningTime="2026-04-24 14:24:17.520181824 +0000 UTC m=+20.732033201" watchObservedRunningTime="2026-04-24 14:24:17.520366972 +0000 UTC m=+20.732218347" Apr 24 14:24:17.534868 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.534821 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-95f8z" podStartSLOduration=3.751342458 podStartE2EDuration="20.534809193s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:23:59.828869082 +0000 UTC m=+3.040720450" lastFinishedPulling="2026-04-24 14:24:16.612335826 +0000 UTC m=+19.824187185" observedRunningTime="2026-04-24 14:24:17.534308865 +0000 UTC m=+20.746160240" 
watchObservedRunningTime="2026-04-24 14:24:17.534809193 +0000 UTC m=+20.746660570" Apr 24 14:24:17.548661 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.548623 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zxsvs" podStartSLOduration=3.691274751 podStartE2EDuration="20.548615815s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:23:59.827386789 +0000 UTC m=+3.039238143" lastFinishedPulling="2026-04-24 14:24:16.684727841 +0000 UTC m=+19.896579207" observedRunningTime="2026-04-24 14:24:17.548541471 +0000 UTC m=+20.760392847" watchObservedRunningTime="2026-04-24 14:24:17.548615815 +0000 UTC m=+20.760467193" Apr 24 14:24:17.810357 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:17.810337 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 14:24:18.292795 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:18.292465 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T14:24:17.810353402Z","UUID":"565e71b2-3e6e-414b-b383-942a52c112e9","Handler":null,"Name":"","Endpoint":""} Apr 24 14:24:18.295710 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:18.295683 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 14:24:18.295710 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:18.295716 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 14:24:18.354480 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:18.354379 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:18.354613 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:18.354515 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:18.480017 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:18.479980 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" event={"ID":"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4","Type":"ContainerStarted","Data":"67ce166712355b05203af4ef3b52a382495e8c59942bd4ba5b09d5ff6509e0de"} Apr 24 14:24:18.481335 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:18.481309 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7x8dx" event={"ID":"51de9bc5-cce7-429e-881f-d12cdc08346f","Type":"ContainerStarted","Data":"f3f5dbb9c9dfe379b29b7c88bd52a7d7580da2da32e0fc98be4e9f23120395f7"} Apr 24 14:24:18.485311 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:18.485133 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" event={"ID":"7c03ae59-e276-4d40-960a-9f006b958f5e","Type":"ContainerStarted","Data":"81ee8060a93768a6259ddb8ffb08bc4817dc7c96231c66a7379ac1482a580849"} Apr 24 14:24:18.485311 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:18.485163 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" event={"ID":"7c03ae59-e276-4d40-960a-9f006b958f5e","Type":"ContainerStarted","Data":"465e2a91dc901ed1f34640a849dbec35e8a4302ee76c5b9955eb98fd0374cd34"} Apr 24 14:24:18.493767 ip-10-0-131-216 
kubenswrapper[2574]: I0424 14:24:18.493718 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7x8dx" podStartSLOduration=4.665323128 podStartE2EDuration="21.493700246s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:23:59.820264354 +0000 UTC m=+3.032115708" lastFinishedPulling="2026-04-24 14:24:16.648641459 +0000 UTC m=+19.860492826" observedRunningTime="2026-04-24 14:24:18.493624642 +0000 UTC m=+21.705476019" watchObservedRunningTime="2026-04-24 14:24:18.493700246 +0000 UTC m=+21.705551626" Apr 24 14:24:19.355238 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:19.355204 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:19.355539 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:19.355341 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:19.488257 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:19.488213 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" event={"ID":"73ca8cca-3e37-4e1d-a7a2-893ba4a811f4","Type":"ContainerStarted","Data":"0844562c91e15ed93c3212a432f4ee7a41398f157841290785ee019d85fe9bbb"} Apr 24 14:24:19.504305 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:19.504257 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c8rg4" podStartSLOduration=3.74683639 podStartE2EDuration="22.504240987s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:23:59.829673247 +0000 UTC m=+3.041524601" lastFinishedPulling="2026-04-24 14:24:18.587077844 +0000 UTC m=+21.798929198" observedRunningTime="2026-04-24 14:24:19.504059206 +0000 UTC m=+22.715910583" watchObservedRunningTime="2026-04-24 14:24:19.504240987 +0000 UTC m=+22.716092367" Apr 24 14:24:19.600318 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:19.600281 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-kdc8j" Apr 24 14:24:19.600957 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:19.600932 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-kdc8j" Apr 24 14:24:20.354382 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:20.354353 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:20.354559 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:20.354486 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:20.493600 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:20.493567 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" event={"ID":"7c03ae59-e276-4d40-960a-9f006b958f5e","Type":"ContainerStarted","Data":"32abc3de7d7a70cb475e78e94224b4c760a8ccd536c76e7d90dba6382bbed1b6"} Apr 24 14:24:20.494225 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:20.493978 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-kdc8j" Apr 24 14:24:20.494380 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:20.494361 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-kdc8j" Apr 24 14:24:21.358502 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:21.358464 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:21.358682 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:21.358606 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:22.354794 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:22.354597 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:22.355289 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:22.354814 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:22.498549 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:22.498514 2574 generic.go:358] "Generic (PLEG): container finished" podID="cce8ff4e-ca5b-4965-8469-359bef8e6cbe" containerID="3bcfd06bfa15bdf034db392971b31dcbd9ade2bf4feae760f45c1791ab3f90a1" exitCode=0 Apr 24 14:24:22.498684 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:22.498589 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8fqct" event={"ID":"cce8ff4e-ca5b-4965-8469-359bef8e6cbe","Type":"ContainerDied","Data":"3bcfd06bfa15bdf034db392971b31dcbd9ade2bf4feae760f45c1791ab3f90a1"} Apr 24 14:24:22.501966 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:22.501937 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" event={"ID":"7c03ae59-e276-4d40-960a-9f006b958f5e","Type":"ContainerStarted","Data":"e6680492132dfbd3ac85c11ca7c20a33763d8262c408348e93603cef245d3151"} Apr 24 14:24:22.505368 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:22.504502 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 
14:24:22.505368 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:22.504608 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:24:22.505368 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:22.504622 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:24:22.519919 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:22.519894 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:24:22.520095 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:22.520076 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:24:22.543239 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:22.543202 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" podStartSLOduration=8.252713712 podStartE2EDuration="25.54319218s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:23:59.819531521 +0000 UTC m=+3.031382876" lastFinishedPulling="2026-04-24 14:24:17.11000999 +0000 UTC m=+20.321861344" observedRunningTime="2026-04-24 14:24:22.542688645 +0000 UTC m=+25.754540019" watchObservedRunningTime="2026-04-24 14:24:22.54319218 +0000 UTC m=+25.755043555" Apr 24 14:24:23.355320 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:23.355298 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:23.355635 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:23.355412 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:23.505754 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:23.505679 2574 generic.go:358] "Generic (PLEG): container finished" podID="cce8ff4e-ca5b-4965-8469-359bef8e6cbe" containerID="361d395dc47d27a37bbdce5fb81a618602948ad4c7ef2eba3a3084230b8028c2" exitCode=0 Apr 24 14:24:23.505878 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:23.505769 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8fqct" event={"ID":"cce8ff4e-ca5b-4965-8469-359bef8e6cbe","Type":"ContainerDied","Data":"361d395dc47d27a37bbdce5fb81a618602948ad4c7ef2eba3a3084230b8028c2"} Apr 24 14:24:23.932289 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:23.932216 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n65kf"] Apr 24 14:24:23.932457 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:23.932346 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:23.932534 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:23.932485 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:23.934346 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:23.934316 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6cqpp"] Apr 24 14:24:23.934485 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:23.934427 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:23.934537 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:23.934514 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:24.509993 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:24.509958 2574 generic.go:358] "Generic (PLEG): container finished" podID="cce8ff4e-ca5b-4965-8469-359bef8e6cbe" containerID="3f5bbd93e8a54c1e54b07bda5e7207766d5e95ae6426c9e39fd64b67cf0d1bb6" exitCode=0 Apr 24 14:24:24.510445 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:24.510040 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8fqct" event={"ID":"cce8ff4e-ca5b-4965-8469-359bef8e6cbe","Type":"ContainerDied","Data":"3f5bbd93e8a54c1e54b07bda5e7207766d5e95ae6426c9e39fd64b67cf0d1bb6"} Apr 24 14:24:25.355204 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:25.355170 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:25.355382 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:25.355324 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:25.355477 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:25.355385 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:25.355545 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:25.355523 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:27.355781 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:27.355749 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:27.356453 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:27.355855 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:27.356453 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:27.355923 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:27.356453 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:27.356025 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:29.354834 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.354799 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:29.355385 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.354841 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:29.355385 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:29.354918 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6cqpp" podUID="1de9757c-c280-4900-b19e-6918d88ee51e" Apr 24 14:24:29.355385 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:29.355052 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n65kf" podUID="a216968f-e7d3-4145-b877-dbf4cfe8277a" Apr 24 14:24:29.628783 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.628703 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-216.ec2.internal" event="NodeReady" Apr 24 14:24:29.628932 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.628847 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 14:24:29.673840 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.673731 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2ggts"] Apr 24 14:24:29.702736 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.702711 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tvx6q"] Apr 24 14:24:29.702893 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.702867 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:29.705327 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.705304 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m8ml5\"" Apr 24 14:24:29.705822 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.705797 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 14:24:29.706108 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.706089 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 14:24:29.723971 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.723950 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2ggts"] Apr 24 14:24:29.723971 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.723975 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tvx6q"] Apr 24 14:24:29.724133 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.724070 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tvx6q" Apr 24 14:24:29.725884 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.725863 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 14:24:29.725995 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.725912 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 14:24:29.726061 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.726025 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-84zkd\"" Apr 24 14:24:29.726203 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.726185 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 14:24:29.884187 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.884107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3f0062e0-6c81-4d0d-a829-f8f572d6038e-tmp-dir\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:29.884187 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.884150 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4llk\" (UniqueName: \"kubernetes.io/projected/3f0062e0-6c81-4d0d-a829-f8f572d6038e-kube-api-access-t4llk\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:29.884187 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.884176 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert\") pod \"ingress-canary-tvx6q\" (UID: \"cf952f8e-c033-4ad1-a839-92bb755b49cc\") " pod="openshift-ingress-canary/ingress-canary-tvx6q" Apr 24 14:24:29.884510 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.884224 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9tzc\" (UniqueName: \"kubernetes.io/projected/cf952f8e-c033-4ad1-a839-92bb755b49cc-kube-api-access-j9tzc\") pod \"ingress-canary-tvx6q\" (UID: \"cf952f8e-c033-4ad1-a839-92bb755b49cc\") " pod="openshift-ingress-canary/ingress-canary-tvx6q" Apr 24 14:24:29.884510 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.884384 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:29.884510 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.884446 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f0062e0-6c81-4d0d-a829-f8f572d6038e-config-volume\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:29.985527 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.985488 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:29.985701 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.985549 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f0062e0-6c81-4d0d-a829-f8f572d6038e-config-volume\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:29.985701 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.985588 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3f0062e0-6c81-4d0d-a829-f8f572d6038e-tmp-dir\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:29.985701 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.985612 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4llk\" (UniqueName: \"kubernetes.io/projected/3f0062e0-6c81-4d0d-a829-f8f572d6038e-kube-api-access-t4llk\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:29.985701 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:29.985622 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:29.985701 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.985640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert\") pod \"ingress-canary-tvx6q\" (UID: \"cf952f8e-c033-4ad1-a839-92bb755b49cc\") " pod="openshift-ingress-canary/ingress-canary-tvx6q" Apr 24 14:24:29.985701 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.985664 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9tzc\" (UniqueName: \"kubernetes.io/projected/cf952f8e-c033-4ad1-a839-92bb755b49cc-kube-api-access-j9tzc\") pod \"ingress-canary-tvx6q\" (UID: \"cf952f8e-c033-4ad1-a839-92bb755b49cc\") " 
pod="openshift-ingress-canary/ingress-canary-tvx6q" Apr 24 14:24:29.985701 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:29.985697 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls podName:3f0062e0-6c81-4d0d-a829-f8f572d6038e nodeName:}" failed. No retries permitted until 2026-04-24 14:24:30.485677289 +0000 UTC m=+33.697528658 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls") pod "dns-default-2ggts" (UID: "3f0062e0-6c81-4d0d-a829-f8f572d6038e") : secret "dns-default-metrics-tls" not found Apr 24 14:24:29.986068 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:29.985981 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:29.986068 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:29.986029 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert podName:cf952f8e-c033-4ad1-a839-92bb755b49cc nodeName:}" failed. No retries permitted until 2026-04-24 14:24:30.486011904 +0000 UTC m=+33.697863262 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert") pod "ingress-canary-tvx6q" (UID: "cf952f8e-c033-4ad1-a839-92bb755b49cc") : secret "canary-serving-cert" not found Apr 24 14:24:29.986068 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.986048 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3f0062e0-6c81-4d0d-a829-f8f572d6038e-tmp-dir\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:29.986288 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.986249 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f0062e0-6c81-4d0d-a829-f8f572d6038e-config-volume\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:29.995511 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.995490 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4llk\" (UniqueName: \"kubernetes.io/projected/3f0062e0-6c81-4d0d-a829-f8f572d6038e-kube-api-access-t4llk\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:29.995657 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:29.995637 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9tzc\" (UniqueName: \"kubernetes.io/projected/cf952f8e-c033-4ad1-a839-92bb755b49cc-kube-api-access-j9tzc\") pod \"ingress-canary-tvx6q\" (UID: \"cf952f8e-c033-4ad1-a839-92bb755b49cc\") " pod="openshift-ingress-canary/ingress-canary-tvx6q" Apr 24 14:24:30.490512 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:30.490325 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:30.490869 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:30.490530 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert\") pod \"ingress-canary-tvx6q\" (UID: \"cf952f8e-c033-4ad1-a839-92bb755b49cc\") " pod="openshift-ingress-canary/ingress-canary-tvx6q" Apr 24 14:24:30.490869 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:30.490483 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:30.490869 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:30.490605 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:30.490869 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:30.490637 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls podName:3f0062e0-6c81-4d0d-a829-f8f572d6038e nodeName:}" failed. No retries permitted until 2026-04-24 14:24:31.490621359 +0000 UTC m=+34.702472715 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls") pod "dns-default-2ggts" (UID: "3f0062e0-6c81-4d0d-a829-f8f572d6038e") : secret "dns-default-metrics-tls" not found Apr 24 14:24:30.490869 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:30.490652 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert podName:cf952f8e-c033-4ad1-a839-92bb755b49cc nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:31.490645082 +0000 UTC m=+34.702496435 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert") pod "ingress-canary-tvx6q" (UID: "cf952f8e-c033-4ad1-a839-92bb755b49cc") : secret "canary-serving-cert" not found Apr 24 14:24:30.994351 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:30.994324 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs\") pod \"network-metrics-daemon-n65kf\" (UID: \"a216968f-e7d3-4145-b877-dbf4cfe8277a\") " pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:30.994509 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:30.994471 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:30.994553 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:30.994523 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs podName:a216968f-e7d3-4145-b877-dbf4cfe8277a nodeName:}" failed. No retries permitted until 2026-04-24 14:25:02.994509568 +0000 UTC m=+66.206360925 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs") pod "network-metrics-daemon-n65kf" (UID: "a216968f-e7d3-4145-b877-dbf4cfe8277a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:31.095641 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:31.095618 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ht87s\" (UniqueName: \"kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s\") pod \"network-check-target-6cqpp\" (UID: \"1de9757c-c280-4900-b19e-6918d88ee51e\") " pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:31.095781 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:31.095763 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:31.095819 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:31.095784 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:31.095819 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:31.095794 2574 projected.go:194] Error preparing data for projected volume kube-api-access-ht87s for pod openshift-network-diagnostics/network-check-target-6cqpp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:31.095880 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:31.095842 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s podName:1de9757c-c280-4900-b19e-6918d88ee51e nodeName:}" failed. 
No retries permitted until 2026-04-24 14:25:03.095829122 +0000 UTC m=+66.307680480 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-ht87s" (UniqueName: "kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s") pod "network-check-target-6cqpp" (UID: "1de9757c-c280-4900-b19e-6918d88ee51e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:31.357998 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:31.357931 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:24:31.358121 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:31.357931 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:24:31.359977 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:31.359954 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 14:24:31.360490 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:31.360473 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 14:24:31.360490 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:31.360486 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-68j6s\"" Apr 24 14:24:31.360617 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:31.360484 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 14:24:31.360617 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:31.360491 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-pdcbb\"" Apr 24 14:24:31.499186 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:31.499156 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert\") pod \"ingress-canary-tvx6q\" (UID: \"cf952f8e-c033-4ad1-a839-92bb755b49cc\") " pod="openshift-ingress-canary/ingress-canary-tvx6q" Apr 24 14:24:31.499562 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:31.499230 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:31.499562 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:31.499304 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:31.499562 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:31.499322 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:31.499562 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:31.499364 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert podName:cf952f8e-c033-4ad1-a839-92bb755b49cc nodeName:}" failed. No retries permitted until 2026-04-24 14:24:33.499349644 +0000 UTC m=+36.711200998 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert") pod "ingress-canary-tvx6q" (UID: "cf952f8e-c033-4ad1-a839-92bb755b49cc") : secret "canary-serving-cert" not found Apr 24 14:24:31.499562 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:31.499377 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls podName:3f0062e0-6c81-4d0d-a829-f8f572d6038e nodeName:}" failed. No retries permitted until 2026-04-24 14:24:33.499371189 +0000 UTC m=+36.711222542 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls") pod "dns-default-2ggts" (UID: "3f0062e0-6c81-4d0d-a829-f8f572d6038e") : secret "dns-default-metrics-tls" not found Apr 24 14:24:31.525828 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:31.525796 2574 generic.go:358] "Generic (PLEG): container finished" podID="cce8ff4e-ca5b-4965-8469-359bef8e6cbe" containerID="a840b5e17f76247ab43d858f31b757b2dfc13d4cc2850d7adfc06daf2db373c8" exitCode=0 Apr 24 14:24:31.525972 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:31.525832 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8fqct" event={"ID":"cce8ff4e-ca5b-4965-8469-359bef8e6cbe","Type":"ContainerDied","Data":"a840b5e17f76247ab43d858f31b757b2dfc13d4cc2850d7adfc06daf2db373c8"} Apr 24 14:24:32.529717 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:32.529688 2574 generic.go:358] "Generic (PLEG): container finished" podID="cce8ff4e-ca5b-4965-8469-359bef8e6cbe" containerID="1c48a8b85fd6a97daaf1c4c93acfd10d79906a38b423b25942f1413ee4cbd5b9" exitCode=0 Apr 24 14:24:32.530111 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:32.529749 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8fqct" 
event={"ID":"cce8ff4e-ca5b-4965-8469-359bef8e6cbe","Type":"ContainerDied","Data":"1c48a8b85fd6a97daaf1c4c93acfd10d79906a38b423b25942f1413ee4cbd5b9"} Apr 24 14:24:33.513482 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:33.513450 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert\") pod \"ingress-canary-tvx6q\" (UID: \"cf952f8e-c033-4ad1-a839-92bb755b49cc\") " pod="openshift-ingress-canary/ingress-canary-tvx6q" Apr 24 14:24:33.513593 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:33.513527 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:33.513639 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:33.513597 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:33.513639 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:33.513610 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:33.513707 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:33.513660 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls podName:3f0062e0-6c81-4d0d-a829-f8f572d6038e nodeName:}" failed. No retries permitted until 2026-04-24 14:24:37.513644845 +0000 UTC m=+40.725496197 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls") pod "dns-default-2ggts" (UID: "3f0062e0-6c81-4d0d-a829-f8f572d6038e") : secret "dns-default-metrics-tls" not found Apr 24 14:24:33.513707 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:33.513674 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert podName:cf952f8e-c033-4ad1-a839-92bb755b49cc nodeName:}" failed. No retries permitted until 2026-04-24 14:24:37.513668752 +0000 UTC m=+40.725520104 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert") pod "ingress-canary-tvx6q" (UID: "cf952f8e-c033-4ad1-a839-92bb755b49cc") : secret "canary-serving-cert" not found Apr 24 14:24:33.534031 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:33.534003 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8fqct" event={"ID":"cce8ff4e-ca5b-4965-8469-359bef8e6cbe","Type":"ContainerStarted","Data":"c1528f4c2eb95090d7bc0ced76c17dfa3ea8a09250dd2b870894d07a74cc4a8d"} Apr 24 14:24:33.554235 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:33.554191 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8fqct" podStartSLOduration=5.849230786 podStartE2EDuration="36.554178857s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:23:59.82627222 +0000 UTC m=+3.038123586" lastFinishedPulling="2026-04-24 14:24:30.53122029 +0000 UTC m=+33.743071657" observedRunningTime="2026-04-24 14:24:33.553018763 +0000 UTC m=+36.764870135" watchObservedRunningTime="2026-04-24 14:24:33.554178857 +0000 UTC m=+36.766030247" Apr 24 14:24:37.539529 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:37.539493 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert\") pod \"ingress-canary-tvx6q\" (UID: \"cf952f8e-c033-4ad1-a839-92bb755b49cc\") " pod="openshift-ingress-canary/ingress-canary-tvx6q" Apr 24 14:24:37.539963 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:37.539570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:37.539963 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:37.539638 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:37.539963 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:37.539702 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls podName:3f0062e0-6c81-4d0d-a829-f8f572d6038e nodeName:}" failed. No retries permitted until 2026-04-24 14:24:45.539686284 +0000 UTC m=+48.751537641 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls") pod "dns-default-2ggts" (UID: "3f0062e0-6c81-4d0d-a829-f8f572d6038e") : secret "dns-default-metrics-tls" not found Apr 24 14:24:37.539963 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:37.539638 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:37.539963 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:37.539763 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert podName:cf952f8e-c033-4ad1-a839-92bb755b49cc nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:45.539750989 +0000 UTC m=+48.751602357 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert") pod "ingress-canary-tvx6q" (UID: "cf952f8e-c033-4ad1-a839-92bb755b49cc") : secret "canary-serving-cert" not found Apr 24 14:24:45.591945 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:45.591907 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:24:45.591945 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:45.591951 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert\") pod \"ingress-canary-tvx6q\" (UID: \"cf952f8e-c033-4ad1-a839-92bb755b49cc\") " pod="openshift-ingress-canary/ingress-canary-tvx6q" Apr 24 14:24:45.592353 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:45.592047 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:45.592353 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:45.592049 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:45.592353 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:45.592107 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert podName:cf952f8e-c033-4ad1-a839-92bb755b49cc nodeName:}" failed. No retries permitted until 2026-04-24 14:25:01.592093245 +0000 UTC m=+64.803944599 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert") pod "ingress-canary-tvx6q" (UID: "cf952f8e-c033-4ad1-a839-92bb755b49cc") : secret "canary-serving-cert" not found Apr 24 14:24:45.592353 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:24:45.592119 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls podName:3f0062e0-6c81-4d0d-a829-f8f572d6038e nodeName:}" failed. No retries permitted until 2026-04-24 14:25:01.592113704 +0000 UTC m=+64.803965057 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls") pod "dns-default-2ggts" (UID: "3f0062e0-6c81-4d0d-a829-f8f572d6038e") : secret "dns-default-metrics-tls" not found Apr 24 14:24:54.526481 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:24:54.526449 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wbvmc" Apr 24 14:25:01.605905 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:01.605863 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert\") pod \"ingress-canary-tvx6q\" (UID: \"cf952f8e-c033-4ad1-a839-92bb755b49cc\") " pod="openshift-ingress-canary/ingress-canary-tvx6q" Apr 24 14:25:01.606352 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:01.605956 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:25:01.606352 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:01.606015 2574 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:01.606352 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:01.606032 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:01.606352 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:01.606092 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert podName:cf952f8e-c033-4ad1-a839-92bb755b49cc nodeName:}" failed. No retries permitted until 2026-04-24 14:25:33.606077178 +0000 UTC m=+96.817928534 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert") pod "ingress-canary-tvx6q" (UID: "cf952f8e-c033-4ad1-a839-92bb755b49cc") : secret "canary-serving-cert" not found Apr 24 14:25:01.606352 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:01.606109 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls podName:3f0062e0-6c81-4d0d-a829-f8f572d6038e nodeName:}" failed. No retries permitted until 2026-04-24 14:25:33.606102647 +0000 UTC m=+96.817953999 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls") pod "dns-default-2ggts" (UID: "3f0062e0-6c81-4d0d-a829-f8f572d6038e") : secret "dns-default-metrics-tls" not found Apr 24 14:25:03.016268 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:03.016221 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs\") pod \"network-metrics-daemon-n65kf\" (UID: \"a216968f-e7d3-4145-b877-dbf4cfe8277a\") " pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:25:03.018379 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:03.018358 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 14:25:03.026843 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:03.026828 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 14:25:03.026895 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:03.026888 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs podName:a216968f-e7d3-4145-b877-dbf4cfe8277a nodeName:}" failed. No retries permitted until 2026-04-24 14:26:07.026873515 +0000 UTC m=+130.238724868 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs") pod "network-metrics-daemon-n65kf" (UID: "a216968f-e7d3-4145-b877-dbf4cfe8277a") : secret "metrics-daemon-secret" not found Apr 24 14:25:03.116842 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:03.116814 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ht87s\" (UniqueName: \"kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s\") pod \"network-check-target-6cqpp\" (UID: \"1de9757c-c280-4900-b19e-6918d88ee51e\") " pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:25:03.119216 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:03.119199 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 14:25:03.129189 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:03.129167 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 14:25:03.141908 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:03.141871 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht87s\" (UniqueName: \"kubernetes.io/projected/1de9757c-c280-4900-b19e-6918d88ee51e-kube-api-access-ht87s\") pod \"network-check-target-6cqpp\" (UID: \"1de9757c-c280-4900-b19e-6918d88ee51e\") " pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:25:03.169545 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:03.169520 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-pdcbb\"" Apr 24 14:25:03.178930 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:03.178909 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:25:03.331640 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:03.331583 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6cqpp"] Apr 24 14:25:03.335514 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:25:03.335490 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1de9757c_c280_4900_b19e_6918d88ee51e.slice/crio-7bfe78f8ec51d5351fbd394ca70b26761f251725a6bc5eefdc9885193304ec79 WatchSource:0}: Error finding container 7bfe78f8ec51d5351fbd394ca70b26761f251725a6bc5eefdc9885193304ec79: Status 404 returned error can't find the container with id 7bfe78f8ec51d5351fbd394ca70b26761f251725a6bc5eefdc9885193304ec79 Apr 24 14:25:03.597887 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:03.597844 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6cqpp" event={"ID":"1de9757c-c280-4900-b19e-6918d88ee51e","Type":"ContainerStarted","Data":"7bfe78f8ec51d5351fbd394ca70b26761f251725a6bc5eefdc9885193304ec79"} Apr 24 14:25:06.604335 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:06.604306 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6cqpp" event={"ID":"1de9757c-c280-4900-b19e-6918d88ee51e","Type":"ContainerStarted","Data":"7e4785f88eb29c1c25a8d7b5ef1831954ce88180bade5b9f7ccb2a31b56d52d8"} Apr 24 14:25:06.604720 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:06.604442 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:25:33.625305 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:33.625262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts" Apr 24 14:25:33.625305 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:33.625314 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert\") pod \"ingress-canary-tvx6q\" (UID: \"cf952f8e-c033-4ad1-a839-92bb755b49cc\") " pod="openshift-ingress-canary/ingress-canary-tvx6q" Apr 24 14:25:33.625833 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:33.625455 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:33.625833 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:33.625470 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:33.625833 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:33.625538 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert podName:cf952f8e-c033-4ad1-a839-92bb755b49cc nodeName:}" failed. No retries permitted until 2026-04-24 14:26:37.62552157 +0000 UTC m=+160.837372927 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert") pod "ingress-canary-tvx6q" (UID: "cf952f8e-c033-4ad1-a839-92bb755b49cc") : secret "canary-serving-cert" not found Apr 24 14:25:33.625833 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:33.625619 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls podName:3f0062e0-6c81-4d0d-a829-f8f572d6038e nodeName:}" failed. 
No retries permitted until 2026-04-24 14:26:37.625598799 +0000 UTC m=+160.837450154 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls") pod "dns-default-2ggts" (UID: "3f0062e0-6c81-4d0d-a829-f8f572d6038e") : secret "dns-default-metrics-tls" not found Apr 24 14:25:37.608535 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:37.608504 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6cqpp" Apr 24 14:25:37.621503 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:37.621457 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6cqpp" podStartSLOduration=97.875038011 podStartE2EDuration="1m40.621444963s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:25:03.33724175 +0000 UTC m=+66.549093108" lastFinishedPulling="2026-04-24 14:25:06.083648708 +0000 UTC m=+69.295500060" observedRunningTime="2026-04-24 14:25:06.617991119 +0000 UTC m=+69.829842508" watchObservedRunningTime="2026-04-24 14:25:37.621444963 +0000 UTC m=+100.833296339" Apr 24 14:25:48.843855 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.843821 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nx84z"] Apr 24 14:25:48.846780 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.846751 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nx84z" Apr 24 14:25:48.847036 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.847013 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-577fb5f5fd-t2ghs"] Apr 24 14:25:48.848856 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.848834 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:25:48.848955 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.848912 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-j78k5\"" Apr 24 14:25:48.848955 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.848917 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 24 14:25:48.849849 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.849834 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-v7trz"] Apr 24 14:25:48.849978 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.849963 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:48.851565 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.851550 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 24 14:25:48.851661 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.851624 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 24 14:25:48.851723 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.851670 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 14:25:48.851915 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.851901 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-nmdvk\"" Apr 24 14:25:48.851967 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.851932 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 14:25:48.852005 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.851977 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 14:25:48.852460 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.852446 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 14:25:48.852877 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.852860 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:48.854338 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.854296 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nx84z"] Apr 24 14:25:48.854552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.854536 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 24 14:25:48.854678 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.854659 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 24 14:25:48.854816 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.854744 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 24 14:25:48.854816 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.854774 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:25:48.854933 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.854835 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-q7zbj\"" Apr 24 14:25:48.860747 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.860727 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 24 14:25:48.863266 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.863244 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-v7trz"] Apr 24 14:25:48.864678 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.864659 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress/router-default-577fb5f5fd-t2ghs"] Apr 24 14:25:48.922371 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.922340 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq2rn\" (UniqueName: \"kubernetes.io/projected/8a1f01af-d685-4103-bebf-0d55fcb83c35-kube-api-access-tq2rn\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:48.922568 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.922378 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swxps\" (UniqueName: \"kubernetes.io/projected/c32378d6-79f4-4462-a6bc-310eaafe2cac-kube-api-access-swxps\") pod \"volume-data-source-validator-7c6cbb6c87-nx84z\" (UID: \"c32378d6-79f4-4462-a6bc-310eaafe2cac\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nx84z" Apr 24 14:25:48.922568 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.922451 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:48.922568 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.922492 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:48.922568 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.922536 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvv6s\" (UniqueName: \"kubernetes.io/projected/ff3b99d4-3afa-4687-b6b7-7d3526edbcf4-kube-api-access-qvv6s\") pod \"console-operator-9d4b6777b-v7trz\" (UID: \"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4\") " pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:48.922784 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.922580 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff3b99d4-3afa-4687-b6b7-7d3526edbcf4-trusted-ca\") pod \"console-operator-9d4b6777b-v7trz\" (UID: \"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4\") " pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:48.922784 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.922625 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff3b99d4-3afa-4687-b6b7-7d3526edbcf4-config\") pod \"console-operator-9d4b6777b-v7trz\" (UID: \"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4\") " pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:48.922784 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.922651 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff3b99d4-3afa-4687-b6b7-7d3526edbcf4-serving-cert\") pod \"console-operator-9d4b6777b-v7trz\" (UID: \"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4\") " pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:48.922784 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.922693 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-default-certificate\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:48.922784 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.922717 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-stats-auth\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:48.947974 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.947948 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n"] Apr 24 14:25:48.950911 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.950894 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" Apr 24 14:25:48.953084 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.953067 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 14:25:48.953409 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.953378 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:25:48.953504 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.953377 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 14:25:48.953639 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.953615 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 
14:25:48.953755 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.953690 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-gnr98\"" Apr 24 14:25:48.972191 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:48.972169 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n"] Apr 24 14:25:49.023855 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.023829 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49abdf75-9c98-4426-953c-83a9aa6a3869-serving-cert\") pod \"service-ca-operator-d6fc45fc5-g5f7n\" (UID: \"49abdf75-9c98-4426-953c-83a9aa6a3869\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" Apr 24 14:25:49.023969 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.023865 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49abdf75-9c98-4426-953c-83a9aa6a3869-config\") pod \"service-ca-operator-d6fc45fc5-g5f7n\" (UID: \"49abdf75-9c98-4426-953c-83a9aa6a3869\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" Apr 24 14:25:49.023969 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.023886 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tq2rn\" (UniqueName: \"kubernetes.io/projected/8a1f01af-d685-4103-bebf-0d55fcb83c35-kube-api-access-tq2rn\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:49.023969 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.023908 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swxps\" (UniqueName: 
\"kubernetes.io/projected/c32378d6-79f4-4462-a6bc-310eaafe2cac-kube-api-access-swxps\") pod \"volume-data-source-validator-7c6cbb6c87-nx84z\" (UID: \"c32378d6-79f4-4462-a6bc-310eaafe2cac\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nx84z" Apr 24 14:25:49.023969 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.023950 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:49.024183 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.023979 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tqlm\" (UniqueName: \"kubernetes.io/projected/49abdf75-9c98-4426-953c-83a9aa6a3869-kube-api-access-6tqlm\") pod \"service-ca-operator-d6fc45fc5-g5f7n\" (UID: \"49abdf75-9c98-4426-953c-83a9aa6a3869\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" Apr 24 14:25:49.024183 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.024005 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:49.024183 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.024046 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvv6s\" (UniqueName: \"kubernetes.io/projected/ff3b99d4-3afa-4687-b6b7-7d3526edbcf4-kube-api-access-qvv6s\") pod \"console-operator-9d4b6777b-v7trz\" (UID: \"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:49.024183 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.024073 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff3b99d4-3afa-4687-b6b7-7d3526edbcf4-trusted-ca\") pod \"console-operator-9d4b6777b-v7trz\" (UID: \"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4\") " pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:49.024183 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:49.024093 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 14:25:49.024183 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:49.024105 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle podName:8a1f01af-d685-4103-bebf-0d55fcb83c35 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:49.524086631 +0000 UTC m=+112.735938014 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle") pod "router-default-577fb5f5fd-t2ghs" (UID: "8a1f01af-d685-4103-bebf-0d55fcb83c35") : configmap references non-existent config key: service-ca.crt Apr 24 14:25:49.024183 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.024151 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff3b99d4-3afa-4687-b6b7-7d3526edbcf4-config\") pod \"console-operator-9d4b6777b-v7trz\" (UID: \"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4\") " pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:49.024183 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:49.024158 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs podName:8a1f01af-d685-4103-bebf-0d55fcb83c35 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:49.524142048 +0000 UTC m=+112.735993418 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs") pod "router-default-577fb5f5fd-t2ghs" (UID: "8a1f01af-d685-4103-bebf-0d55fcb83c35") : secret "router-metrics-certs-default" not found Apr 24 14:25:49.024183 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.024185 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff3b99d4-3afa-4687-b6b7-7d3526edbcf4-serving-cert\") pod \"console-operator-9d4b6777b-v7trz\" (UID: \"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4\") " pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:49.024646 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.024228 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-default-certificate\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:49.024646 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.024250 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-stats-auth\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:49.024998 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.024941 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff3b99d4-3afa-4687-b6b7-7d3526edbcf4-trusted-ca\") pod \"console-operator-9d4b6777b-v7trz\" (UID: \"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4\") " pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:49.024998 
ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.024972 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff3b99d4-3afa-4687-b6b7-7d3526edbcf4-config\") pod \"console-operator-9d4b6777b-v7trz\" (UID: \"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4\") " pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:49.026625 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.026605 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff3b99d4-3afa-4687-b6b7-7d3526edbcf4-serving-cert\") pod \"console-operator-9d4b6777b-v7trz\" (UID: \"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4\") " pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:49.027121 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.027096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-stats-auth\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:49.027189 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.027125 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-default-certificate\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:49.032827 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.032807 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq2rn\" (UniqueName: \"kubernetes.io/projected/8a1f01af-d685-4103-bebf-0d55fcb83c35-kube-api-access-tq2rn\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " 
pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:49.033168 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.033145 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvv6s\" (UniqueName: \"kubernetes.io/projected/ff3b99d4-3afa-4687-b6b7-7d3526edbcf4-kube-api-access-qvv6s\") pod \"console-operator-9d4b6777b-v7trz\" (UID: \"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4\") " pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:49.033635 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.033615 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swxps\" (UniqueName: \"kubernetes.io/projected/c32378d6-79f4-4462-a6bc-310eaafe2cac-kube-api-access-swxps\") pod \"volume-data-source-validator-7c6cbb6c87-nx84z\" (UID: \"c32378d6-79f4-4462-a6bc-310eaafe2cac\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nx84z" Apr 24 14:25:49.124695 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.124621 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49abdf75-9c98-4426-953c-83a9aa6a3869-config\") pod \"service-ca-operator-d6fc45fc5-g5f7n\" (UID: \"49abdf75-9c98-4426-953c-83a9aa6a3869\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" Apr 24 14:25:49.124695 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.124683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tqlm\" (UniqueName: \"kubernetes.io/projected/49abdf75-9c98-4426-953c-83a9aa6a3869-kube-api-access-6tqlm\") pod \"service-ca-operator-d6fc45fc5-g5f7n\" (UID: \"49abdf75-9c98-4426-953c-83a9aa6a3869\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" Apr 24 14:25:49.124880 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.124771 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49abdf75-9c98-4426-953c-83a9aa6a3869-serving-cert\") pod \"service-ca-operator-d6fc45fc5-g5f7n\" (UID: \"49abdf75-9c98-4426-953c-83a9aa6a3869\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" Apr 24 14:25:49.125173 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.125146 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49abdf75-9c98-4426-953c-83a9aa6a3869-config\") pod \"service-ca-operator-d6fc45fc5-g5f7n\" (UID: \"49abdf75-9c98-4426-953c-83a9aa6a3869\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" Apr 24 14:25:49.126742 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.126721 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49abdf75-9c98-4426-953c-83a9aa6a3869-serving-cert\") pod \"service-ca-operator-d6fc45fc5-g5f7n\" (UID: \"49abdf75-9c98-4426-953c-83a9aa6a3869\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" Apr 24 14:25:49.133262 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.133238 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tqlm\" (UniqueName: \"kubernetes.io/projected/49abdf75-9c98-4426-953c-83a9aa6a3869-kube-api-access-6tqlm\") pod \"service-ca-operator-d6fc45fc5-g5f7n\" (UID: \"49abdf75-9c98-4426-953c-83a9aa6a3869\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" Apr 24 14:25:49.158049 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.158030 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nx84z" Apr 24 14:25:49.169804 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.169786 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:49.259827 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.259803 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" Apr 24 14:25:49.274452 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.274331 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nx84z"] Apr 24 14:25:49.278515 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:25:49.278478 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc32378d6_79f4_4462_a6bc_310eaafe2cac.slice/crio-f92456c830e76691ead169dc5253024e4f24253b6d64edccd99fb3015dd5de52 WatchSource:0}: Error finding container f92456c830e76691ead169dc5253024e4f24253b6d64edccd99fb3015dd5de52: Status 404 returned error can't find the container with id f92456c830e76691ead169dc5253024e4f24253b6d64edccd99fb3015dd5de52 Apr 24 14:25:49.290573 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.290255 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-v7trz"] Apr 24 14:25:49.294631 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:25:49.294610 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff3b99d4_3afa_4687_b6b7_7d3526edbcf4.slice/crio-4185bd7ebb37695caf723f579340025a3c1c54fde2462357eb38766ca80e36a3 WatchSource:0}: Error finding container 4185bd7ebb37695caf723f579340025a3c1c54fde2462357eb38766ca80e36a3: Status 404 returned error can't find the container with id 4185bd7ebb37695caf723f579340025a3c1c54fde2462357eb38766ca80e36a3 Apr 24 14:25:49.373784 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.373714 2574 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n"] Apr 24 14:25:49.377154 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:25:49.377129 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49abdf75_9c98_4426_953c_83a9aa6a3869.slice/crio-ff96f8f059d93c660d41ff25ab727fd6d252d30bfe05df262cca90bd99df44af WatchSource:0}: Error finding container ff96f8f059d93c660d41ff25ab727fd6d252d30bfe05df262cca90bd99df44af: Status 404 returned error can't find the container with id ff96f8f059d93c660d41ff25ab727fd6d252d30bfe05df262cca90bd99df44af Apr 24 14:25:49.528231 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.528200 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:49.528341 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.528294 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:49.528377 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:49.528336 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 14:25:49.528438 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:49.528410 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs podName:8a1f01af-d685-4103-bebf-0d55fcb83c35 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:25:50.52838079 +0000 UTC m=+113.740232142 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs") pod "router-default-577fb5f5fd-t2ghs" (UID: "8a1f01af-d685-4103-bebf-0d55fcb83c35") : secret "router-metrics-certs-default" not found Apr 24 14:25:49.528438 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:49.528425 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle podName:8a1f01af-d685-4103-bebf-0d55fcb83c35 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:50.528418295 +0000 UTC m=+113.740269647 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle") pod "router-default-577fb5f5fd-t2ghs" (UID: "8a1f01af-d685-4103-bebf-0d55fcb83c35") : configmap references non-existent config key: service-ca.crt Apr 24 14:25:49.681677 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.681596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nx84z" event={"ID":"c32378d6-79f4-4462-a6bc-310eaafe2cac","Type":"ContainerStarted","Data":"f92456c830e76691ead169dc5253024e4f24253b6d64edccd99fb3015dd5de52"} Apr 24 14:25:49.682484 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.682465 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" event={"ID":"49abdf75-9c98-4426-953c-83a9aa6a3869","Type":"ContainerStarted","Data":"ff96f8f059d93c660d41ff25ab727fd6d252d30bfe05df262cca90bd99df44af"} Apr 24 14:25:49.683208 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:49.683179 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" event={"ID":"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4","Type":"ContainerStarted","Data":"4185bd7ebb37695caf723f579340025a3c1c54fde2462357eb38766ca80e36a3"} Apr 24 14:25:50.537746 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:50.537706 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:50.538191 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:50.537767 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:50.538191 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:50.537990 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 14:25:50.538191 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:50.538042 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle podName:8a1f01af-d685-4103-bebf-0d55fcb83c35 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:52.538016665 +0000 UTC m=+115.749868023 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle") pod "router-default-577fb5f5fd-t2ghs" (UID: "8a1f01af-d685-4103-bebf-0d55fcb83c35") : configmap references non-existent config key: service-ca.crt Apr 24 14:25:50.538191 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:50.538071 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs podName:8a1f01af-d685-4103-bebf-0d55fcb83c35 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:52.538061372 +0000 UTC m=+115.749912739 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs") pod "router-default-577fb5f5fd-t2ghs" (UID: "8a1f01af-d685-4103-bebf-0d55fcb83c35") : secret "router-metrics-certs-default" not found Apr 24 14:25:52.553795 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:52.553765 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:52.553795 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:52.553801 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:52.554190 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:52.553904 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: 
secret "router-metrics-certs-default" not found Apr 24 14:25:52.554190 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:52.553927 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle podName:8a1f01af-d685-4103-bebf-0d55fcb83c35 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:56.553909076 +0000 UTC m=+119.765760428 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle") pod "router-default-577fb5f5fd-t2ghs" (UID: "8a1f01af-d685-4103-bebf-0d55fcb83c35") : configmap references non-existent config key: service-ca.crt Apr 24 14:25:52.554190 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:52.553955 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs podName:8a1f01af-d685-4103-bebf-0d55fcb83c35 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:56.553942205 +0000 UTC m=+119.765793558 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs") pod "router-default-577fb5f5fd-t2ghs" (UID: "8a1f01af-d685-4103-bebf-0d55fcb83c35") : secret "router-metrics-certs-default" not found Apr 24 14:25:52.694440 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:52.694388 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nx84z" event={"ID":"c32378d6-79f4-4462-a6bc-310eaafe2cac","Type":"ContainerStarted","Data":"0a1c75ee6ea5a1d27d1f011f59475b2a8ff345c436b00366f24ea40bd0867c73"} Apr 24 14:25:52.695759 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:52.695732 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" event={"ID":"49abdf75-9c98-4426-953c-83a9aa6a3869","Type":"ContainerStarted","Data":"e53d23e5acb2b51d6926ffe6c14702d341769c0a57c1da8b9b70b25ea6b03f72"} Apr 24 14:25:52.697155 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:52.697137 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/0.log" Apr 24 14:25:52.697278 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:52.697166 2574 generic.go:358] "Generic (PLEG): container finished" podID="ff3b99d4-3afa-4687-b6b7-7d3526edbcf4" containerID="b2c2064dae98bd071bf0c19f960a13bc11ecbfa05735a4ea9e1a657e4585913e" exitCode=255 Apr 24 14:25:52.697278 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:52.697192 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" event={"ID":"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4","Type":"ContainerDied","Data":"b2c2064dae98bd071bf0c19f960a13bc11ecbfa05735a4ea9e1a657e4585913e"} Apr 24 14:25:52.697557 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:52.697540 
2574 scope.go:117] "RemoveContainer" containerID="b2c2064dae98bd071bf0c19f960a13bc11ecbfa05735a4ea9e1a657e4585913e" Apr 24 14:25:52.709371 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:52.709331 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nx84z" podStartSLOduration=2.3350213650000002 podStartE2EDuration="4.709319346s" podCreationTimestamp="2026-04-24 14:25:48 +0000 UTC" firstStartedPulling="2026-04-24 14:25:49.280146814 +0000 UTC m=+112.491998172" lastFinishedPulling="2026-04-24 14:25:51.654444785 +0000 UTC m=+114.866296153" observedRunningTime="2026-04-24 14:25:52.708724589 +0000 UTC m=+115.920576174" watchObservedRunningTime="2026-04-24 14:25:52.709319346 +0000 UTC m=+115.921170718" Apr 24 14:25:52.721741 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:52.721707 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" podStartSLOduration=2.432712345 podStartE2EDuration="4.721696068s" podCreationTimestamp="2026-04-24 14:25:48 +0000 UTC" firstStartedPulling="2026-04-24 14:25:49.378851747 +0000 UTC m=+112.590703100" lastFinishedPulling="2026-04-24 14:25:51.667835459 +0000 UTC m=+114.879686823" observedRunningTime="2026-04-24 14:25:52.721558168 +0000 UTC m=+115.933409543" watchObservedRunningTime="2026-04-24 14:25:52.721696068 +0000 UTC m=+115.933547446" Apr 24 14:25:53.601014 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:53.600981 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-qttrj"] Apr 24 14:25:53.603703 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:53.603687 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qttrj" Apr 24 14:25:53.605592 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:53.605569 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-q5ssn\"" Apr 24 14:25:53.611177 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:53.611156 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-qttrj"] Apr 24 14:25:53.700625 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:53.700604 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/1.log" Apr 24 14:25:53.700922 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:53.700909 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/0.log" Apr 24 14:25:53.700966 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:53.700943 2574 generic.go:358] "Generic (PLEG): container finished" podID="ff3b99d4-3afa-4687-b6b7-7d3526edbcf4" containerID="b8180d52fb5fee177e7bc31c61399c72012404565bee6455f97575c124a8417c" exitCode=255 Apr 24 14:25:53.701041 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:53.701023 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" event={"ID":"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4","Type":"ContainerDied","Data":"b8180d52fb5fee177e7bc31c61399c72012404565bee6455f97575c124a8417c"} Apr 24 14:25:53.701077 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:53.701066 2574 scope.go:117] "RemoveContainer" containerID="b2c2064dae98bd071bf0c19f960a13bc11ecbfa05735a4ea9e1a657e4585913e" Apr 24 14:25:53.701232 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:53.701213 2574 
scope.go:117] "RemoveContainer" containerID="b8180d52fb5fee177e7bc31c61399c72012404565bee6455f97575c124a8417c" Apr 24 14:25:53.701455 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:53.701437 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-v7trz_openshift-console-operator(ff3b99d4-3afa-4687-b6b7-7d3526edbcf4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" podUID="ff3b99d4-3afa-4687-b6b7-7d3526edbcf4" Apr 24 14:25:53.762294 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:53.762263 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjfnj\" (UniqueName: \"kubernetes.io/projected/8c41cff4-707d-4fea-a2c7-1c8e2bc39fb3-kube-api-access-qjfnj\") pod \"network-check-source-8894fc9bd-qttrj\" (UID: \"8c41cff4-707d-4fea-a2c7-1c8e2bc39fb3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qttrj" Apr 24 14:25:53.863617 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:53.863538 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjfnj\" (UniqueName: \"kubernetes.io/projected/8c41cff4-707d-4fea-a2c7-1c8e2bc39fb3-kube-api-access-qjfnj\") pod \"network-check-source-8894fc9bd-qttrj\" (UID: \"8c41cff4-707d-4fea-a2c7-1c8e2bc39fb3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qttrj" Apr 24 14:25:53.870663 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:53.870645 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjfnj\" (UniqueName: \"kubernetes.io/projected/8c41cff4-707d-4fea-a2c7-1c8e2bc39fb3-kube-api-access-qjfnj\") pod \"network-check-source-8894fc9bd-qttrj\" (UID: \"8c41cff4-707d-4fea-a2c7-1c8e2bc39fb3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qttrj" Apr 24 
14:25:53.912671 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:53.912643 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qttrj" Apr 24 14:25:54.026627 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:54.026596 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-qttrj"] Apr 24 14:25:54.030281 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:25:54.030257 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c41cff4_707d_4fea_a2c7_1c8e2bc39fb3.slice/crio-2e22e9c84e2d7596272266b36033257468be6a878eb6342635c3d84541a6557a WatchSource:0}: Error finding container 2e22e9c84e2d7596272266b36033257468be6a878eb6342635c3d84541a6557a: Status 404 returned error can't find the container with id 2e22e9c84e2d7596272266b36033257468be6a878eb6342635c3d84541a6557a Apr 24 14:25:54.559672 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:54.559642 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p8qh5_44a82b31-abfc-4f70-a1e3-54ed41d48cf7/dns-node-resolver/0.log" Apr 24 14:25:54.704090 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:54.704064 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/1.log" Apr 24 14:25:54.704543 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:54.704389 2574 scope.go:117] "RemoveContainer" containerID="b8180d52fb5fee177e7bc31c61399c72012404565bee6455f97575c124a8417c" Apr 24 14:25:54.704620 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:54.704602 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-v7trz_openshift-console-operator(ff3b99d4-3afa-4687-b6b7-7d3526edbcf4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" podUID="ff3b99d4-3afa-4687-b6b7-7d3526edbcf4" Apr 24 14:25:54.705514 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:54.705490 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qttrj" event={"ID":"8c41cff4-707d-4fea-a2c7-1c8e2bc39fb3","Type":"ContainerStarted","Data":"e8129786d88fdb6af7008f2c6f7e46757cc9718bb0d474b3bf033875bc17e0f7"} Apr 24 14:25:54.705615 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:54.705522 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qttrj" event={"ID":"8c41cff4-707d-4fea-a2c7-1c8e2bc39fb3","Type":"ContainerStarted","Data":"2e22e9c84e2d7596272266b36033257468be6a878eb6342635c3d84541a6557a"} Apr 24 14:25:54.730713 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:54.730673 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-qttrj" podStartSLOduration=1.730660026 podStartE2EDuration="1.730660026s" podCreationTimestamp="2026-04-24 14:25:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:25:54.730519472 +0000 UTC m=+117.942370851" watchObservedRunningTime="2026-04-24 14:25:54.730660026 +0000 UTC m=+117.942511400" Apr 24 14:25:55.157854 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:55.157828 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-95f8z_52546bac-718f-4f97-8b34-9a2e8efca7e8/node-ca/0.log" Apr 24 14:25:56.580041 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:56.579987 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:56.580041 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:56.580048 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:25:56.580505 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:56.580154 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 14:25:56.580505 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:56.580169 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle podName:8a1f01af-d685-4103-bebf-0d55fcb83c35 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:04.580149161 +0000 UTC m=+127.792000520 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle") pod "router-default-577fb5f5fd-t2ghs" (UID: "8a1f01af-d685-4103-bebf-0d55fcb83c35") : configmap references non-existent config key: service-ca.crt Apr 24 14:25:56.580505 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:56.580191 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs podName:8a1f01af-d685-4103-bebf-0d55fcb83c35 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:04.580179946 +0000 UTC m=+127.792031299 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs") pod "router-default-577fb5f5fd-t2ghs" (UID: "8a1f01af-d685-4103-bebf-0d55fcb83c35") : secret "router-metrics-certs-default" not found Apr 24 14:25:59.170118 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:59.170074 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:59.170118 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:59.170119 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:25:59.170556 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:25:59.170470 2574 scope.go:117] "RemoveContainer" containerID="b8180d52fb5fee177e7bc31c61399c72012404565bee6455f97575c124a8417c" Apr 24 14:25:59.170635 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:25:59.170617 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-v7trz_openshift-console-operator(ff3b99d4-3afa-4687-b6b7-7d3526edbcf4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" podUID="ff3b99d4-3afa-4687-b6b7-7d3526edbcf4" Apr 24 14:26:04.642203 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:04.642159 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:26:04.642203 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:04.642206 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:26:04.642696 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:26:04.642331 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle podName:8a1f01af-d685-4103-bebf-0d55fcb83c35 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:20.642312662 +0000 UTC m=+143.854164014 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle") pod "router-default-577fb5f5fd-t2ghs" (UID: "8a1f01af-d685-4103-bebf-0d55fcb83c35") : configmap references non-existent config key: service-ca.crt Apr 24 14:26:04.644585 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:04.644569 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a1f01af-d685-4103-bebf-0d55fcb83c35-metrics-certs\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:26:07.059902 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:07.059869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs\") pod \"network-metrics-daemon-n65kf\" (UID: \"a216968f-e7d3-4145-b877-dbf4cfe8277a\") " pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:26:07.062195 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:07.062171 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/a216968f-e7d3-4145-b877-dbf4cfe8277a-metrics-certs\") pod \"network-metrics-daemon-n65kf\" (UID: \"a216968f-e7d3-4145-b877-dbf4cfe8277a\") " pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:26:07.074337 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:07.074312 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-68j6s\"" Apr 24 14:26:07.082867 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:07.082849 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n65kf" Apr 24 14:26:07.197015 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:07.196984 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n65kf"] Apr 24 14:26:07.200564 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:26:07.200539 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda216968f_e7d3_4145_b877_dbf4cfe8277a.slice/crio-d3eaa44213bd18a7a1049e22ee330131662d0196a0aecc28c8e4118d2a07ceb0 WatchSource:0}: Error finding container d3eaa44213bd18a7a1049e22ee330131662d0196a0aecc28c8e4118d2a07ceb0: Status 404 returned error can't find the container with id d3eaa44213bd18a7a1049e22ee330131662d0196a0aecc28c8e4118d2a07ceb0 Apr 24 14:26:07.733988 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:07.733935 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n65kf" event={"ID":"a216968f-e7d3-4145-b877-dbf4cfe8277a","Type":"ContainerStarted","Data":"d3eaa44213bd18a7a1049e22ee330131662d0196a0aecc28c8e4118d2a07ceb0"} Apr 24 14:26:08.737725 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:08.737649 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n65kf" 
event={"ID":"a216968f-e7d3-4145-b877-dbf4cfe8277a","Type":"ContainerStarted","Data":"8ea4279c6f40e0346378f4b7fb1330c911d3b8a3f72f5f12a295840628da77b0"} Apr 24 14:26:08.737725 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:08.737682 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n65kf" event={"ID":"a216968f-e7d3-4145-b877-dbf4cfe8277a","Type":"ContainerStarted","Data":"d4f6be8ae6356fc49ffb5b8786b07e1379c01c6a233c3be5484ce61221e35c6c"} Apr 24 14:26:08.751998 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:08.751943 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-n65kf" podStartSLOduration=130.535959998 podStartE2EDuration="2m11.751930457s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:26:07.202389852 +0000 UTC m=+130.414241206" lastFinishedPulling="2026-04-24 14:26:08.418360311 +0000 UTC m=+131.630211665" observedRunningTime="2026-04-24 14:26:08.751328997 +0000 UTC m=+131.963180372" watchObservedRunningTime="2026-04-24 14:26:08.751930457 +0000 UTC m=+131.963781831" Apr 24 14:26:13.354955 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:13.354925 2574 scope.go:117] "RemoveContainer" containerID="b8180d52fb5fee177e7bc31c61399c72012404565bee6455f97575c124a8417c" Apr 24 14:26:13.753988 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:13.753902 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 14:26:13.754290 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:13.754274 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/1.log" Apr 24 14:26:13.754346 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:13.754306 2574 generic.go:358] "Generic (PLEG): 
container finished" podID="ff3b99d4-3afa-4687-b6b7-7d3526edbcf4" containerID="14fb7deabd822e761fcf64962b9b639c4d3d83a4ed608e5554d91e2d65c02809" exitCode=255 Apr 24 14:26:13.754381 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:13.754358 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" event={"ID":"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4","Type":"ContainerDied","Data":"14fb7deabd822e761fcf64962b9b639c4d3d83a4ed608e5554d91e2d65c02809"} Apr 24 14:26:13.754431 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:13.754388 2574 scope.go:117] "RemoveContainer" containerID="b8180d52fb5fee177e7bc31c61399c72012404565bee6455f97575c124a8417c" Apr 24 14:26:13.754721 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:13.754703 2574 scope.go:117] "RemoveContainer" containerID="14fb7deabd822e761fcf64962b9b639c4d3d83a4ed608e5554d91e2d65c02809" Apr 24 14:26:13.754899 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:26:13.754876 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-v7trz_openshift-console-operator(ff3b99d4-3afa-4687-b6b7-7d3526edbcf4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" podUID="ff3b99d4-3afa-4687-b6b7-7d3526edbcf4" Apr 24 14:26:14.757604 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:14.757572 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 14:26:17.132592 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.132563 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-hwdq2"] Apr 24 14:26:17.135385 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.135368 2574 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-hwdq2" Apr 24 14:26:17.137705 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.137686 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 14:26:17.138269 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.138251 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 14:26:17.138347 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.138251 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-cfmj8\"" Apr 24 14:26:17.144316 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.144293 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-hwdq2"] Apr 24 14:26:17.173141 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.173117 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-q4tm5"] Apr 24 14:26:17.176134 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.176120 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.178231 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.178214 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jk68x\""
Apr 24 14:26:17.178596 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.178580 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 14:26:17.178653 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.178584 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 14:26:17.178885 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.178868 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 14:26:17.178885 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.178876 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 14:26:17.189078 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.189058 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q4tm5"]
Apr 24 14:26:17.236261 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.236225 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0ad0da81-ab22-438d-911a-36e1a74dba1f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-hwdq2\" (UID: \"0ad0da81-ab22-438d-911a-36e1a74dba1f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-hwdq2"
Apr 24 14:26:17.236261 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.236258 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0ad0da81-ab22-438d-911a-36e1a74dba1f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-hwdq2\" (UID: \"0ad0da81-ab22-438d-911a-36e1a74dba1f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-hwdq2"
Apr 24 14:26:17.242461 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.242441 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-665d6b54df-ts7mn"]
Apr 24 14:26:17.246085 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.246057 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.250353 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.250334 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 14:26:17.250485 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.250462 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xtmb2\""
Apr 24 14:26:17.250591 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.250571 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 14:26:17.250642 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.250576 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 14:26:17.258819 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.258803 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 14:26:17.269321 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.269301 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-665d6b54df-ts7mn"]
Apr 24 14:26:17.293388 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.293364 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-665d6b54df-ts7mn"]
Apr 24 14:26:17.293569 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:26:17.293550 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[bound-sa-token ca-trust-extracted image-registry-private-configuration installation-pull-secrets kube-api-access-x6bk4 registry-certificates registry-tls trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-665d6b54df-ts7mn" podUID="a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9"
Apr 24 14:26:17.336930 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.336901 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d7b5bf76-d52c-41a2-8bd9-53cbd963751d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q4tm5\" (UID: \"d7b5bf76-d52c-41a2-8bd9-53cbd963751d\") " pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.337030 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.336934 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6bk4\" (UniqueName: \"kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-kube-api-access-x6bk4\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.337030 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.336962 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0ad0da81-ab22-438d-911a-36e1a74dba1f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-hwdq2\" (UID: \"0ad0da81-ab22-438d-911a-36e1a74dba1f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-hwdq2"
Apr 24 14:26:17.337030 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.336994 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d7b5bf76-d52c-41a2-8bd9-53cbd963751d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q4tm5\" (UID: \"d7b5bf76-d52c-41a2-8bd9-53cbd963751d\") " pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.337030 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.337018 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-installation-pull-secrets\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.337169 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.337159 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-image-registry-private-configuration\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.337204 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.337187 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-ca-trust-extracted\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.337247 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.337215 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d7b5bf76-d52c-41a2-8bd9-53cbd963751d-crio-socket\") pod \"insights-runtime-extractor-q4tm5\" (UID: \"d7b5bf76-d52c-41a2-8bd9-53cbd963751d\") " pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.337247 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.337238 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-trusted-ca\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.337329 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.337255 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0ad0da81-ab22-438d-911a-36e1a74dba1f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-hwdq2\" (UID: \"0ad0da81-ab22-438d-911a-36e1a74dba1f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-hwdq2"
Apr 24 14:26:17.337329 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.337285 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-registry-certificates\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.337514 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.337353 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-registry-tls\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.337514 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.337417 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-bound-sa-token\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.337514 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.337439 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj9q4\" (UniqueName: \"kubernetes.io/projected/d7b5bf76-d52c-41a2-8bd9-53cbd963751d-kube-api-access-lj9q4\") pod \"insights-runtime-extractor-q4tm5\" (UID: \"d7b5bf76-d52c-41a2-8bd9-53cbd963751d\") " pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.337514 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.337462 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d7b5bf76-d52c-41a2-8bd9-53cbd963751d-data-volume\") pod \"insights-runtime-extractor-q4tm5\" (UID: \"d7b5bf76-d52c-41a2-8bd9-53cbd963751d\") " pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.337954 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.337935 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0ad0da81-ab22-438d-911a-36e1a74dba1f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-hwdq2\" (UID: \"0ad0da81-ab22-438d-911a-36e1a74dba1f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-hwdq2"
Apr 24 14:26:17.339348 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.339329 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0ad0da81-ab22-438d-911a-36e1a74dba1f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-hwdq2\" (UID: \"0ad0da81-ab22-438d-911a-36e1a74dba1f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-hwdq2"
Apr 24 14:26:17.438102 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-image-registry-private-configuration\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.438102 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438068 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-ca-trust-extracted\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.438102 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d7b5bf76-d52c-41a2-8bd9-53cbd963751d-crio-socket\") pod \"insights-runtime-extractor-q4tm5\" (UID: \"d7b5bf76-d52c-41a2-8bd9-53cbd963751d\") " pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.438336 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-trusted-ca\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.438336 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438130 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-registry-certificates\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.438336 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438147 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-registry-tls\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.438336 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438170 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-bound-sa-token\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.438336 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438189 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lj9q4\" (UniqueName: \"kubernetes.io/projected/d7b5bf76-d52c-41a2-8bd9-53cbd963751d-kube-api-access-lj9q4\") pod \"insights-runtime-extractor-q4tm5\" (UID: \"d7b5bf76-d52c-41a2-8bd9-53cbd963751d\") " pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.438336 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438208 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d7b5bf76-d52c-41a2-8bd9-53cbd963751d-data-volume\") pod \"insights-runtime-extractor-q4tm5\" (UID: \"d7b5bf76-d52c-41a2-8bd9-53cbd963751d\") " pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.438336 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438168 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d7b5bf76-d52c-41a2-8bd9-53cbd963751d-crio-socket\") pod \"insights-runtime-extractor-q4tm5\" (UID: \"d7b5bf76-d52c-41a2-8bd9-53cbd963751d\") " pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.438336 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438232 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d7b5bf76-d52c-41a2-8bd9-53cbd963751d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q4tm5\" (UID: \"d7b5bf76-d52c-41a2-8bd9-53cbd963751d\") " pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.438739 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438515 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6bk4\" (UniqueName: \"kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-kube-api-access-x6bk4\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.438739 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438537 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-ca-trust-extracted\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.438739 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438556 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d7b5bf76-d52c-41a2-8bd9-53cbd963751d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q4tm5\" (UID: \"d7b5bf76-d52c-41a2-8bd9-53cbd963751d\") " pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.438739 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438605 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-installation-pull-secrets\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.438739 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.438606 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d7b5bf76-d52c-41a2-8bd9-53cbd963751d-data-volume\") pod \"insights-runtime-extractor-q4tm5\" (UID: \"d7b5bf76-d52c-41a2-8bd9-53cbd963751d\") " pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.439172 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.439148 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d7b5bf76-d52c-41a2-8bd9-53cbd963751d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q4tm5\" (UID: \"d7b5bf76-d52c-41a2-8bd9-53cbd963751d\") " pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.439561 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.439538 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-registry-certificates\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.439711 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.439693 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-trusted-ca\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.440552 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.440531 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-image-registry-private-configuration\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.441272 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.441244 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-installation-pull-secrets\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.441417 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.441244 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d7b5bf76-d52c-41a2-8bd9-53cbd963751d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q4tm5\" (UID: \"d7b5bf76-d52c-41a2-8bd9-53cbd963751d\") " pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.441610 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.441594 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-registry-tls\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.443426 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.443409 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-hwdq2"
Apr 24 14:26:17.445926 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.445907 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6bk4\" (UniqueName: \"kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-kube-api-access-x6bk4\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.446272 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.446249 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj9q4\" (UniqueName: \"kubernetes.io/projected/d7b5bf76-d52c-41a2-8bd9-53cbd963751d-kube-api-access-lj9q4\") pod \"insights-runtime-extractor-q4tm5\" (UID: \"d7b5bf76-d52c-41a2-8bd9-53cbd963751d\") " pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.446370 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.446315 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-bound-sa-token\") pod \"image-registry-665d6b54df-ts7mn\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") " pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.484293 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.484263 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q4tm5"
Apr 24 14:26:17.572614 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.572563 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-hwdq2"]
Apr 24 14:26:17.577894 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:26:17.577869 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ad0da81_ab22_438d_911a_36e1a74dba1f.slice/crio-c0e97c5344968ec7544641ed31db08a7c985686c7d850c0adec7020fb17e974b WatchSource:0}: Error finding container c0e97c5344968ec7544641ed31db08a7c985686c7d850c0adec7020fb17e974b: Status 404 returned error can't find the container with id c0e97c5344968ec7544641ed31db08a7c985686c7d850c0adec7020fb17e974b
Apr 24 14:26:17.620568 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.620541 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q4tm5"]
Apr 24 14:26:17.623028 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:26:17.622992 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7b5bf76_d52c_41a2_8bd9_53cbd963751d.slice/crio-a89bef546e583c85ca8d4ae2977d3ee032df3cc20bf0304f3f329b9285ba61f0 WatchSource:0}: Error finding container a89bef546e583c85ca8d4ae2977d3ee032df3cc20bf0304f3f329b9285ba61f0: Status 404 returned error can't find the container with id a89bef546e583c85ca8d4ae2977d3ee032df3cc20bf0304f3f329b9285ba61f0
Apr 24 14:26:17.766592 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.766499 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q4tm5" event={"ID":"d7b5bf76-d52c-41a2-8bd9-53cbd963751d","Type":"ContainerStarted","Data":"0e39a0f3440f210c3230b39f7069f25579d500fbf95085f7d15e9a209a92e332"}
Apr 24 14:26:17.766592 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.766541 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q4tm5" event={"ID":"d7b5bf76-d52c-41a2-8bd9-53cbd963751d","Type":"ContainerStarted","Data":"a89bef546e583c85ca8d4ae2977d3ee032df3cc20bf0304f3f329b9285ba61f0"}
Apr 24 14:26:17.767455 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.767430 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-hwdq2" event={"ID":"0ad0da81-ab22-438d-911a-36e1a74dba1f","Type":"ContainerStarted","Data":"c0e97c5344968ec7544641ed31db08a7c985686c7d850c0adec7020fb17e974b"}
Apr 24 14:26:17.767455 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.767444 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.771532 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.771514 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:17.941358 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.941328 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-bound-sa-token\") pod \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") "
Apr 24 14:26:17.941528 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.941385 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-installation-pull-secrets\") pod \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") "
Apr 24 14:26:17.941528 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.941430 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6bk4\" (UniqueName: \"kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-kube-api-access-x6bk4\") pod \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") "
Apr 24 14:26:17.941528 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.941447 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-trusted-ca\") pod \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") "
Apr 24 14:26:17.941528 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.941462 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-registry-certificates\") pod \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") "
Apr 24 14:26:17.941528 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.941478 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-registry-tls\") pod \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") "
Apr 24 14:26:17.941528 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.941508 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-image-registry-private-configuration\") pod \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") "
Apr 24 14:26:17.941729 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.941536 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-ca-trust-extracted\") pod \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\" (UID: \"a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9\") "
Apr 24 14:26:17.943405 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.943374 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9" (UID: "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:26:17.944888 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.944849 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9" (UID: "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:26:17.944888 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.944870 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9" (UID: "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:26:17.944888 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.944877 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9" (UID: "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:26:17.946258 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.946238 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-kube-api-access-x6bk4" (OuterVolumeSpecName: "kube-api-access-x6bk4") pod "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9" (UID: "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9"). InnerVolumeSpecName "kube-api-access-x6bk4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:26:17.946387 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.946361 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9" (UID: "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:26:17.946771 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.946754 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9" (UID: "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:26:17.946834 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:17.946770 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9" (UID: "a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:26:18.042001 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:18.041925 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x6bk4\" (UniqueName: \"kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-kube-api-access-x6bk4\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:26:18.042001 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:18.041950 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-trusted-ca\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:26:18.042001 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:18.041970 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-registry-certificates\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:26:18.042001 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:18.041991 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-registry-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:26:18.042202 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:18.042006 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-image-registry-private-configuration\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:26:18.042202 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:18.042020 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-ca-trust-extracted\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:26:18.042202 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:18.042032 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-bound-sa-token\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:26:18.042202 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:18.042045 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9-installation-pull-secrets\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:26:18.771515 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:18.771479 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-hwdq2" event={"ID":"0ad0da81-ab22-438d-911a-36e1a74dba1f","Type":"ContainerStarted","Data":"caf89ab8f72c0961dae0f0f9d73ce196ed6524a081312d2858f83d8622cc7ec7"}
Apr 24 14:26:18.772911 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:18.772889 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-665d6b54df-ts7mn"
Apr 24 14:26:18.772911 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:18.772897 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q4tm5" event={"ID":"d7b5bf76-d52c-41a2-8bd9-53cbd963751d","Type":"ContainerStarted","Data":"909c0d81461a4d26309828d1df66ba8f7638d22de6a776b1b3890d7be45b7a42"}
Apr 24 14:26:18.786001 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:18.785955 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-hwdq2" podStartSLOduration=0.69911402 podStartE2EDuration="1.785944142s" podCreationTimestamp="2026-04-24 14:26:17 +0000 UTC" firstStartedPulling="2026-04-24 14:26:17.579839585 +0000 UTC m=+140.791690941" lastFinishedPulling="2026-04-24 14:26:18.666669711 +0000 UTC m=+141.878521063" observedRunningTime="2026-04-24 14:26:18.784697352 +0000 UTC m=+141.996548727" watchObservedRunningTime="2026-04-24 14:26:18.785944142 +0000 UTC m=+141.997795517"
Apr 24 14:26:18.808491 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:18.808459 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-665d6b54df-ts7mn"]
Apr 24 14:26:18.814239 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:18.814220 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-665d6b54df-ts7mn"]
Apr 24 14:26:19.170281 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:19.170215 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz"
Apr 24 14:26:19.170281 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:19.170263 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz"
Apr 24 14:26:19.170726 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:19.170706 2574 scope.go:117] "RemoveContainer" containerID="14fb7deabd822e761fcf64962b9b639c4d3d83a4ed608e5554d91e2d65c02809"
Apr 24 14:26:19.170928 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:26:19.170907 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-v7trz_openshift-console-operator(ff3b99d4-3afa-4687-b6b7-7d3526edbcf4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" podUID="ff3b99d4-3afa-4687-b6b7-7d3526edbcf4"
Apr 24 14:26:19.358899 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:19.358864 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9" path="/var/lib/kubelet/pods/a5303b1e-36e0-4fdc-99f2-0ef6f5db92d9/volumes"
Apr 24 14:26:20.661676 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:20.661625 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs"
Apr 24 14:26:20.665018 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:20.662681 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1f01af-d685-4103-bebf-0d55fcb83c35-service-ca-bundle\") pod \"router-default-577fb5f5fd-t2ghs\" (UID: \"8a1f01af-d685-4103-bebf-0d55fcb83c35\") " pod="openshift-ingress/router-default-577fb5f5fd-t2ghs"
Apr 24 14:26:20.667325 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:20.667305 2574
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-nmdvk\"" Apr 24 14:26:20.675966 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:20.675949 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:26:20.780296 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:20.780266 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q4tm5" event={"ID":"d7b5bf76-d52c-41a2-8bd9-53cbd963751d","Type":"ContainerStarted","Data":"4cd6fb6f4eed8edb443039e15b41820ca3363aff02079a1ba1e9c959af70ea81"} Apr 24 14:26:20.795703 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:20.795649 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-q4tm5" podStartSLOduration=1.632464817 podStartE2EDuration="3.795636663s" podCreationTimestamp="2026-04-24 14:26:17 +0000 UTC" firstStartedPulling="2026-04-24 14:26:17.678738276 +0000 UTC m=+140.890589628" lastFinishedPulling="2026-04-24 14:26:19.841910119 +0000 UTC m=+143.053761474" observedRunningTime="2026-04-24 14:26:20.795357667 +0000 UTC m=+144.007209042" watchObservedRunningTime="2026-04-24 14:26:20.795636663 +0000 UTC m=+144.007488038" Apr 24 14:26:20.812092 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:20.811664 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-577fb5f5fd-t2ghs"] Apr 24 14:26:20.815671 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:26:20.815649 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a1f01af_d685_4103_bebf_0d55fcb83c35.slice/crio-371209acfd2086e3d4f8f7c522c7407d22b6aac3ba56338a498faadad75f9aa8 WatchSource:0}: Error finding container 371209acfd2086e3d4f8f7c522c7407d22b6aac3ba56338a498faadad75f9aa8: Status 404 returned error can't 
find the container with id 371209acfd2086e3d4f8f7c522c7407d22b6aac3ba56338a498faadad75f9aa8 Apr 24 14:26:21.784298 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:21.784264 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" event={"ID":"8a1f01af-d685-4103-bebf-0d55fcb83c35","Type":"ContainerStarted","Data":"bb6e8f65fb0b7fce1b54007d7710c7ba8b1dcd7083067dcf3f1623f7b7b2c40d"} Apr 24 14:26:21.784742 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:21.784306 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" event={"ID":"8a1f01af-d685-4103-bebf-0d55fcb83c35","Type":"ContainerStarted","Data":"371209acfd2086e3d4f8f7c522c7407d22b6aac3ba56338a498faadad75f9aa8"} Apr 24 14:26:21.800758 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:21.800710 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" podStartSLOduration=33.800694175 podStartE2EDuration="33.800694175s" podCreationTimestamp="2026-04-24 14:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:26:21.799600186 +0000 UTC m=+145.011451560" watchObservedRunningTime="2026-04-24 14:26:21.800694175 +0000 UTC m=+145.012545551" Apr 24 14:26:22.677060 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:22.677023 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:26:22.679613 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:22.679592 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:26:22.787065 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:22.787035 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:26:22.788137 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:22.788115 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-577fb5f5fd-t2ghs" Apr 24 14:26:23.105984 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:23.105951 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z2grl"] Apr 24 14:26:23.109988 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:23.109972 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z2grl" Apr 24 14:26:23.112546 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:23.112527 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 14:26:23.112695 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:23.112679 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-l5n8f\"" Apr 24 14:26:23.118448 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:23.118428 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z2grl"] Apr 24 14:26:23.280958 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:23.280921 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/2a817927-9d20-4e56-a0bf-0223603b5b85-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-z2grl\" (UID: \"2a817927-9d20-4e56-a0bf-0223603b5b85\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z2grl" Apr 24 14:26:23.381384 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:23.381312 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/2a817927-9d20-4e56-a0bf-0223603b5b85-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-z2grl\" (UID: \"2a817927-9d20-4e56-a0bf-0223603b5b85\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z2grl" Apr 24 14:26:23.381535 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:26:23.381432 2574 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 24 14:26:23.381535 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:26:23.381481 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a817927-9d20-4e56-a0bf-0223603b5b85-tls-certificates podName:2a817927-9d20-4e56-a0bf-0223603b5b85 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:23.881467582 +0000 UTC m=+147.093318935 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/2a817927-9d20-4e56-a0bf-0223603b5b85-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-z2grl" (UID: "2a817927-9d20-4e56-a0bf-0223603b5b85") : secret "prometheus-operator-admission-webhook-tls" not found Apr 24 14:26:23.885015 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:23.884981 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/2a817927-9d20-4e56-a0bf-0223603b5b85-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-z2grl\" (UID: \"2a817927-9d20-4e56-a0bf-0223603b5b85\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z2grl" Apr 24 14:26:23.887406 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:23.887377 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/2a817927-9d20-4e56-a0bf-0223603b5b85-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-z2grl\" (UID: \"2a817927-9d20-4e56-a0bf-0223603b5b85\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z2grl" Apr 24 14:26:24.018095 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:24.018052 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z2grl" Apr 24 14:26:24.130637 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:24.130604 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z2grl"] Apr 24 14:26:24.133829 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:26:24.133799 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a817927_9d20_4e56_a0bf_0223603b5b85.slice/crio-67e362e2e78cd45ccf70ab09710d5d3f912dd25d592511acd1025a9c85bada81 WatchSource:0}: Error finding container 67e362e2e78cd45ccf70ab09710d5d3f912dd25d592511acd1025a9c85bada81: Status 404 returned error can't find the container with id 67e362e2e78cd45ccf70ab09710d5d3f912dd25d592511acd1025a9c85bada81 Apr 24 14:26:24.796889 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:24.796852 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z2grl" event={"ID":"2a817927-9d20-4e56-a0bf-0223603b5b85","Type":"ContainerStarted","Data":"67e362e2e78cd45ccf70ab09710d5d3f912dd25d592511acd1025a9c85bada81"} Apr 24 14:26:25.801001 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:25.800964 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z2grl" event={"ID":"2a817927-9d20-4e56-a0bf-0223603b5b85","Type":"ContainerStarted","Data":"b5f65a39d7de0243654f41a88eb551aeaa36009c388e7c27bb77b636b7c67286"} Apr 24 14:26:25.801437 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:25.801153 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z2grl" Apr 24 14:26:25.806069 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:25.806044 2574 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z2grl" Apr 24 14:26:25.815121 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:25.815084 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z2grl" podStartSLOduration=1.67937051 podStartE2EDuration="2.815069069s" podCreationTimestamp="2026-04-24 14:26:23 +0000 UTC" firstStartedPulling="2026-04-24 14:26:24.135740578 +0000 UTC m=+147.347591932" lastFinishedPulling="2026-04-24 14:26:25.271439138 +0000 UTC m=+148.483290491" observedRunningTime="2026-04-24 14:26:25.814590401 +0000 UTC m=+149.026441786" watchObservedRunningTime="2026-04-24 14:26:25.815069069 +0000 UTC m=+149.026920445" Apr 24 14:26:26.174022 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.173981 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ls5jw"] Apr 24 14:26:26.205238 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.205212 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ls5jw"] Apr 24 14:26:26.205366 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.205270 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" Apr 24 14:26:26.207339 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.207318 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 14:26:26.207474 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.207439 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 14:26:26.207553 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.207538 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-wpqsg\"" Apr 24 14:26:26.207991 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.207965 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 14:26:26.208096 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.207996 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 14:26:26.208096 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.208011 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 14:26:26.301561 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.301521 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9374d6dc-31b7-464b-a614-4cd5ce83fdbb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ls5jw\" (UID: \"9374d6dc-31b7-464b-a614-4cd5ce83fdbb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" Apr 24 14:26:26.301693 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.301579 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9374d6dc-31b7-464b-a614-4cd5ce83fdbb-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ls5jw\" (UID: \"9374d6dc-31b7-464b-a614-4cd5ce83fdbb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" Apr 24 14:26:26.301693 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.301609 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9374d6dc-31b7-464b-a614-4cd5ce83fdbb-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ls5jw\" (UID: \"9374d6dc-31b7-464b-a614-4cd5ce83fdbb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" Apr 24 14:26:26.301693 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.301681 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4dl6\" (UniqueName: \"kubernetes.io/projected/9374d6dc-31b7-464b-a614-4cd5ce83fdbb-kube-api-access-h4dl6\") pod \"prometheus-operator-5676c8c784-ls5jw\" (UID: \"9374d6dc-31b7-464b-a614-4cd5ce83fdbb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" Apr 24 14:26:26.401981 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.401944 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9374d6dc-31b7-464b-a614-4cd5ce83fdbb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ls5jw\" (UID: \"9374d6dc-31b7-464b-a614-4cd5ce83fdbb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" Apr 24 14:26:26.401981 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.401991 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/9374d6dc-31b7-464b-a614-4cd5ce83fdbb-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ls5jw\" (UID: \"9374d6dc-31b7-464b-a614-4cd5ce83fdbb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" Apr 24 14:26:26.402156 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.402023 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9374d6dc-31b7-464b-a614-4cd5ce83fdbb-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ls5jw\" (UID: \"9374d6dc-31b7-464b-a614-4cd5ce83fdbb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" Apr 24 14:26:26.402156 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.402085 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4dl6\" (UniqueName: \"kubernetes.io/projected/9374d6dc-31b7-464b-a614-4cd5ce83fdbb-kube-api-access-h4dl6\") pod \"prometheus-operator-5676c8c784-ls5jw\" (UID: \"9374d6dc-31b7-464b-a614-4cd5ce83fdbb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" Apr 24 14:26:26.402765 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.402743 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9374d6dc-31b7-464b-a614-4cd5ce83fdbb-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ls5jw\" (UID: \"9374d6dc-31b7-464b-a614-4cd5ce83fdbb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" Apr 24 14:26:26.404509 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.404483 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9374d6dc-31b7-464b-a614-4cd5ce83fdbb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ls5jw\" (UID: \"9374d6dc-31b7-464b-a614-4cd5ce83fdbb\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" Apr 24 14:26:26.404636 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.404617 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9374d6dc-31b7-464b-a614-4cd5ce83fdbb-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ls5jw\" (UID: \"9374d6dc-31b7-464b-a614-4cd5ce83fdbb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" Apr 24 14:26:26.409899 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.409878 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4dl6\" (UniqueName: \"kubernetes.io/projected/9374d6dc-31b7-464b-a614-4cd5ce83fdbb-kube-api-access-h4dl6\") pod \"prometheus-operator-5676c8c784-ls5jw\" (UID: \"9374d6dc-31b7-464b-a614-4cd5ce83fdbb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" Apr 24 14:26:26.514549 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.514485 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" Apr 24 14:26:26.626139 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.626104 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ls5jw"] Apr 24 14:26:26.630424 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:26:26.630368 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9374d6dc_31b7_464b_a614_4cd5ce83fdbb.slice/crio-17a7da11cbcceecd0c93ab6a071c93ca4d8d32255fb96f86c6d1788ca2496a71 WatchSource:0}: Error finding container 17a7da11cbcceecd0c93ab6a071c93ca4d8d32255fb96f86c6d1788ca2496a71: Status 404 returned error can't find the container with id 17a7da11cbcceecd0c93ab6a071c93ca4d8d32255fb96f86c6d1788ca2496a71 Apr 24 14:26:26.804379 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:26.804349 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" event={"ID":"9374d6dc-31b7-464b-a614-4cd5ce83fdbb","Type":"ContainerStarted","Data":"17a7da11cbcceecd0c93ab6a071c93ca4d8d32255fb96f86c6d1788ca2496a71"} Apr 24 14:26:28.810583 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:28.810551 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" event={"ID":"9374d6dc-31b7-464b-a614-4cd5ce83fdbb","Type":"ContainerStarted","Data":"ce6321e3c835ced293eb810fc97bcc8d42f1dd8de15c7c4d2b791439d1262435"} Apr 24 14:26:28.810583 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:28.810586 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" event={"ID":"9374d6dc-31b7-464b-a614-4cd5ce83fdbb","Type":"ContainerStarted","Data":"c1ab9342906296731b4d5ec991dd88c1f279cd33e85d74715c81a4da9f719444"} Apr 24 14:26:28.829007 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:28.828958 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-ls5jw" podStartSLOduration=1.6296356950000002 podStartE2EDuration="2.828943967s" podCreationTimestamp="2026-04-24 14:26:26 +0000 UTC" firstStartedPulling="2026-04-24 14:26:26.632253544 +0000 UTC m=+149.844104898" lastFinishedPulling="2026-04-24 14:26:27.831561815 +0000 UTC m=+151.043413170" observedRunningTime="2026-04-24 14:26:28.8275243 +0000 UTC m=+152.039375679" watchObservedRunningTime="2026-04-24 14:26:28.828943967 +0000 UTC m=+152.040795342" Apr 24 14:26:30.355142 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.355114 2574 scope.go:117] "RemoveContainer" containerID="14fb7deabd822e761fcf64962b9b639c4d3d83a4ed608e5554d91e2d65c02809" Apr 24 14:26:30.355613 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:26:30.355286 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-v7trz_openshift-console-operator(ff3b99d4-3afa-4687-b6b7-7d3526edbcf4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" podUID="ff3b99d4-3afa-4687-b6b7-7d3526edbcf4" Apr 24 14:26:30.505344 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.505307 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs"] Apr 24 14:26:30.508609 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.508593 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" Apr 24 14:26:30.510631 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.510610 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 14:26:30.510743 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.510651 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-dct9g\"" Apr 24 14:26:30.510743 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.510655 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 14:26:30.521157 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.521135 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs"] Apr 24 14:26:30.524612 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.524594 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-drv2m"] Apr 24 14:26:30.526615 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.526588 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ca3432a7-7fcd-4793-933f-b84d886dc761-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-t8fcs\" (UID: \"ca3432a7-7fcd-4793-933f-b84d886dc761\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" Apr 24 14:26:30.526722 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.526703 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca3432a7-7fcd-4793-933f-b84d886dc761-metrics-client-ca\") pod 
\"openshift-state-metrics-9d44df66c-t8fcs\" (UID: \"ca3432a7-7fcd-4793-933f-b84d886dc761\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" Apr 24 14:26:30.526791 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.526735 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ca3432a7-7fcd-4793-933f-b84d886dc761-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-t8fcs\" (UID: \"ca3432a7-7fcd-4793-933f-b84d886dc761\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" Apr 24 14:26:30.526791 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.526756 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cm4m\" (UniqueName: \"kubernetes.io/projected/ca3432a7-7fcd-4793-933f-b84d886dc761-kube-api-access-4cm4m\") pod \"openshift-state-metrics-9d44df66c-t8fcs\" (UID: \"ca3432a7-7fcd-4793-933f-b84d886dc761\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" Apr 24 14:26:30.527513 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.527498 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.534264 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.534244 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 14:26:30.535433 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.535417 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 14:26:30.535516 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.535445 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 14:26:30.535721 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.535708 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-xs84f\"" Apr 24 14:26:30.542838 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.542820 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-xrfbk"] Apr 24 14:26:30.546663 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.546644 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-drv2m"] Apr 24 14:26:30.546838 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.546813 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.550295 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.550274 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8xq7d\"" Apr 24 14:26:30.550512 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.550495 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 14:26:30.550598 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.550544 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 14:26:30.557778 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.557762 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 14:26:30.627568 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.627486 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-root\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.627568 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.627520 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-textfile\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.627568 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.627542 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-accelerators-collector-config\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.627768 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.627597 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d61f999-ef5b-4a64-b56f-54f94755779c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.627768 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.627627 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-wtmp\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.627768 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.627659 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7d61f999-ef5b-4a64-b56f-54f94755779c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.627768 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.627696 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/7d61f999-ef5b-4a64-b56f-54f94755779c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.627768 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.627719 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-metrics-client-ca\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.627768 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.627744 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca3432a7-7fcd-4793-933f-b84d886dc761-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-t8fcs\" (UID: \"ca3432a7-7fcd-4793-933f-b84d886dc761\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" Apr 24 14:26:30.627768 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.627765 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ca3432a7-7fcd-4793-933f-b84d886dc761-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-t8fcs\" (UID: \"ca3432a7-7fcd-4793-933f-b84d886dc761\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" Apr 24 14:26:30.628074 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.627884 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cm4m\" (UniqueName: \"kubernetes.io/projected/ca3432a7-7fcd-4793-933f-b84d886dc761-kube-api-access-4cm4m\") pod \"openshift-state-metrics-9d44df66c-t8fcs\" (UID: 
\"ca3432a7-7fcd-4793-933f-b84d886dc761\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" Apr 24 14:26:30.628074 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.627917 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k8hw\" (UniqueName: \"kubernetes.io/projected/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-kube-api-access-7k8hw\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.628074 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.627945 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.628074 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.628015 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tpc2\" (UniqueName: \"kubernetes.io/projected/7d61f999-ef5b-4a64-b56f-54f94755779c-kube-api-access-7tpc2\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.628074 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.628049 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-sys\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.628265 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.628090 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ca3432a7-7fcd-4793-933f-b84d886dc761-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-t8fcs\" (UID: \"ca3432a7-7fcd-4793-933f-b84d886dc761\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" Apr 24 14:26:30.628265 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.628114 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7d61f999-ef5b-4a64-b56f-54f94755779c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.628265 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.628170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d61f999-ef5b-4a64-b56f-54f94755779c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.628265 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.628194 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-tls\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.628561 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.628538 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ca3432a7-7fcd-4793-933f-b84d886dc761-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-t8fcs\" (UID: \"ca3432a7-7fcd-4793-933f-b84d886dc761\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" Apr 24 14:26:30.630139 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.630108 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ca3432a7-7fcd-4793-933f-b84d886dc761-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-t8fcs\" (UID: \"ca3432a7-7fcd-4793-933f-b84d886dc761\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" Apr 24 14:26:30.630286 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.630267 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ca3432a7-7fcd-4793-933f-b84d886dc761-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-t8fcs\" (UID: \"ca3432a7-7fcd-4793-933f-b84d886dc761\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" Apr 24 14:26:30.644646 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.644624 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cm4m\" (UniqueName: \"kubernetes.io/projected/ca3432a7-7fcd-4793-933f-b84d886dc761-kube-api-access-4cm4m\") pod \"openshift-state-metrics-9d44df66c-t8fcs\" (UID: \"ca3432a7-7fcd-4793-933f-b84d886dc761\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" Apr 24 14:26:30.728569 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728541 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7d61f999-ef5b-4a64-b56f-54f94755779c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: 
\"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.728569 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728571 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d61f999-ef5b-4a64-b56f-54f94755779c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.728777 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728592 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-metrics-client-ca\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.728777 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728629 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8hw\" (UniqueName: \"kubernetes.io/projected/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-kube-api-access-7k8hw\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.728777 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.728777 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728698 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7tpc2\" (UniqueName: \"kubernetes.io/projected/7d61f999-ef5b-4a64-b56f-54f94755779c-kube-api-access-7tpc2\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.728777 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-sys\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.728777 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728768 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7d61f999-ef5b-4a64-b56f-54f94755779c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.729074 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728814 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d61f999-ef5b-4a64-b56f-54f94755779c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.729074 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728840 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-tls\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " 
pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.729074 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728878 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-root\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.729074 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728903 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-textfile\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.729074 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728932 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-accelerators-collector-config\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.729074 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728934 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-root\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.729074 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728946 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7d61f999-ef5b-4a64-b56f-54f94755779c-volume-directive-shadow\") pod 
\"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.729074 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728991 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d61f999-ef5b-4a64-b56f-54f94755779c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.729074 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:26:30.729006 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 14:26:30.729074 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.729021 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-wtmp\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.729074 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:26:30.729083 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-tls podName:990c1e6d-4603-492a-b0d1-b0d498ef3c6e nodeName:}" failed. No retries permitted until 2026-04-24 14:26:31.229066169 +0000 UTC m=+154.440917522 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-tls") pod "node-exporter-xrfbk" (UID: "990c1e6d-4603-492a-b0d1-b0d498ef3c6e") : secret "node-exporter-tls" not found Apr 24 14:26:30.729704 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.729170 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-wtmp\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.729704 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.728875 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-sys\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.729704 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.729566 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-textfile\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.729704 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.729568 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d61f999-ef5b-4a64-b56f-54f94755779c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.729904 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.729830 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7d61f999-ef5b-4a64-b56f-54f94755779c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.729904 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.729858 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-metrics-client-ca\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.730069 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.730049 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-accelerators-collector-config\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.731325 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.731299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.731594 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.731574 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d61f999-ef5b-4a64-b56f-54f94755779c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: 
\"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.731635 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.731578 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d61f999-ef5b-4a64-b56f-54f94755779c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.738621 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.738603 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8hw\" (UniqueName: \"kubernetes.io/projected/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-kube-api-access-7k8hw\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:30.739280 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.739257 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tpc2\" (UniqueName: \"kubernetes.io/projected/7d61f999-ef5b-4a64-b56f-54f94755779c-kube-api-access-7tpc2\") pod \"kube-state-metrics-69db897b98-drv2m\" (UID: \"7d61f999-ef5b-4a64-b56f-54f94755779c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.817041 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.817016 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" Apr 24 14:26:30.835889 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.835867 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" Apr 24 14:26:30.941696 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.941669 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs"] Apr 24 14:26:30.944541 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:26:30.944500 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca3432a7_7fcd_4793_933f_b84d886dc761.slice/crio-3066b440a05e00fc698f55d0f3fdfa53136f665768fea8d84816fe0dcdd42a23 WatchSource:0}: Error finding container 3066b440a05e00fc698f55d0f3fdfa53136f665768fea8d84816fe0dcdd42a23: Status 404 returned error can't find the container with id 3066b440a05e00fc698f55d0f3fdfa53136f665768fea8d84816fe0dcdd42a23 Apr 24 14:26:30.958356 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:30.958330 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-drv2m"] Apr 24 14:26:30.960880 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:26:30.960858 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d61f999_ef5b_4a64_b56f_54f94755779c.slice/crio-547ff8c9657c3cb113153cfb1816ed9bfe840f41bf4859c481fcf8cfd08582a1 WatchSource:0}: Error finding container 547ff8c9657c3cb113153cfb1816ed9bfe840f41bf4859c481fcf8cfd08582a1: Status 404 returned error can't find the container with id 547ff8c9657c3cb113153cfb1816ed9bfe840f41bf4859c481fcf8cfd08582a1 Apr 24 14:26:31.233541 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.233454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-tls\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " 
pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:31.235692 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.235673 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/990c1e6d-4603-492a-b0d1-b0d498ef3c6e-node-exporter-tls\") pod \"node-exporter-xrfbk\" (UID: \"990c1e6d-4603-492a-b0d1-b0d498ef3c6e\") " pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:31.455244 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.455214 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xrfbk" Apr 24 14:26:31.463013 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:26:31.462980 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod990c1e6d_4603_492a_b0d1_b0d498ef3c6e.slice/crio-495f0ad4e1ecc1a2c502a013915cbd9519c18b0a7b0840a9e3ef332ecf3a1fd7 WatchSource:0}: Error finding container 495f0ad4e1ecc1a2c502a013915cbd9519c18b0a7b0840a9e3ef332ecf3a1fd7: Status 404 returned error can't find the container with id 495f0ad4e1ecc1a2c502a013915cbd9519c18b0a7b0840a9e3ef332ecf3a1fd7 Apr 24 14:26:31.607242 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.607106 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 14:26:31.611407 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.611372 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:26:31.613462 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.613432 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 14:26:31.613571 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.613475 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 14:26:31.613571 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.613486 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-876pr\"" Apr 24 14:26:31.613571 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.613543 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 14:26:31.613571 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.613560 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 14:26:31.613770 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.613643 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 14:26:31.613770 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.613652 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 14:26:31.613770 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.613434 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 14:26:31.613770 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.613706 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 24 14:26:31.613986 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.613970 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 24 14:26:31.624773 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.624749 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 14:26:31.636640 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.636611 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.636768 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.636654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.636768 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.636687 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-web-config\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.636768 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.636753 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-config-volume\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.636947 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.636798 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.636947 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.636828 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.636947 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.636857 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.636947 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.636872 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.636947 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.636909 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-config-out\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.637186 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.636983 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.637186 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.637036 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.637186 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.637070 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfpw7\" (UniqueName: \"kubernetes.io/projected/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-kube-api-access-dfpw7\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.637186 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.637139 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.737627 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.737591 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.737840 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.737646 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.737840 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.737684 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-web-config\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.737840 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.737724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-config-volume\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.737840 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.737749 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.737840 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.737776 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.737840 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.737809 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.738581 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.737845 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.738581 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.737883 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-config-out\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.738581 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.737916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.738581 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.737975 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.738581 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.738010 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfpw7\" (UniqueName: \"kubernetes.io/projected/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-kube-api-access-dfpw7\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.738581 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.738046 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.738955 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:26:31.738617 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-alertmanager-trusted-ca-bundle podName:7503d8e2-7236-48d2-b4d5-1cd9cdc2da28 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:32.238595004 +0000 UTC m=+155.450446361 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28") : configmap references non-existent config key: ca-bundle.crt
Apr 24 14:26:31.739750 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.739723 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.740268 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.739909 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.742875 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.742828 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.743857 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.743389 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-web-config\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.743857 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.743491 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.743857 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.743820 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.746295 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.746261 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.746886 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.746729 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-config-volume\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.746886 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.746814 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.747518 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.747269 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.748010 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.747951 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfpw7\" (UniqueName: \"kubernetes.io/projected/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-kube-api-access-dfpw7\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.748335 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.748297 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-config-out\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:31.820799 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.820738 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xrfbk" event={"ID":"990c1e6d-4603-492a-b0d1-b0d498ef3c6e","Type":"ContainerStarted","Data":"495f0ad4e1ecc1a2c502a013915cbd9519c18b0a7b0840a9e3ef332ecf3a1fd7"}
Apr 24 14:26:31.822370 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.822338 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" event={"ID":"7d61f999-ef5b-4a64-b56f-54f94755779c","Type":"ContainerStarted","Data":"547ff8c9657c3cb113153cfb1816ed9bfe840f41bf4859c481fcf8cfd08582a1"}
Apr 24 14:26:31.824960 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.824873 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" event={"ID":"ca3432a7-7fcd-4793-933f-b84d886dc761","Type":"ContainerStarted","Data":"18ae05a1770a6af6439057d52cbe3ba860528d8eb43db8435e30a94fc1b7f54e"}
Apr 24 14:26:31.824960 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.824913 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" event={"ID":"ca3432a7-7fcd-4793-933f-b84d886dc761","Type":"ContainerStarted","Data":"d3a2a1af64a225ffd8e53ba902f25165bd154a58d54de0672f89b40efdff19ac"}
Apr 24 14:26:31.824960 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:31.824938 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" event={"ID":"ca3432a7-7fcd-4793-933f-b84d886dc761","Type":"ContainerStarted","Data":"3066b440a05e00fc698f55d0f3fdfa53136f665768fea8d84816fe0dcdd42a23"}
Apr 24 14:26:32.242128 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:32.242079 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:32.242949 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:32.242927 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:32.523784 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:32.523701 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:26:32.713621 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:26:32.713591 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-2ggts" podUID="3f0062e0-6c81-4d0d-a829-f8f572d6038e"
Apr 24 14:26:32.734038 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:26:32.734012 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-tvx6q" podUID="cf952f8e-c033-4ad1-a839-92bb755b49cc"
Apr 24 14:26:32.758954 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:32.758910 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 14:26:32.770788 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:26:32.770755 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7503d8e2_7236_48d2_b4d5_1cd9cdc2da28.slice/crio-71eed2f49f57fbb95598569e1f3c526339b175ef7d377814a9bc307affb02b6a WatchSource:0}: Error finding container 71eed2f49f57fbb95598569e1f3c526339b175ef7d377814a9bc307affb02b6a: Status 404 returned error can't find the container with id 71eed2f49f57fbb95598569e1f3c526339b175ef7d377814a9bc307affb02b6a
Apr 24 14:26:32.832146 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:32.831079 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xrfbk" event={"ID":"990c1e6d-4603-492a-b0d1-b0d498ef3c6e","Type":"ContainerStarted","Data":"0268d5c1f65c97f19f46301a37498234b17feb1a903e8d5a0909130ea1a26397"}
Apr 24 14:26:32.836365 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:32.835591 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" event={"ID":"7d61f999-ef5b-4a64-b56f-54f94755779c","Type":"ContainerStarted","Data":"e3801f081380a72d13ee33d313090292b6a4ff8706a36ce37efc9495560643ea"}
Apr 24 14:26:32.836365 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:32.835628 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" event={"ID":"7d61f999-ef5b-4a64-b56f-54f94755779c","Type":"ContainerStarted","Data":"3843a54cf0fdd0f6da315d6b085de495eda451e4d1934f78398cacb7864f734c"}
Apr 24 14:26:32.839163 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:32.839108 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" event={"ID":"ca3432a7-7fcd-4793-933f-b84d886dc761","Type":"ContainerStarted","Data":"454aaf34b18357ced04426936671987a105341bcb6fbdef8c6c5ba9bfceb661f"}
Apr 24 14:26:32.841233 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:32.840987 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2ggts"
Apr 24 14:26:32.841233 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:32.841040 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28","Type":"ContainerStarted","Data":"71eed2f49f57fbb95598569e1f3c526339b175ef7d377814a9bc307affb02b6a"}
Apr 24 14:26:32.866770 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:32.866702 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t8fcs" podStartSLOduration=1.306861628 podStartE2EDuration="2.866683835s" podCreationTimestamp="2026-04-24 14:26:30 +0000 UTC" firstStartedPulling="2026-04-24 14:26:31.066145887 +0000 UTC m=+154.277997240" lastFinishedPulling="2026-04-24 14:26:32.625968075 +0000 UTC m=+155.837819447" observedRunningTime="2026-04-24 14:26:32.866296984 +0000 UTC m=+156.078148360" watchObservedRunningTime="2026-04-24 14:26:32.866683835 +0000 UTC m=+156.078535211"
Apr 24 14:26:33.848233 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:33.848196 2574 generic.go:358] "Generic (PLEG): container finished" podID="990c1e6d-4603-492a-b0d1-b0d498ef3c6e" containerID="0268d5c1f65c97f19f46301a37498234b17feb1a903e8d5a0909130ea1a26397" exitCode=0
Apr 24 14:26:33.848686 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:33.848298 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xrfbk" event={"ID":"990c1e6d-4603-492a-b0d1-b0d498ef3c6e","Type":"ContainerDied","Data":"0268d5c1f65c97f19f46301a37498234b17feb1a903e8d5a0909130ea1a26397"}
Apr 24 14:26:33.850458 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:33.850432 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" event={"ID":"7d61f999-ef5b-4a64-b56f-54f94755779c","Type":"ContainerStarted","Data":"4deeb48f35c50d485905572332a4a05fbfae36d0cb74181fc9a7aee06dacc306"}
Apr 24 14:26:33.879846 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:33.879775 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-drv2m" podStartSLOduration=2.2178292 podStartE2EDuration="3.879760919s" podCreationTimestamp="2026-04-24 14:26:30 +0000 UTC" firstStartedPulling="2026-04-24 14:26:30.96417772 +0000 UTC m=+154.176029086" lastFinishedPulling="2026-04-24 14:26:32.626109436 +0000 UTC m=+155.837960805" observedRunningTime="2026-04-24 14:26:33.878433999 +0000 UTC m=+157.090285376" watchObservedRunningTime="2026-04-24 14:26:33.879760919 +0000 UTC m=+157.091612296"
Apr 24 14:26:34.856544 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:34.856507 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xrfbk" event={"ID":"990c1e6d-4603-492a-b0d1-b0d498ef3c6e","Type":"ContainerStarted","Data":"e4acd31c7d7ac05928867384e176c60f84547e6ec368ed9ce3e2b30bb9d73ea9"}
Apr 24 14:26:34.856951 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:34.856552 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xrfbk" event={"ID":"990c1e6d-4603-492a-b0d1-b0d498ef3c6e","Type":"ContainerStarted","Data":"c57a5520a5e02a9f0e6024831f9d972d53e1e043ea1eccb67299700a5b29585d"}
Apr 24 14:26:34.857895 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:34.857869 2574 generic.go:358] "Generic (PLEG): container finished" podID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerID="d452626b8ba1262a2ab644345db1a7fb747c82fa608361b38fe8e9ba8b9e1aca" exitCode=0
Apr 24 14:26:34.858027 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:34.857951 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28","Type":"ContainerDied","Data":"d452626b8ba1262a2ab644345db1a7fb747c82fa608361b38fe8e9ba8b9e1aca"}
Apr 24 14:26:34.882603 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:34.882548 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-xrfbk" podStartSLOduration=3.718065718 podStartE2EDuration="4.882533689s" podCreationTimestamp="2026-04-24 14:26:30 +0000 UTC" firstStartedPulling="2026-04-24 14:26:31.464646366 +0000 UTC m=+154.676497733" lastFinishedPulling="2026-04-24 14:26:32.629114336 +0000 UTC m=+155.840965704" observedRunningTime="2026-04-24 14:26:34.880860414 +0000 UTC m=+158.092711789" watchObservedRunningTime="2026-04-24 14:26:34.882533689 +0000 UTC m=+158.094385064"
Apr 24 14:26:35.293206 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:35.293156 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-cltlb"]
Apr 24 14:26:35.296615 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:35.296595 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cltlb"
Apr 24 14:26:35.298577 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:35.298556 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 24 14:26:35.298577 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:35.298573 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-c9p4l\""
Apr 24 14:26:35.304736 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:35.304708 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-cltlb"]
Apr 24 14:26:35.369894 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:35.369862 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2539fa5c-3160-43bd-a351-0184602b72e3-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cltlb\" (UID: \"2539fa5c-3160-43bd-a351-0184602b72e3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cltlb"
Apr 24 14:26:35.471297 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:35.471255 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2539fa5c-3160-43bd-a351-0184602b72e3-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cltlb\" (UID: \"2539fa5c-3160-43bd-a351-0184602b72e3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cltlb"
Apr 24 14:26:35.471474 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:26:35.471458 2574 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 24 14:26:35.471554 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:26:35.471541 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2539fa5c-3160-43bd-a351-0184602b72e3-monitoring-plugin-cert podName:2539fa5c-3160-43bd-a351-0184602b72e3 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:35.97151786 +0000 UTC m=+159.183369213 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/2539fa5c-3160-43bd-a351-0184602b72e3-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-cltlb" (UID: "2539fa5c-3160-43bd-a351-0184602b72e3") : secret "monitoring-plugin-cert" not found
Apr 24 14:26:35.976189 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:35.976145 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2539fa5c-3160-43bd-a351-0184602b72e3-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cltlb\" (UID: \"2539fa5c-3160-43bd-a351-0184602b72e3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cltlb"
Apr 24 14:26:35.979379 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:35.979352 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2539fa5c-3160-43bd-a351-0184602b72e3-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cltlb\" (UID: \"2539fa5c-3160-43bd-a351-0184602b72e3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cltlb"
Apr 24 14:26:36.205982 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:36.205936 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cltlb"
Apr 24 14:26:36.327556 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:36.327486 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-cltlb"]
Apr 24 14:26:36.330285 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:26:36.330258 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2539fa5c_3160_43bd_a351_0184602b72e3.slice/crio-f7ece8fb611c64cae37d94f4c3319597ce2d8e6eeede4c9013969596255f70fd WatchSource:0}: Error finding container f7ece8fb611c64cae37d94f4c3319597ce2d8e6eeede4c9013969596255f70fd: Status 404 returned error can't find the container with id f7ece8fb611c64cae37d94f4c3319597ce2d8e6eeede4c9013969596255f70fd
Apr 24 14:26:36.869093 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:36.869050 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28","Type":"ContainerStarted","Data":"d0305280267eb7812a50d3b2a8e41784ec1b489929a0afd4a134052cecac297f"}
Apr 24 14:26:36.869093 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:36.869094 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28","Type":"ContainerStarted","Data":"d1c04aca3766264d186a42c4d54c4a5be19d686b82f4af776c65555dba3c0c3a"}
Apr 24 14:26:36.869338 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:36.869109 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28","Type":"ContainerStarted","Data":"aed7b910ed3dc245bdcb1be26b1d76371199864d1a707b1fee85cb054c77a62f"}
Apr 24 14:26:36.869338 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:36.869121 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28","Type":"ContainerStarted","Data":"37fd1406b91ceb8ed6656b082f7461b22d39d6100b4655a25bcca1cdb6c63cb9"}
Apr 24 14:26:36.869338 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:36.869133 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28","Type":"ContainerStarted","Data":"d840ca29c0bfe3c714623f081a777ab5d01d18d8409ceb4a3312d2ec53564e1e"}
Apr 24 14:26:36.870252 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:36.870220 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cltlb" event={"ID":"2539fa5c-3160-43bd-a351-0184602b72e3","Type":"ContainerStarted","Data":"f7ece8fb611c64cae37d94f4c3319597ce2d8e6eeede4c9013969596255f70fd"}
Apr 24 14:26:37.692204 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:37.692174 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert\") pod \"ingress-canary-tvx6q\" (UID: \"cf952f8e-c033-4ad1-a839-92bb755b49cc\") " pod="openshift-ingress-canary/ingress-canary-tvx6q"
Apr 24 14:26:37.692474 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:37.692276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts"
Apr 24 14:26:37.694430 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:37.694412 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f0062e0-6c81-4d0d-a829-f8f572d6038e-metrics-tls\") pod \"dns-default-2ggts\" (UID: \"3f0062e0-6c81-4d0d-a829-f8f572d6038e\") " pod="openshift-dns/dns-default-2ggts"
Apr 24 14:26:37.694567 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:37.694551 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf952f8e-c033-4ad1-a839-92bb755b49cc-cert\") pod \"ingress-canary-tvx6q\" (UID: \"cf952f8e-c033-4ad1-a839-92bb755b49cc\") " pod="openshift-ingress-canary/ingress-canary-tvx6q"
Apr 24 14:26:37.874924 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:37.874889 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cltlb" event={"ID":"2539fa5c-3160-43bd-a351-0184602b72e3","Type":"ContainerStarted","Data":"226ee1d02972c514e92e6741f6640a852eed7beead9bb6ff15880659cd1ab541"}
Apr 24 14:26:37.875108 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:37.875083 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cltlb"
Apr 24 14:26:37.877963 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:37.877937 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28","Type":"ContainerStarted","Data":"e601ffe86ea92c9f64847758fc8cea5cc1821f86dd5201e971c41b23c832e4fc"}
Apr 24 14:26:37.880536 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:37.880516 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cltlb"
Apr 24 14:26:37.889414 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:37.889353 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cltlb" podStartSLOduration=1.563332688 podStartE2EDuration="2.889341992s" podCreationTimestamp="2026-04-24 14:26:35 +0000 UTC" firstStartedPulling="2026-04-24 14:26:36.332164502 +0000 UTC m=+159.544015854" lastFinishedPulling="2026-04-24 14:26:37.658173792 +0000 UTC m=+160.870025158" observedRunningTime="2026-04-24 14:26:37.887817807 +0000 UTC m=+161.099669181" watchObservedRunningTime="2026-04-24 14:26:37.889341992 +0000 UTC m=+161.101193417"
Apr 24 14:26:37.924849 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:37.924724 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.037931184 podStartE2EDuration="6.924706145s" podCreationTimestamp="2026-04-24 14:26:31 +0000 UTC" firstStartedPulling="2026-04-24 14:26:32.772547898 +0000 UTC m=+155.984399255" lastFinishedPulling="2026-04-24 14:26:37.659322862 +0000 UTC m=+160.871174216" observedRunningTime="2026-04-24 14:26:37.923656112 +0000 UTC m=+161.135507488" watchObservedRunningTime="2026-04-24 14:26:37.924706145 +0000 UTC m=+161.136557521"
Apr 24 14:26:37.945606 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:37.943895 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m8ml5\""
Apr 24 14:26:37.952335 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:37.952315 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2ggts"
Apr 24 14:26:38.082239 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:38.082213 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2ggts"]
Apr 24 14:26:38.084501 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:26:38.084474 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f0062e0_6c81_4d0d_a829_f8f572d6038e.slice/crio-bc3d97e8b6a79669325749c0acb048c2d6191439130384b7afb011dd11a349a9 WatchSource:0}: Error finding container bc3d97e8b6a79669325749c0acb048c2d6191439130384b7afb011dd11a349a9: Status 404 returned error can't find the container with id bc3d97e8b6a79669325749c0acb048c2d6191439130384b7afb011dd11a349a9
Apr 24 14:26:38.882891 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:38.882840 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2ggts" event={"ID":"3f0062e0-6c81-4d0d-a829-f8f572d6038e","Type":"ContainerStarted","Data":"bc3d97e8b6a79669325749c0acb048c2d6191439130384b7afb011dd11a349a9"}
Apr 24 14:26:39.889365 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:39.889324 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2ggts" event={"ID":"3f0062e0-6c81-4d0d-a829-f8f572d6038e","Type":"ContainerStarted","Data":"c888477cc4cd7fcaf8266d385fcfcb497ea952d48d7df46215a792fca306b94c"}
Apr 24 14:26:39.889816 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:39.889376 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2ggts" event={"ID":"3f0062e0-6c81-4d0d-a829-f8f572d6038e","Type":"ContainerStarted","Data":"3bc927f7780011af41df1c933895ceebe37bddeae4f44afeb0a734c4beb2f51b"}
Apr 24 14:26:39.905262 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:39.904069 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2ggts"
podStartSLOduration=129.644470751 podStartE2EDuration="2m10.904052569s" podCreationTimestamp="2026-04-24 14:24:29 +0000 UTC" firstStartedPulling="2026-04-24 14:26:38.086281442 +0000 UTC m=+161.298132796" lastFinishedPulling="2026-04-24 14:26:39.34586326 +0000 UTC m=+162.557714614" observedRunningTime="2026-04-24 14:26:39.903082354 +0000 UTC m=+163.114933733" watchObservedRunningTime="2026-04-24 14:26:39.904052569 +0000 UTC m=+163.115903945" Apr 24 14:26:40.892660 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:40.892627 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2ggts" Apr 24 14:26:41.354734 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:41.354703 2574 scope.go:117] "RemoveContainer" containerID="14fb7deabd822e761fcf64962b9b639c4d3d83a4ed608e5554d91e2d65c02809" Apr 24 14:26:41.897181 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:41.897150 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 14:26:41.897584 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:41.897216 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" event={"ID":"ff3b99d4-3afa-4687-b6b7-7d3526edbcf4","Type":"ContainerStarted","Data":"76796af7196764f9180faf321d793fd2749026308efa1d89995ffca0c5f693cf"} Apr 24 14:26:41.897745 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:41.897718 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:26:41.912702 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:41.912651 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" podStartSLOduration=51.547635758 podStartE2EDuration="53.912636411s" 
podCreationTimestamp="2026-04-24 14:25:48 +0000 UTC" firstStartedPulling="2026-04-24 14:25:49.297161638 +0000 UTC m=+112.509013004" lastFinishedPulling="2026-04-24 14:25:51.662162305 +0000 UTC m=+114.874013657" observedRunningTime="2026-04-24 14:26:41.910707077 +0000 UTC m=+165.122558452" watchObservedRunningTime="2026-04-24 14:26:41.912636411 +0000 UTC m=+165.124487787" Apr 24 14:26:41.994314 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:41.994286 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-v7trz" Apr 24 14:26:42.172791 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:42.172705 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-xcrgl"] Apr 24 14:26:42.176137 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:42.176111 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-xcrgl" Apr 24 14:26:42.178364 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:42.178342 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-p8z9g\"" Apr 24 14:26:42.178526 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:42.178418 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 14:26:42.178526 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:42.178427 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 14:26:42.190243 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:42.190223 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-xcrgl"] Apr 24 14:26:42.230268 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:42.230246 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-htwk8\" (UniqueName: \"kubernetes.io/projected/149cb833-9aef-4e87-9532-449279ed8f7e-kube-api-access-htwk8\") pod \"downloads-6bcc868b7-xcrgl\" (UID: \"149cb833-9aef-4e87-9532-449279ed8f7e\") " pod="openshift-console/downloads-6bcc868b7-xcrgl" Apr 24 14:26:42.331163 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:42.331134 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htwk8\" (UniqueName: \"kubernetes.io/projected/149cb833-9aef-4e87-9532-449279ed8f7e-kube-api-access-htwk8\") pod \"downloads-6bcc868b7-xcrgl\" (UID: \"149cb833-9aef-4e87-9532-449279ed8f7e\") " pod="openshift-console/downloads-6bcc868b7-xcrgl" Apr 24 14:26:42.339151 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:42.339122 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htwk8\" (UniqueName: \"kubernetes.io/projected/149cb833-9aef-4e87-9532-449279ed8f7e-kube-api-access-htwk8\") pod \"downloads-6bcc868b7-xcrgl\" (UID: \"149cb833-9aef-4e87-9532-449279ed8f7e\") " pod="openshift-console/downloads-6bcc868b7-xcrgl" Apr 24 14:26:42.485066 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:42.484990 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-xcrgl" Apr 24 14:26:42.601352 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:42.601321 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-xcrgl"] Apr 24 14:26:42.604613 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:26:42.604588 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod149cb833_9aef_4e87_9532_449279ed8f7e.slice/crio-171a25bad9bd6e2a9ad5245e1a2b3af5720b8c55c31e3e41495cdb83af9f8034 WatchSource:0}: Error finding container 171a25bad9bd6e2a9ad5245e1a2b3af5720b8c55c31e3e41495cdb83af9f8034: Status 404 returned error can't find the container with id 171a25bad9bd6e2a9ad5245e1a2b3af5720b8c55c31e3e41495cdb83af9f8034 Apr 24 14:26:42.901645 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:42.901611 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-xcrgl" event={"ID":"149cb833-9aef-4e87-9532-449279ed8f7e","Type":"ContainerStarted","Data":"171a25bad9bd6e2a9ad5245e1a2b3af5720b8c55c31e3e41495cdb83af9f8034"} Apr 24 14:26:47.358311 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:47.358217 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tvx6q" Apr 24 14:26:47.360338 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:47.360314 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-84zkd\"" Apr 24 14:26:47.368997 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:47.368978 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tvx6q" Apr 24 14:26:47.505512 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:47.505452 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tvx6q"] Apr 24 14:26:47.510347 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:26:47.509517 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf952f8e_c033_4ad1_a839_92bb755b49cc.slice/crio-96a9f32652b13996caafd96dc46afa58b3fa1ef0fedcc0d4dad31a2fb4574c86 WatchSource:0}: Error finding container 96a9f32652b13996caafd96dc46afa58b3fa1ef0fedcc0d4dad31a2fb4574c86: Status 404 returned error can't find the container with id 96a9f32652b13996caafd96dc46afa58b3fa1ef0fedcc0d4dad31a2fb4574c86 Apr 24 14:26:47.920568 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:47.920525 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tvx6q" event={"ID":"cf952f8e-c033-4ad1-a839-92bb755b49cc","Type":"ContainerStarted","Data":"96a9f32652b13996caafd96dc46afa58b3fa1ef0fedcc0d4dad31a2fb4574c86"} Apr 24 14:26:49.928312 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:49.928274 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tvx6q" event={"ID":"cf952f8e-c033-4ad1-a839-92bb755b49cc","Type":"ContainerStarted","Data":"a18fe1ec9f98b06986278575fe91c7cb87f3297b3f16731887e5b15eea32c261"} Apr 24 14:26:49.945685 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:49.945620 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tvx6q" podStartSLOduration=139.326369032 podStartE2EDuration="2m20.9456028s" podCreationTimestamp="2026-04-24 14:24:29 +0000 UTC" firstStartedPulling="2026-04-24 14:26:47.512615926 +0000 UTC m=+170.724467279" lastFinishedPulling="2026-04-24 14:26:49.131849688 +0000 UTC m=+172.343701047" 
observedRunningTime="2026-04-24 14:26:49.944704256 +0000 UTC m=+173.156555628" watchObservedRunningTime="2026-04-24 14:26:49.9456028 +0000 UTC m=+173.157454176" Apr 24 14:26:50.900407 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:50.900358 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2ggts" Apr 24 14:26:54.177574 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.177534 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5995c847b4-5667k"] Apr 24 14:26:54.180531 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.180510 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.182678 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.182656 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 14:26:54.183238 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.183149 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 14:26:54.183238 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.183161 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 14:26:54.183238 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.183149 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 14:26:54.183238 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.183183 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8c8rc\"" Apr 24 14:26:54.183238 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.183170 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 
14:26:54.189787 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.189749 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5995c847b4-5667k"] Apr 24 14:26:54.333744 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.333713 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-serving-cert\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.333924 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.333797 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-config\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.333924 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.333829 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-service-ca\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.333924 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.333857 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-oauth-config\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.334113 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.334000 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8rvs\" (UniqueName: \"kubernetes.io/projected/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-kube-api-access-b8rvs\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.334113 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.334042 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-oauth-serving-cert\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.434694 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.434613 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-oauth-serving-cert\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.434694 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.434676 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-serving-cert\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.434917 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.434731 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-config\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " 
pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.434917 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.434759 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-service-ca\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.434917 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.434797 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-oauth-config\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.435069 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.434981 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8rvs\" (UniqueName: \"kubernetes.io/projected/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-kube-api-access-b8rvs\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.435454 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.435428 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-config\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.435611 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.435433 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-oauth-serving-cert\") pod \"console-5995c847b4-5667k\" 
(UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.435805 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.435780 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-service-ca\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.437533 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.437513 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-serving-cert\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.437709 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.437689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-oauth-config\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.442550 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.442527 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8rvs\" (UniqueName: \"kubernetes.io/projected/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-kube-api-access-b8rvs\") pod \"console-5995c847b4-5667k\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") " pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:26:54.491487 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:26:54.491438 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:27:00.324875 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:00.324845 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5995c847b4-5667k"] Apr 24 14:27:00.327847 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:27:00.327820 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f6f58cc_440c_431b_a0eb_e5ed60bcdd10.slice/crio-d3a87ab9cfb29fd95dec34c5a8e25fa2110da435f74be8a9a1e7403883a04b6d WatchSource:0}: Error finding container d3a87ab9cfb29fd95dec34c5a8e25fa2110da435f74be8a9a1e7403883a04b6d: Status 404 returned error can't find the container with id d3a87ab9cfb29fd95dec34c5a8e25fa2110da435f74be8a9a1e7403883a04b6d Apr 24 14:27:00.969828 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:00.969779 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-xcrgl" event={"ID":"149cb833-9aef-4e87-9532-449279ed8f7e","Type":"ContainerStarted","Data":"47a8ed329091d24e7354c866f6e4c997e0dcbc5bd5515df4045234a67e9d6aa2"} Apr 24 14:27:00.972657 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:00.972452 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-xcrgl" Apr 24 14:27:00.975036 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:00.975005 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5995c847b4-5667k" event={"ID":"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10","Type":"ContainerStarted","Data":"d3a87ab9cfb29fd95dec34c5a8e25fa2110da435f74be8a9a1e7403883a04b6d"} Apr 24 14:27:00.981711 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:00.981687 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-xcrgl" Apr 24 14:27:00.988818 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:00.988751 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-xcrgl" podStartSLOduration=1.290764962 podStartE2EDuration="18.988736604s" podCreationTimestamp="2026-04-24 14:26:42 +0000 UTC" firstStartedPulling="2026-04-24 14:26:42.60662057 +0000 UTC m=+165.818471923" lastFinishedPulling="2026-04-24 14:27:00.304592212 +0000 UTC m=+183.516443565" observedRunningTime="2026-04-24 14:27:00.986836576 +0000 UTC m=+184.198687951" watchObservedRunningTime="2026-04-24 14:27:00.988736604 +0000 UTC m=+184.200587980" Apr 24 14:27:03.986274 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:03.986227 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5995c847b4-5667k" event={"ID":"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10","Type":"ContainerStarted","Data":"38e04ef664b2f942cbc3c069555ab513fa321847e4a8514a0d6453f121280aaa"} Apr 24 14:27:04.001809 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.001715 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5995c847b4-5667k" podStartSLOduration=6.57891374 podStartE2EDuration="10.001696806s" podCreationTimestamp="2026-04-24 14:26:54 +0000 UTC" firstStartedPulling="2026-04-24 14:27:00.32960674 +0000 UTC m=+183.541458093" lastFinishedPulling="2026-04-24 14:27:03.752389794 +0000 UTC m=+186.964241159" observedRunningTime="2026-04-24 14:27:04.000446232 +0000 UTC m=+187.212297611" watchObservedRunningTime="2026-04-24 14:27:04.001696806 +0000 UTC m=+187.213548183" Apr 24 14:27:04.473040 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.473002 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59f697d754-wgwds"] Apr 24 14:27:04.491605 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.491576 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:27:04.491769 ip-10-0-131-216 kubenswrapper[2574]: I0424 
14:27:04.491611 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59f697d754-wgwds"] Apr 24 14:27:04.491769 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.491648 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:27:04.491769 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.491767 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59f697d754-wgwds" Apr 24 14:27:04.497260 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.497208 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5995c847b4-5667k" Apr 24 14:27:04.499163 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.499141 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 14:27:04.626317 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.626271 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-service-ca\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds" Apr 24 14:27:04.626317 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.626317 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d265cb03-77c2-4e91-ba30-777af584dfd1-console-oauth-config\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds" Apr 24 14:27:04.626551 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.626373 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-trusted-ca-bundle\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds" Apr 24 14:27:04.626551 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.626481 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8rjp\" (UniqueName: \"kubernetes.io/projected/d265cb03-77c2-4e91-ba30-777af584dfd1-kube-api-access-w8rjp\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds" Apr 24 14:27:04.626551 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.626525 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d265cb03-77c2-4e91-ba30-777af584dfd1-console-serving-cert\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds" Apr 24 14:27:04.626681 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.626549 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-console-config\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds" Apr 24 14:27:04.626681 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.626633 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-oauth-serving-cert\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds" Apr 24 
14:27:04.727191 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.727110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d265cb03-77c2-4e91-ba30-777af584dfd1-console-serving-cert\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:04.727191 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.727157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-console-config\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:04.727494 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.727203 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-oauth-serving-cert\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:04.727494 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.727263 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-service-ca\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:04.727494 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.727291 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d265cb03-77c2-4e91-ba30-777af584dfd1-console-oauth-config\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:04.727494 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.727330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-trusted-ca-bundle\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:04.727494 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.727357 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8rjp\" (UniqueName: \"kubernetes.io/projected/d265cb03-77c2-4e91-ba30-777af584dfd1-kube-api-access-w8rjp\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:04.728202 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.728148 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-oauth-serving-cert\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:04.728322 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.728289 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-trusted-ca-bundle\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:04.728676 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.728653 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-console-config\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:04.730000 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.729963 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d265cb03-77c2-4e91-ba30-777af584dfd1-console-oauth-config\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:04.730108 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.730086 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d265cb03-77c2-4e91-ba30-777af584dfd1-console-serving-cert\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:04.734989 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.734965 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8rjp\" (UniqueName: \"kubernetes.io/projected/d265cb03-77c2-4e91-ba30-777af584dfd1-kube-api-access-w8rjp\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:04.739448 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.739427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-service-ca\") pod \"console-59f697d754-wgwds\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:04.804553 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.804519 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:04.939717 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.939689 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59f697d754-wgwds"]
Apr 24 14:27:04.942663 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:27:04.942621 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd265cb03_77c2_4e91_ba30_777af584dfd1.slice/crio-3c3dfe019e50290c37846e2c23c12aae8518bb11241776f793462cf4dd36edb0 WatchSource:0}: Error finding container 3c3dfe019e50290c37846e2c23c12aae8518bb11241776f793462cf4dd36edb0: Status 404 returned error can't find the container with id 3c3dfe019e50290c37846e2c23c12aae8518bb11241776f793462cf4dd36edb0
Apr 24 14:27:04.990364 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.990279 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59f697d754-wgwds" event={"ID":"d265cb03-77c2-4e91-ba30-777af584dfd1","Type":"ContainerStarted","Data":"3c3dfe019e50290c37846e2c23c12aae8518bb11241776f793462cf4dd36edb0"}
Apr 24 14:27:04.995108 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:04.995088 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5995c847b4-5667k"
Apr 24 14:27:05.995083 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:05.995043 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59f697d754-wgwds" event={"ID":"d265cb03-77c2-4e91-ba30-777af584dfd1","Type":"ContainerStarted","Data":"5119d5a54c2f15e7ae6046da6bccb758c3fe5104cedd192ddf1f772a1f57d981"}
Apr 24 14:27:06.011226 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:06.011160 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59f697d754-wgwds" podStartSLOduration=2.011142356 podStartE2EDuration="2.011142356s" podCreationTimestamp="2026-04-24 14:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:27:06.010066845 +0000 UTC m=+189.221918232" watchObservedRunningTime="2026-04-24 14:27:06.011142356 +0000 UTC m=+189.222993734"
Apr 24 14:27:09.773735 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:09.773709 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7503d8e2-7236-48d2-b4d5-1cd9cdc2da28/init-config-reloader/0.log"
Apr 24 14:27:09.973657 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:09.973624 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7503d8e2-7236-48d2-b4d5-1cd9cdc2da28/alertmanager/0.log"
Apr 24 14:27:10.173650 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:10.173620 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7503d8e2-7236-48d2-b4d5-1cd9cdc2da28/config-reloader/0.log"
Apr 24 14:27:10.373906 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:10.373873 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7503d8e2-7236-48d2-b4d5-1cd9cdc2da28/kube-rbac-proxy-web/0.log"
Apr 24 14:27:10.574535 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:10.574512 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7503d8e2-7236-48d2-b4d5-1cd9cdc2da28/kube-rbac-proxy/0.log"
Apr 24 14:27:10.776923 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:10.776895 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7503d8e2-7236-48d2-b4d5-1cd9cdc2da28/kube-rbac-proxy-metric/0.log"
Apr 24 14:27:10.973769 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:10.973704 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7503d8e2-7236-48d2-b4d5-1cd9cdc2da28/prom-label-proxy/0.log"
Apr 24 14:27:11.373437 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:11.373409 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-drv2m_7d61f999-ef5b-4a64-b56f-54f94755779c/kube-state-metrics/0.log"
Apr 24 14:27:11.574950 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:11.574923 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-drv2m_7d61f999-ef5b-4a64-b56f-54f94755779c/kube-rbac-proxy-main/0.log"
Apr 24 14:27:11.772959 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:11.772888 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-drv2m_7d61f999-ef5b-4a64-b56f-54f94755779c/kube-rbac-proxy-self/0.log"
Apr 24 14:27:12.173203 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:12.173171 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-cltlb_2539fa5c-3160-43bd-a351-0184602b72e3/monitoring-plugin/0.log"
Apr 24 14:27:13.573173 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:13.573150 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xrfbk_990c1e6d-4603-492a-b0d1-b0d498ef3c6e/init-textfile/0.log"
Apr 24 14:27:13.773832 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:13.773807 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xrfbk_990c1e6d-4603-492a-b0d1-b0d498ef3c6e/node-exporter/0.log"
Apr 24 14:27:13.973572 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:13.973500 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xrfbk_990c1e6d-4603-492a-b0d1-b0d498ef3c6e/kube-rbac-proxy/0.log"
Apr 24 14:27:14.174620 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:14.174586 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t8fcs_ca3432a7-7fcd-4793-933f-b84d886dc761/kube-rbac-proxy-main/0.log"
Apr 24 14:27:14.373081 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:14.373052 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t8fcs_ca3432a7-7fcd-4793-933f-b84d886dc761/kube-rbac-proxy-self/0.log"
Apr 24 14:27:14.572996 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:14.572970 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t8fcs_ca3432a7-7fcd-4793-933f-b84d886dc761/openshift-state-metrics/0.log"
Apr 24 14:27:14.805381 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:14.805357 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:14.805783 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:14.805446 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:14.810228 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:14.810209 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:15.025657 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:15.025629 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59f697d754-wgwds"
Apr 24 14:27:15.071233 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:15.071152 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5995c847b4-5667k"]
Apr 24 14:27:16.174255 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:16.174214 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ls5jw_9374d6dc-31b7-464b-a614-4cd5ce83fdbb/prometheus-operator/0.log"
Apr 24 14:27:16.375672 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:16.375645 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ls5jw_9374d6dc-31b7-464b-a614-4cd5ce83fdbb/kube-rbac-proxy/0.log"
Apr 24 14:27:16.573286 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:16.573262 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-z2grl_2a817927-9d20-4e56-a0bf-0223603b5b85/prometheus-operator-admission-webhook/0.log"
Apr 24 14:27:18.573790 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:18.573757 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-hwdq2_0ad0da81-ab22-438d-911a-36e1a74dba1f/networking-console-plugin/0.log"
Apr 24 14:27:18.773244 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:18.773215 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log"
Apr 24 14:27:18.975134 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:18.975065 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/3.log"
Apr 24 14:27:19.173714 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:19.173687 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5995c847b4-5667k_0f6f58cc-440c-431b-a0eb-e5ed60bcdd10/console/0.log"
Apr 24 14:27:19.373573 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:19.373544 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59f697d754-wgwds_d265cb03-77c2-4e91-ba30-777af584dfd1/console/0.log"
Apr 24 14:27:19.574094 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:19.574066 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-xcrgl_149cb833-9aef-4e87-9532-449279ed8f7e/download-server/0.log"
Apr 24 14:27:19.773657 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:19.773594 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-577fb5f5fd-t2ghs_8a1f01af-d685-4103-bebf-0d55fcb83c35/router/0.log"
Apr 24 14:27:20.374417 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:20.374373 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-tvx6q_cf952f8e-c033-4ad1-a839-92bb755b49cc/serve-healthcheck-canary/0.log"
Apr 24 14:27:23.054057 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:23.054013 2574 generic.go:358] "Generic (PLEG): container finished" podID="49abdf75-9c98-4426-953c-83a9aa6a3869" containerID="e53d23e5acb2b51d6926ffe6c14702d341769c0a57c1da8b9b70b25ea6b03f72" exitCode=0
Apr 24 14:27:23.054466 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:23.054080 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" event={"ID":"49abdf75-9c98-4426-953c-83a9aa6a3869","Type":"ContainerDied","Data":"e53d23e5acb2b51d6926ffe6c14702d341769c0a57c1da8b9b70b25ea6b03f72"}
Apr 24 14:27:23.054466 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:23.054435 2574 scope.go:117] "RemoveContainer" containerID="e53d23e5acb2b51d6926ffe6c14702d341769c0a57c1da8b9b70b25ea6b03f72"
Apr 24 14:27:24.067978 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:24.067940 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g5f7n" event={"ID":"49abdf75-9c98-4426-953c-83a9aa6a3869","Type":"ContainerStarted","Data":"a2bc8c5e9fb64d3b7b16469dea6521b85f9693f910f060496e8660cab9ea2576"}
Apr 24 14:27:40.092989 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.092931 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5995c847b4-5667k" podUID="0f6f58cc-440c-431b-a0eb-e5ed60bcdd10" containerName="console" containerID="cri-o://38e04ef664b2f942cbc3c069555ab513fa321847e4a8514a0d6453f121280aaa" gracePeriod=15
Apr 24 14:27:40.366935 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.366914 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5995c847b4-5667k_0f6f58cc-440c-431b-a0eb-e5ed60bcdd10/console/0.log"
Apr 24 14:27:40.367043 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.366972 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5995c847b4-5667k"
Apr 24 14:27:40.451829 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.451797 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-service-ca\") pod \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") "
Apr 24 14:27:40.451981 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.451846 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-oauth-config\") pod \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") "
Apr 24 14:27:40.451981 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.451873 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-oauth-serving-cert\") pod \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") "
Apr 24 14:27:40.451981 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.451904 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-config\") pod \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") "
Apr 24 14:27:40.451981 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.451921 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8rvs\" (UniqueName: \"kubernetes.io/projected/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-kube-api-access-b8rvs\") pod \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") "
Apr 24 14:27:40.451981 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.451945 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-serving-cert\") pod \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\" (UID: \"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10\") "
Apr 24 14:27:40.452285 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.452256 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-service-ca" (OuterVolumeSpecName: "service-ca") pod "0f6f58cc-440c-431b-a0eb-e5ed60bcdd10" (UID: "0f6f58cc-440c-431b-a0eb-e5ed60bcdd10"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:27:40.452349 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.452304 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0f6f58cc-440c-431b-a0eb-e5ed60bcdd10" (UID: "0f6f58cc-440c-431b-a0eb-e5ed60bcdd10"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:27:40.452349 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.452328 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-config" (OuterVolumeSpecName: "console-config") pod "0f6f58cc-440c-431b-a0eb-e5ed60bcdd10" (UID: "0f6f58cc-440c-431b-a0eb-e5ed60bcdd10"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:27:40.454234 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.454202 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0f6f58cc-440c-431b-a0eb-e5ed60bcdd10" (UID: "0f6f58cc-440c-431b-a0eb-e5ed60bcdd10"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:40.454336 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.454234 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-kube-api-access-b8rvs" (OuterVolumeSpecName: "kube-api-access-b8rvs") pod "0f6f58cc-440c-431b-a0eb-e5ed60bcdd10" (UID: "0f6f58cc-440c-431b-a0eb-e5ed60bcdd10"). InnerVolumeSpecName "kube-api-access-b8rvs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:27:40.454336 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.454250 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0f6f58cc-440c-431b-a0eb-e5ed60bcdd10" (UID: "0f6f58cc-440c-431b-a0eb-e5ed60bcdd10"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:40.553342 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.553321 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-serving-cert\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:40.553342 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.553342 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-service-ca\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:40.553490 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.553352 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-oauth-config\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:40.553490 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.553361 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-oauth-serving-cert\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:40.553490 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.553369 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-console-config\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:40.553490 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:40.553378 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b8rvs\" (UniqueName: \"kubernetes.io/projected/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10-kube-api-access-b8rvs\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:41.116529 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:41.116501 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5995c847b4-5667k_0f6f58cc-440c-431b-a0eb-e5ed60bcdd10/console/0.log"
Apr 24 14:27:41.116907 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:41.116542 2574 generic.go:358] "Generic (PLEG): container finished" podID="0f6f58cc-440c-431b-a0eb-e5ed60bcdd10" containerID="38e04ef664b2f942cbc3c069555ab513fa321847e4a8514a0d6453f121280aaa" exitCode=2
Apr 24 14:27:41.116907 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:41.116600 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5995c847b4-5667k" event={"ID":"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10","Type":"ContainerDied","Data":"38e04ef664b2f942cbc3c069555ab513fa321847e4a8514a0d6453f121280aaa"}
Apr 24 14:27:41.116907 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:41.116617 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5995c847b4-5667k"
Apr 24 14:27:41.116907 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:41.116632 2574 scope.go:117] "RemoveContainer" containerID="38e04ef664b2f942cbc3c069555ab513fa321847e4a8514a0d6453f121280aaa"
Apr 24 14:27:41.116907 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:41.116623 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5995c847b4-5667k" event={"ID":"0f6f58cc-440c-431b-a0eb-e5ed60bcdd10","Type":"ContainerDied","Data":"d3a87ab9cfb29fd95dec34c5a8e25fa2110da435f74be8a9a1e7403883a04b6d"}
Apr 24 14:27:41.129296 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:41.129277 2574 scope.go:117] "RemoveContainer" containerID="38e04ef664b2f942cbc3c069555ab513fa321847e4a8514a0d6453f121280aaa"
Apr 24 14:27:41.129588 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:27:41.129566 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e04ef664b2f942cbc3c069555ab513fa321847e4a8514a0d6453f121280aaa\": container with ID starting with 38e04ef664b2f942cbc3c069555ab513fa321847e4a8514a0d6453f121280aaa not found: ID does not exist" containerID="38e04ef664b2f942cbc3c069555ab513fa321847e4a8514a0d6453f121280aaa"
Apr 24 14:27:41.129644 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:41.129595 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e04ef664b2f942cbc3c069555ab513fa321847e4a8514a0d6453f121280aaa"} err="failed to get container status \"38e04ef664b2f942cbc3c069555ab513fa321847e4a8514a0d6453f121280aaa\": rpc error: code = NotFound desc = could not find container \"38e04ef664b2f942cbc3c069555ab513fa321847e4a8514a0d6453f121280aaa\": container with ID starting with 38e04ef664b2f942cbc3c069555ab513fa321847e4a8514a0d6453f121280aaa not found: ID does not exist"
Apr 24 14:27:41.139756 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:41.139734 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5995c847b4-5667k"]
Apr 24 14:27:41.142984 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:41.142961 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5995c847b4-5667k"]
Apr 24 14:27:41.358892 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:41.358862 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f6f58cc-440c-431b-a0eb-e5ed60bcdd10" path="/var/lib/kubelet/pods/0f6f58cc-440c-431b-a0eb-e5ed60bcdd10/volumes"
Apr 24 14:27:50.856630 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:50.856589 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 14:27:50.857282 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:50.857223 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="alertmanager" containerID="cri-o://d840ca29c0bfe3c714623f081a777ab5d01d18d8409ceb4a3312d2ec53564e1e" gracePeriod=120
Apr 24 14:27:50.857445 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:50.857261 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="kube-rbac-proxy-metric" containerID="cri-o://d0305280267eb7812a50d3b2a8e41784ec1b489929a0afd4a134052cecac297f" gracePeriod=120
Apr 24 14:27:50.857445 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:50.857286 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="prom-label-proxy" containerID="cri-o://e601ffe86ea92c9f64847758fc8cea5cc1821f86dd5201e971c41b23c832e4fc" gracePeriod=120
Apr 24 14:27:50.857445 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:50.857312 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="kube-rbac-proxy-web" containerID="cri-o://aed7b910ed3dc245bdcb1be26b1d76371199864d1a707b1fee85cb054c77a62f" gracePeriod=120
Apr 24 14:27:50.857445 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:50.857360 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="kube-rbac-proxy" containerID="cri-o://d1c04aca3766264d186a42c4d54c4a5be19d686b82f4af776c65555dba3c0c3a" gracePeriod=120
Apr 24 14:27:50.857445 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:50.857380 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="config-reloader" containerID="cri-o://37fd1406b91ceb8ed6656b082f7461b22d39d6100b4655a25bcca1cdb6c63cb9" gracePeriod=120
Apr 24 14:27:51.148409 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:51.148319 2574 generic.go:358] "Generic (PLEG): container finished" podID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerID="e601ffe86ea92c9f64847758fc8cea5cc1821f86dd5201e971c41b23c832e4fc" exitCode=0
Apr 24 14:27:51.148409 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:51.148344 2574 generic.go:358] "Generic (PLEG): container finished" podID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerID="d0305280267eb7812a50d3b2a8e41784ec1b489929a0afd4a134052cecac297f" exitCode=0
Apr 24 14:27:51.148409 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:51.148350 2574 generic.go:358] "Generic (PLEG): container finished" podID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerID="d1c04aca3766264d186a42c4d54c4a5be19d686b82f4af776c65555dba3c0c3a" exitCode=0
Apr 24 14:27:51.148409 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:51.148355 2574 generic.go:358] "Generic (PLEG): container finished" podID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerID="37fd1406b91ceb8ed6656b082f7461b22d39d6100b4655a25bcca1cdb6c63cb9" exitCode=0
Apr 24 14:27:51.148409 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:51.148360 2574 generic.go:358] "Generic (PLEG): container finished" podID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerID="d840ca29c0bfe3c714623f081a777ab5d01d18d8409ceb4a3312d2ec53564e1e" exitCode=0
Apr 24 14:27:51.148652 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:51.148414 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28","Type":"ContainerDied","Data":"e601ffe86ea92c9f64847758fc8cea5cc1821f86dd5201e971c41b23c832e4fc"}
Apr 24 14:27:51.148652 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:51.148449 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28","Type":"ContainerDied","Data":"d0305280267eb7812a50d3b2a8e41784ec1b489929a0afd4a134052cecac297f"}
Apr 24 14:27:51.148652 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:51.148460 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28","Type":"ContainerDied","Data":"d1c04aca3766264d186a42c4d54c4a5be19d686b82f4af776c65555dba3c0c3a"}
Apr 24 14:27:51.148652 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:51.148470 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28","Type":"ContainerDied","Data":"37fd1406b91ceb8ed6656b082f7461b22d39d6100b4655a25bcca1cdb6c63cb9"}
Apr 24 14:27:51.148652 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:51.148479 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28","Type":"ContainerDied","Data":"d840ca29c0bfe3c714623f081a777ab5d01d18d8409ceb4a3312d2ec53564e1e"}
Apr 24 14:27:52.085710 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.085688 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146351 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-ddf66748b-4gqzz"]
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146761 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="prom-label-proxy"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146777 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="prom-label-proxy"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146800 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="init-config-reloader"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146809 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="init-config-reloader"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146820 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="kube-rbac-proxy-metric"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146828 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="kube-rbac-proxy-metric"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146839 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="kube-rbac-proxy"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146848 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="kube-rbac-proxy"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146864 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="alertmanager"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146873 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="alertmanager"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146883 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="config-reloader"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146891 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="config-reloader"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146901 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="kube-rbac-proxy-web"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146909 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="kube-rbac-proxy-web"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146919 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f6f58cc-440c-431b-a0eb-e5ed60bcdd10" containerName="console"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.146928 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6f58cc-440c-431b-a0eb-e5ed60bcdd10" containerName="console"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.147029 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="kube-rbac-proxy"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.147040 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="prom-label-proxy"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.147051 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="config-reloader"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.147062 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="alertmanager"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.147071 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="kube-rbac-proxy-web"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.147082 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerName="kube-rbac-proxy-metric"
Apr 24 14:27:52.147420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.147090 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f6f58cc-440c-431b-a0eb-e5ed60bcdd10" containerName="console"
Apr 24 14:27:52.153008 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.152971 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-ddf66748b-4gqzz" Apr 24 14:27:52.159691 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.159663 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ddf66748b-4gqzz"] Apr 24 14:27:52.160283 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.160105 2574 generic.go:358] "Generic (PLEG): container finished" podID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" containerID="aed7b910ed3dc245bdcb1be26b1d76371199864d1a707b1fee85cb054c77a62f" exitCode=0 Apr 24 14:27:52.160283 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.160225 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28","Type":"ContainerDied","Data":"aed7b910ed3dc245bdcb1be26b1d76371199864d1a707b1fee85cb054c77a62f"} Apr 24 14:27:52.160283 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.160254 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28","Type":"ContainerDied","Data":"71eed2f49f57fbb95598569e1f3c526339b175ef7d377814a9bc307affb02b6a"} Apr 24 14:27:52.160283 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.160279 2574 scope.go:117] "RemoveContainer" containerID="e601ffe86ea92c9f64847758fc8cea5cc1821f86dd5201e971c41b23c832e4fc" Apr 24 14:27:52.160592 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.160512 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 14:27:52.170377 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.170349 2574 scope.go:117] "RemoveContainer" containerID="d0305280267eb7812a50d3b2a8e41784ec1b489929a0afd4a134052cecac297f" Apr 24 14:27:52.180343 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.180325 2574 scope.go:117] "RemoveContainer" containerID="d1c04aca3766264d186a42c4d54c4a5be19d686b82f4af776c65555dba3c0c3a" Apr 24 14:27:52.187328 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.187300 2574 scope.go:117] "RemoveContainer" containerID="aed7b910ed3dc245bdcb1be26b1d76371199864d1a707b1fee85cb054c77a62f" Apr 24 14:27:52.193855 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.193832 2574 scope.go:117] "RemoveContainer" containerID="37fd1406b91ceb8ed6656b082f7461b22d39d6100b4655a25bcca1cdb6c63cb9" Apr 24 14:27:52.200045 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.200029 2574 scope.go:117] "RemoveContainer" containerID="d840ca29c0bfe3c714623f081a777ab5d01d18d8409ceb4a3312d2ec53564e1e" Apr 24 14:27:52.206833 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.206811 2574 scope.go:117] "RemoveContainer" containerID="d452626b8ba1262a2ab644345db1a7fb747c82fa608361b38fe8e9ba8b9e1aca" Apr 24 14:27:52.213194 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.213180 2574 scope.go:117] "RemoveContainer" containerID="e601ffe86ea92c9f64847758fc8cea5cc1821f86dd5201e971c41b23c832e4fc" Apr 24 14:27:52.213434 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:27:52.213415 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e601ffe86ea92c9f64847758fc8cea5cc1821f86dd5201e971c41b23c832e4fc\": container with ID starting with e601ffe86ea92c9f64847758fc8cea5cc1821f86dd5201e971c41b23c832e4fc not found: ID does not exist" containerID="e601ffe86ea92c9f64847758fc8cea5cc1821f86dd5201e971c41b23c832e4fc" Apr 24 14:27:52.213488 ip-10-0-131-216 
kubenswrapper[2574]: I0424 14:27:52.213443 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e601ffe86ea92c9f64847758fc8cea5cc1821f86dd5201e971c41b23c832e4fc"} err="failed to get container status \"e601ffe86ea92c9f64847758fc8cea5cc1821f86dd5201e971c41b23c832e4fc\": rpc error: code = NotFound desc = could not find container \"e601ffe86ea92c9f64847758fc8cea5cc1821f86dd5201e971c41b23c832e4fc\": container with ID starting with e601ffe86ea92c9f64847758fc8cea5cc1821f86dd5201e971c41b23c832e4fc not found: ID does not exist" Apr 24 14:27:52.213488 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.213460 2574 scope.go:117] "RemoveContainer" containerID="d0305280267eb7812a50d3b2a8e41784ec1b489929a0afd4a134052cecac297f" Apr 24 14:27:52.213676 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:27:52.213662 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0305280267eb7812a50d3b2a8e41784ec1b489929a0afd4a134052cecac297f\": container with ID starting with d0305280267eb7812a50d3b2a8e41784ec1b489929a0afd4a134052cecac297f not found: ID does not exist" containerID="d0305280267eb7812a50d3b2a8e41784ec1b489929a0afd4a134052cecac297f" Apr 24 14:27:52.213719 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.213679 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0305280267eb7812a50d3b2a8e41784ec1b489929a0afd4a134052cecac297f"} err="failed to get container status \"d0305280267eb7812a50d3b2a8e41784ec1b489929a0afd4a134052cecac297f\": rpc error: code = NotFound desc = could not find container \"d0305280267eb7812a50d3b2a8e41784ec1b489929a0afd4a134052cecac297f\": container with ID starting with d0305280267eb7812a50d3b2a8e41784ec1b489929a0afd4a134052cecac297f not found: ID does not exist" Apr 24 14:27:52.213719 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.213691 2574 scope.go:117] "RemoveContainer" 
containerID="d1c04aca3766264d186a42c4d54c4a5be19d686b82f4af776c65555dba3c0c3a" Apr 24 14:27:52.213912 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:27:52.213892 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1c04aca3766264d186a42c4d54c4a5be19d686b82f4af776c65555dba3c0c3a\": container with ID starting with d1c04aca3766264d186a42c4d54c4a5be19d686b82f4af776c65555dba3c0c3a not found: ID does not exist" containerID="d1c04aca3766264d186a42c4d54c4a5be19d686b82f4af776c65555dba3c0c3a" Apr 24 14:27:52.213953 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.213916 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1c04aca3766264d186a42c4d54c4a5be19d686b82f4af776c65555dba3c0c3a"} err="failed to get container status \"d1c04aca3766264d186a42c4d54c4a5be19d686b82f4af776c65555dba3c0c3a\": rpc error: code = NotFound desc = could not find container \"d1c04aca3766264d186a42c4d54c4a5be19d686b82f4af776c65555dba3c0c3a\": container with ID starting with d1c04aca3766264d186a42c4d54c4a5be19d686b82f4af776c65555dba3c0c3a not found: ID does not exist" Apr 24 14:27:52.213953 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.213932 2574 scope.go:117] "RemoveContainer" containerID="aed7b910ed3dc245bdcb1be26b1d76371199864d1a707b1fee85cb054c77a62f" Apr 24 14:27:52.214144 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:27:52.214126 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aed7b910ed3dc245bdcb1be26b1d76371199864d1a707b1fee85cb054c77a62f\": container with ID starting with aed7b910ed3dc245bdcb1be26b1d76371199864d1a707b1fee85cb054c77a62f not found: ID does not exist" containerID="aed7b910ed3dc245bdcb1be26b1d76371199864d1a707b1fee85cb054c77a62f" Apr 24 14:27:52.214184 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.214150 2574 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"aed7b910ed3dc245bdcb1be26b1d76371199864d1a707b1fee85cb054c77a62f"} err="failed to get container status \"aed7b910ed3dc245bdcb1be26b1d76371199864d1a707b1fee85cb054c77a62f\": rpc error: code = NotFound desc = could not find container \"aed7b910ed3dc245bdcb1be26b1d76371199864d1a707b1fee85cb054c77a62f\": container with ID starting with aed7b910ed3dc245bdcb1be26b1d76371199864d1a707b1fee85cb054c77a62f not found: ID does not exist" Apr 24 14:27:52.214184 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.214164 2574 scope.go:117] "RemoveContainer" containerID="37fd1406b91ceb8ed6656b082f7461b22d39d6100b4655a25bcca1cdb6c63cb9" Apr 24 14:27:52.214363 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:27:52.214347 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37fd1406b91ceb8ed6656b082f7461b22d39d6100b4655a25bcca1cdb6c63cb9\": container with ID starting with 37fd1406b91ceb8ed6656b082f7461b22d39d6100b4655a25bcca1cdb6c63cb9 not found: ID does not exist" containerID="37fd1406b91ceb8ed6656b082f7461b22d39d6100b4655a25bcca1cdb6c63cb9" Apr 24 14:27:52.214411 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.214366 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37fd1406b91ceb8ed6656b082f7461b22d39d6100b4655a25bcca1cdb6c63cb9"} err="failed to get container status \"37fd1406b91ceb8ed6656b082f7461b22d39d6100b4655a25bcca1cdb6c63cb9\": rpc error: code = NotFound desc = could not find container \"37fd1406b91ceb8ed6656b082f7461b22d39d6100b4655a25bcca1cdb6c63cb9\": container with ID starting with 37fd1406b91ceb8ed6656b082f7461b22d39d6100b4655a25bcca1cdb6c63cb9 not found: ID does not exist" Apr 24 14:27:52.214411 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.214380 2574 scope.go:117] "RemoveContainer" containerID="d840ca29c0bfe3c714623f081a777ab5d01d18d8409ceb4a3312d2ec53564e1e" Apr 24 14:27:52.214600 
ip-10-0-131-216 kubenswrapper[2574]: E0424 14:27:52.214585 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d840ca29c0bfe3c714623f081a777ab5d01d18d8409ceb4a3312d2ec53564e1e\": container with ID starting with d840ca29c0bfe3c714623f081a777ab5d01d18d8409ceb4a3312d2ec53564e1e not found: ID does not exist" containerID="d840ca29c0bfe3c714623f081a777ab5d01d18d8409ceb4a3312d2ec53564e1e" Apr 24 14:27:52.214636 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.214603 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d840ca29c0bfe3c714623f081a777ab5d01d18d8409ceb4a3312d2ec53564e1e"} err="failed to get container status \"d840ca29c0bfe3c714623f081a777ab5d01d18d8409ceb4a3312d2ec53564e1e\": rpc error: code = NotFound desc = could not find container \"d840ca29c0bfe3c714623f081a777ab5d01d18d8409ceb4a3312d2ec53564e1e\": container with ID starting with d840ca29c0bfe3c714623f081a777ab5d01d18d8409ceb4a3312d2ec53564e1e not found: ID does not exist" Apr 24 14:27:52.214636 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.214615 2574 scope.go:117] "RemoveContainer" containerID="d452626b8ba1262a2ab644345db1a7fb747c82fa608361b38fe8e9ba8b9e1aca" Apr 24 14:27:52.214802 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:27:52.214787 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d452626b8ba1262a2ab644345db1a7fb747c82fa608361b38fe8e9ba8b9e1aca\": container with ID starting with d452626b8ba1262a2ab644345db1a7fb747c82fa608361b38fe8e9ba8b9e1aca not found: ID does not exist" containerID="d452626b8ba1262a2ab644345db1a7fb747c82fa608361b38fe8e9ba8b9e1aca" Apr 24 14:27:52.214846 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.214808 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d452626b8ba1262a2ab644345db1a7fb747c82fa608361b38fe8e9ba8b9e1aca"} err="failed to get container status \"d452626b8ba1262a2ab644345db1a7fb747c82fa608361b38fe8e9ba8b9e1aca\": rpc error: code = NotFound desc = could not find container \"d452626b8ba1262a2ab644345db1a7fb747c82fa608361b38fe8e9ba8b9e1aca\": container with ID starting with d452626b8ba1262a2ab644345db1a7fb747c82fa608361b38fe8e9ba8b9e1aca not found: ID does not exist" Apr 24 14:27:52.251113 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251084 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-main-tls\") pod \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " Apr 24 14:27:52.251226 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251127 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-alertmanager-main-db\") pod \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " Apr 24 14:27:52.251226 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251162 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-config-volume\") pod \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " Apr 24 14:27:52.251226 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251197 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy\") pod \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\" (UID: 
\"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " Apr 24 14:27:52.251226 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251223 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-alertmanager-trusted-ca-bundle\") pod \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " Apr 24 14:27:52.251464 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251255 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-metrics-client-ca\") pod \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " Apr 24 14:27:52.251464 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251293 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-config-out\") pod \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " Apr 24 14:27:52.251464 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251321 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-web-config\") pod \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " Apr 24 14:27:52.251464 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251346 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfpw7\" (UniqueName: \"kubernetes.io/projected/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-kube-api-access-dfpw7\") pod \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " Apr 24 14:27:52.251464 ip-10-0-131-216 
kubenswrapper[2574]: I0424 14:27:52.251411 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy-web\") pod \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " Apr 24 14:27:52.251464 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251446 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy-metric\") pod \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " Apr 24 14:27:52.251735 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251480 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-tls-assets\") pod \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " Apr 24 14:27:52.251735 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251506 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-cluster-tls-config\") pod \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\" (UID: \"7503d8e2-7236-48d2-b4d5-1cd9cdc2da28\") " Apr 24 14:27:52.251735 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251635 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-service-ca\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz" Apr 24 14:27:52.251735 
ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251677 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-serving-cert\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz" Apr 24 14:27:52.251735 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251716 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-trusted-ca-bundle\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz" Apr 24 14:27:52.251971 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251709 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" (UID: "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:27:52.251971 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251785 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-config\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz" Apr 24 14:27:52.251971 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251833 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-oauth-config\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz" Apr 24 14:27:52.251971 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251864 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m77c8\" (UniqueName: \"kubernetes.io/projected/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-kube-api-access-m77c8\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz" Apr 24 14:27:52.251971 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251920 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-oauth-serving-cert\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz" Apr 24 14:27:52.252212 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.251984 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-alertmanager-main-db\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:27:52.252212 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.252090 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" (UID: "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:27:52.252631 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.252602 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" (UID: "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:27:52.253911 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.253872 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" (UID: "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:27:52.254020 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.253916 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-config-volume" (OuterVolumeSpecName: "config-volume") pod "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" (UID: "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:27:52.254817 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.254778 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" (UID: "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:27:52.254903 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.254810 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-config-out" (OuterVolumeSpecName: "config-out") pod "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" (UID: "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:27:52.254973 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.254946 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" (UID: "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:27:52.255030 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.254999 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" (UID: "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:27:52.255137 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.255119 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" (UID: "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:52.255790 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.255770 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-kube-api-access-dfpw7" (OuterVolumeSpecName: "kube-api-access-dfpw7") pod "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" (UID: "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28"). InnerVolumeSpecName "kube-api-access-dfpw7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:27:52.259296 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.259158 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" (UID: "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:52.265875 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.265855 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-web-config" (OuterVolumeSpecName: "web-config") pod "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" (UID: "7503d8e2-7236-48d2-b4d5-1cd9cdc2da28"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:52.352752 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.352722 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-service-ca\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:27:52.352868 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.352765 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-serving-cert\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:27:52.352868 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.352797 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-trusted-ca-bundle\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:27:52.352868 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.352837 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-config\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:27:52.353028 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.352876 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-oauth-config\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:27:52.353028 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.352903 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m77c8\" (UniqueName: \"kubernetes.io/projected/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-kube-api-access-m77c8\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:27:52.353028 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.352951 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-oauth-serving-cert\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:27:52.353028 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.353004 2574 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-config-volume\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:52.353028 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.353021 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:52.353268 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.353036 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:52.353268 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.353051 2574 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-metrics-client-ca\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:52.353268 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.353068 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-config-out\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:52.353268 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.353082 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-web-config\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:52.353268 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.353096 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dfpw7\" (UniqueName: \"kubernetes.io/projected/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-kube-api-access-dfpw7\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:52.353268 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.353112 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:52.353268 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.353128 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:52.353268 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.353144 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-tls-assets\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:52.353268 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.353160 2574 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-cluster-tls-config\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:52.353268 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.353175 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28-secret-alertmanager-main-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:27:52.353787 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.353552 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-service-ca\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:27:52.353840 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.353787 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-oauth-serving-cert\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:27:52.353986 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.353955 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-config\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:27:52.354045 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.354023 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-trusted-ca-bundle\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:27:52.354961 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.354940 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-serving-cert\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:27:52.355208 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.355190 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-oauth-config\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:27:52.360427 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.360383 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m77c8\" (UniqueName: \"kubernetes.io/projected/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-kube-api-access-m77c8\") pod \"console-ddf66748b-4gqzz\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:27:52.470364 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.469212 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:27:52.487508 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.487484 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 14:27:52.491877 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.491846 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 14:27:52.520544 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.518134 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 14:27:52.525068 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.524978 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.528045 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.527681 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 24 14:27:52.528045 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.527710 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 24 14:27:52.528045 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.527744 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 24 14:27:52.528045 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.527819 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-876pr\""
Apr 24 14:27:52.528045 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.527896 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 24 14:27:52.528045 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.527974 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 24 14:27:52.528457 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.528193 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 24 14:27:52.528457 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.528244 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 24 14:27:52.528457 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.528313 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 24 14:27:52.538320 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.537507 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 24 14:27:52.540121 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.540094 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 14:27:52.608796 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.608758 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ddf66748b-4gqzz"]
Apr 24 14:27:52.611459 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:27:52.611435 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc30c1dd8_3f26_4f33_a121_f2db2dad2baf.slice/crio-92698f95121f0320cd662a46211d5489009a27475252c310897c1a2568334838 WatchSource:0}: Error finding container 92698f95121f0320cd662a46211d5489009a27475252c310897c1a2568334838: Status 404 returned error can't find the container with id 92698f95121f0320cd662a46211d5489009a27475252c310897c1a2568334838
Apr 24 14:27:52.654806 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.654785 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-web-config\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.654896 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.654820 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.654896 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.654838 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.654986 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.654912 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8715f464-8cc2-459a-8616-97623080dd16-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.654986 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.654965 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.655045 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.654987 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-config-volume\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.655045 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.655008 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.655045 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.655031 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8715f464-8cc2-459a-8616-97623080dd16-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.655133 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.655057 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.655133 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.655109 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8715f464-8cc2-459a-8616-97623080dd16-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.655193 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.655166 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjtvf\" (UniqueName: \"kubernetes.io/projected/8715f464-8cc2-459a-8616-97623080dd16-kube-api-access-vjtvf\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.655226 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.655198 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8715f464-8cc2-459a-8616-97623080dd16-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.655256 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.655233 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8715f464-8cc2-459a-8616-97623080dd16-config-out\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.756310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.756238 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8715f464-8cc2-459a-8616-97623080dd16-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.756310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.756271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.756310 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.756291 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-config-volume\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.756539 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.756418 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.756539 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.756453 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8715f464-8cc2-459a-8616-97623080dd16-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.756539 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.756478 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.756679 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.756636 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8715f464-8cc2-459a-8616-97623080dd16-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.756731 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.756707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjtvf\" (UniqueName: \"kubernetes.io/projected/8715f464-8cc2-459a-8616-97623080dd16-kube-api-access-vjtvf\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.756785 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.756741 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8715f464-8cc2-459a-8616-97623080dd16-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.756835 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.756787 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8715f464-8cc2-459a-8616-97623080dd16-config-out\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.756886 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.756857 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-web-config\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.756939 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.756896 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.756939 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.756921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.757332 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.757148 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8715f464-8cc2-459a-8616-97623080dd16-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.757332 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.757164 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8715f464-8cc2-459a-8616-97623080dd16-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.757546 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.757466 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8715f464-8cc2-459a-8616-97623080dd16-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.759476 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.759421 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-config-volume\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.759713 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.759682 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.760024 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.759733 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8715f464-8cc2-459a-8616-97623080dd16-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.760159 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.760138 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8715f464-8cc2-459a-8616-97623080dd16-config-out\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.760235 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.760181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.760290 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.760249 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.760339 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.760314 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.760536 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.760517 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.761348 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.761332 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8715f464-8cc2-459a-8616-97623080dd16-web-config\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.764165 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.764144 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjtvf\" (UniqueName: \"kubernetes.io/projected/8715f464-8cc2-459a-8616-97623080dd16-kube-api-access-vjtvf\") pod \"alertmanager-main-0\" (UID: \"8715f464-8cc2-459a-8616-97623080dd16\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.839605 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.839582 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 14:27:52.961790 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:52.961766 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 14:27:52.965037 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:27:52.965009 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8715f464_8cc2_459a_8616_97623080dd16.slice/crio-a6a3df69e3cc9712d4b751587c5f12a9598707eb4aacf3ee7597f180ce1e490c WatchSource:0}: Error finding container a6a3df69e3cc9712d4b751587c5f12a9598707eb4aacf3ee7597f180ce1e490c: Status 404 returned error can't find the container with id a6a3df69e3cc9712d4b751587c5f12a9598707eb4aacf3ee7597f180ce1e490c
Apr 24 14:27:53.164977 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:53.164946 2574 generic.go:358] "Generic (PLEG): container finished" podID="8715f464-8cc2-459a-8616-97623080dd16" containerID="1cd0dd706b6c1837fccc668c9f8d87786875285defa5ef1561bee8b1b228eb40" exitCode=0
Apr 24 14:27:53.165330 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:53.165021 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8715f464-8cc2-459a-8616-97623080dd16","Type":"ContainerDied","Data":"1cd0dd706b6c1837fccc668c9f8d87786875285defa5ef1561bee8b1b228eb40"}
Apr 24 14:27:53.165330 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:53.165043 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8715f464-8cc2-459a-8616-97623080dd16","Type":"ContainerStarted","Data":"a6a3df69e3cc9712d4b751587c5f12a9598707eb4aacf3ee7597f180ce1e490c"}
Apr 24 14:27:53.166364 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:53.166340 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ddf66748b-4gqzz" event={"ID":"c30c1dd8-3f26-4f33-a121-f2db2dad2baf","Type":"ContainerStarted","Data":"dec79f24515085b4cf4e9ccc7f1b63cdb15888209e2c9fac304c59c584cf1b6a"}
Apr 24 14:27:53.166489 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:53.166369 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ddf66748b-4gqzz" event={"ID":"c30c1dd8-3f26-4f33-a121-f2db2dad2baf","Type":"ContainerStarted","Data":"92698f95121f0320cd662a46211d5489009a27475252c310897c1a2568334838"}
Apr 24 14:27:53.204011 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:53.203972 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-ddf66748b-4gqzz" podStartSLOduration=1.203958372 podStartE2EDuration="1.203958372s" podCreationTimestamp="2026-04-24 14:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:27:53.202485325 +0000 UTC m=+236.414336699" watchObservedRunningTime="2026-04-24 14:27:53.203958372 +0000 UTC m=+236.415809748"
Apr 24 14:27:53.359863 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:53.359835 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7503d8e2-7236-48d2-b4d5-1cd9cdc2da28" path="/var/lib/kubelet/pods/7503d8e2-7236-48d2-b4d5-1cd9cdc2da28/volumes"
Apr 24 14:27:54.176684 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:54.176650 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8715f464-8cc2-459a-8616-97623080dd16","Type":"ContainerStarted","Data":"289ceaf097e38a6d14a8232dff1740c8cac9fdd629c1b6347f3e9847d918acd4"}
Apr 24 14:27:54.176684 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:54.176689 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8715f464-8cc2-459a-8616-97623080dd16","Type":"ContainerStarted","Data":"854bce59fb1790efa946057843d7d7da6f0baf6b0ab3b2ab42dfd2819cef3f88"}
Apr 24 14:27:54.176684 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:54.176699 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8715f464-8cc2-459a-8616-97623080dd16","Type":"ContainerStarted","Data":"e2f7fd9761a5c9fa138d3de0f710ec4f6b767f330c2542d4de5e5c1dfebc1581"}
Apr 24 14:27:54.177123 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:54.176708 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8715f464-8cc2-459a-8616-97623080dd16","Type":"ContainerStarted","Data":"a9188d58a5dca6d17683b6d674d345ae63b54f1f96ac43f5e4e06a220e45686a"}
Apr 24 14:27:54.177123 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:54.176715 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8715f464-8cc2-459a-8616-97623080dd16","Type":"ContainerStarted","Data":"55d50e1aa0280a5d250f9c13e7c7cb52c2323c63592f69d0a9613504da3eab0f"}
Apr 24 14:27:54.177123 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:54.176723 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8715f464-8cc2-459a-8616-97623080dd16","Type":"ContainerStarted","Data":"4acb1da6701dffc312a7e055813637dd581ef3c9f66d186a879778194ece9f4e"}
Apr 24 14:27:54.202298 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:27:54.202207 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.202165019 podStartE2EDuration="2.202165019s" podCreationTimestamp="2026-04-24 14:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:27:54.200621793 +0000 UTC m=+237.412473169" watchObservedRunningTime="2026-04-24 14:27:54.202165019 +0000 UTC m=+237.414016395"
Apr 24 14:28:02.470889 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:02.470855 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:28:02.470889 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:02.470892 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:28:02.475656 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:02.475631 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:28:03.208963 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:03.208937 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-ddf66748b-4gqzz"
Apr 24 14:28:03.252008 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:03.251980 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59f697d754-wgwds"]
Apr 24 14:28:27.962871 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:27.962791 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bxsfm"]
Apr 24 14:28:27.968608 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:27.968581 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bxsfm"
Apr 24 14:28:27.970637 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:27.970618 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 14:28:27.973581 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:27.973560 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bxsfm"]
Apr 24 14:28:28.031715 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.031678 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5a7584ad-141f-4b72-9c3f-f44d38325431-original-pull-secret\") pod \"global-pull-secret-syncer-bxsfm\" (UID: \"5a7584ad-141f-4b72-9c3f-f44d38325431\") " pod="kube-system/global-pull-secret-syncer-bxsfm"
Apr 24 14:28:28.031882 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.031745 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5a7584ad-141f-4b72-9c3f-f44d38325431-dbus\") pod \"global-pull-secret-syncer-bxsfm\" (UID: \"5a7584ad-141f-4b72-9c3f-f44d38325431\") " pod="kube-system/global-pull-secret-syncer-bxsfm"
Apr 24 14:28:28.031882 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.031860 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5a7584ad-141f-4b72-9c3f-f44d38325431-kubelet-config\") pod \"global-pull-secret-syncer-bxsfm\" (UID: \"5a7584ad-141f-4b72-9c3f-f44d38325431\") " pod="kube-system/global-pull-secret-syncer-bxsfm"
Apr 24 14:28:28.132530 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.132495 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5a7584ad-141f-4b72-9c3f-f44d38325431-kubelet-config\") pod \"global-pull-secret-syncer-bxsfm\" (UID: \"5a7584ad-141f-4b72-9c3f-f44d38325431\") " pod="kube-system/global-pull-secret-syncer-bxsfm" Apr 24 14:28:28.132530 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.132530 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5a7584ad-141f-4b72-9c3f-f44d38325431-original-pull-secret\") pod \"global-pull-secret-syncer-bxsfm\" (UID: \"5a7584ad-141f-4b72-9c3f-f44d38325431\") " pod="kube-system/global-pull-secret-syncer-bxsfm" Apr 24 14:28:28.132742 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.132580 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5a7584ad-141f-4b72-9c3f-f44d38325431-dbus\") pod \"global-pull-secret-syncer-bxsfm\" (UID: \"5a7584ad-141f-4b72-9c3f-f44d38325431\") " pod="kube-system/global-pull-secret-syncer-bxsfm" Apr 24 14:28:28.132742 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.132629 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5a7584ad-141f-4b72-9c3f-f44d38325431-kubelet-config\") pod \"global-pull-secret-syncer-bxsfm\" (UID: \"5a7584ad-141f-4b72-9c3f-f44d38325431\") " pod="kube-system/global-pull-secret-syncer-bxsfm" Apr 24 14:28:28.132742 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.132715 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5a7584ad-141f-4b72-9c3f-f44d38325431-dbus\") pod \"global-pull-secret-syncer-bxsfm\" (UID: \"5a7584ad-141f-4b72-9c3f-f44d38325431\") " pod="kube-system/global-pull-secret-syncer-bxsfm" Apr 24 14:28:28.134861 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.134840 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5a7584ad-141f-4b72-9c3f-f44d38325431-original-pull-secret\") pod \"global-pull-secret-syncer-bxsfm\" (UID: \"5a7584ad-141f-4b72-9c3f-f44d38325431\") " pod="kube-system/global-pull-secret-syncer-bxsfm" Apr 24 14:28:28.272272 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.272161 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-59f697d754-wgwds" podUID="d265cb03-77c2-4e91-ba30-777af584dfd1" containerName="console" containerID="cri-o://5119d5a54c2f15e7ae6046da6bccb758c3fe5104cedd192ddf1f772a1f57d981" gracePeriod=15 Apr 24 14:28:28.277439 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.277389 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bxsfm" Apr 24 14:28:28.394759 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.394730 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bxsfm"] Apr 24 14:28:28.397946 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:28:28.397919 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a7584ad_141f_4b72_9c3f_f44d38325431.slice/crio-01ae6fe8d3e7934b56738632047591932446853a90a085a48c0de39f9172504d WatchSource:0}: Error finding container 01ae6fe8d3e7934b56738632047591932446853a90a085a48c0de39f9172504d: Status 404 returned error can't find the container with id 01ae6fe8d3e7934b56738632047591932446853a90a085a48c0de39f9172504d Apr 24 14:28:28.508582 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.508562 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59f697d754-wgwds_d265cb03-77c2-4e91-ba30-777af584dfd1/console/0.log" Apr 24 14:28:28.508705 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.508617 2574 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-console/console-59f697d754-wgwds" Apr 24 14:28:28.636767 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.636736 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8rjp\" (UniqueName: \"kubernetes.io/projected/d265cb03-77c2-4e91-ba30-777af584dfd1-kube-api-access-w8rjp\") pod \"d265cb03-77c2-4e91-ba30-777af584dfd1\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " Apr 24 14:28:28.636967 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.636773 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d265cb03-77c2-4e91-ba30-777af584dfd1-console-serving-cert\") pod \"d265cb03-77c2-4e91-ba30-777af584dfd1\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " Apr 24 14:28:28.636967 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.636799 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-oauth-serving-cert\") pod \"d265cb03-77c2-4e91-ba30-777af584dfd1\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " Apr 24 14:28:28.636967 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.636817 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-service-ca\") pod \"d265cb03-77c2-4e91-ba30-777af584dfd1\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " Apr 24 14:28:28.636967 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.636853 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-console-config\") pod \"d265cb03-77c2-4e91-ba30-777af584dfd1\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " Apr 24 14:28:28.636967 
ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.636878 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-trusted-ca-bundle\") pod \"d265cb03-77c2-4e91-ba30-777af584dfd1\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " Apr 24 14:28:28.636967 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.636895 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d265cb03-77c2-4e91-ba30-777af584dfd1-console-oauth-config\") pod \"d265cb03-77c2-4e91-ba30-777af584dfd1\" (UID: \"d265cb03-77c2-4e91-ba30-777af584dfd1\") " Apr 24 14:28:28.637279 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.637258 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d265cb03-77c2-4e91-ba30-777af584dfd1" (UID: "d265cb03-77c2-4e91-ba30-777af584dfd1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:28:28.637334 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.637261 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-console-config" (OuterVolumeSpecName: "console-config") pod "d265cb03-77c2-4e91-ba30-777af584dfd1" (UID: "d265cb03-77c2-4e91-ba30-777af584dfd1"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:28:28.637381 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.637336 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d265cb03-77c2-4e91-ba30-777af584dfd1" (UID: "d265cb03-77c2-4e91-ba30-777af584dfd1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:28:28.637381 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.637349 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-service-ca" (OuterVolumeSpecName: "service-ca") pod "d265cb03-77c2-4e91-ba30-777af584dfd1" (UID: "d265cb03-77c2-4e91-ba30-777af584dfd1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:28:28.638989 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.638965 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d265cb03-77c2-4e91-ba30-777af584dfd1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d265cb03-77c2-4e91-ba30-777af584dfd1" (UID: "d265cb03-77c2-4e91-ba30-777af584dfd1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:28.639086 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.639014 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d265cb03-77c2-4e91-ba30-777af584dfd1-kube-api-access-w8rjp" (OuterVolumeSpecName: "kube-api-access-w8rjp") pod "d265cb03-77c2-4e91-ba30-777af584dfd1" (UID: "d265cb03-77c2-4e91-ba30-777af584dfd1"). InnerVolumeSpecName "kube-api-access-w8rjp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:28:28.639086 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.639021 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d265cb03-77c2-4e91-ba30-777af584dfd1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d265cb03-77c2-4e91-ba30-777af584dfd1" (UID: "d265cb03-77c2-4e91-ba30-777af584dfd1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:28:28.738149 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.738113 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w8rjp\" (UniqueName: \"kubernetes.io/projected/d265cb03-77c2-4e91-ba30-777af584dfd1-kube-api-access-w8rjp\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:28:28.738149 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.738142 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d265cb03-77c2-4e91-ba30-777af584dfd1-console-serving-cert\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:28:28.738149 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.738152 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-oauth-serving-cert\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:28:28.738475 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.738162 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-service-ca\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:28:28.738475 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.738172 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-console-config\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:28:28.738475 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.738180 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d265cb03-77c2-4e91-ba30-777af584dfd1-trusted-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:28:28.738475 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:28.738190 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d265cb03-77c2-4e91-ba30-777af584dfd1-console-oauth-config\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:28:29.282184 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:29.282156 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59f697d754-wgwds_d265cb03-77c2-4e91-ba30-777af584dfd1/console/0.log" Apr 24 14:28:29.282607 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:29.282199 2574 generic.go:358] "Generic (PLEG): container finished" podID="d265cb03-77c2-4e91-ba30-777af584dfd1" containerID="5119d5a54c2f15e7ae6046da6bccb758c3fe5104cedd192ddf1f772a1f57d981" exitCode=2 Apr 24 14:28:29.282607 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:29.282258 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59f697d754-wgwds" Apr 24 14:28:29.282607 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:29.282287 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59f697d754-wgwds" event={"ID":"d265cb03-77c2-4e91-ba30-777af584dfd1","Type":"ContainerDied","Data":"5119d5a54c2f15e7ae6046da6bccb758c3fe5104cedd192ddf1f772a1f57d981"} Apr 24 14:28:29.282607 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:29.282332 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59f697d754-wgwds" event={"ID":"d265cb03-77c2-4e91-ba30-777af584dfd1","Type":"ContainerDied","Data":"3c3dfe019e50290c37846e2c23c12aae8518bb11241776f793462cf4dd36edb0"} Apr 24 14:28:29.282607 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:29.282354 2574 scope.go:117] "RemoveContainer" containerID="5119d5a54c2f15e7ae6046da6bccb758c3fe5104cedd192ddf1f772a1f57d981" Apr 24 14:28:29.283782 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:29.283762 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bxsfm" event={"ID":"5a7584ad-141f-4b72-9c3f-f44d38325431","Type":"ContainerStarted","Data":"01ae6fe8d3e7934b56738632047591932446853a90a085a48c0de39f9172504d"} Apr 24 14:28:29.290729 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:29.290715 2574 scope.go:117] "RemoveContainer" containerID="5119d5a54c2f15e7ae6046da6bccb758c3fe5104cedd192ddf1f772a1f57d981" Apr 24 14:28:29.290967 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:28:29.290950 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5119d5a54c2f15e7ae6046da6bccb758c3fe5104cedd192ddf1f772a1f57d981\": container with ID starting with 5119d5a54c2f15e7ae6046da6bccb758c3fe5104cedd192ddf1f772a1f57d981 not found: ID does not exist" containerID="5119d5a54c2f15e7ae6046da6bccb758c3fe5104cedd192ddf1f772a1f57d981" Apr 24 14:28:29.291014 
ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:29.290975 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5119d5a54c2f15e7ae6046da6bccb758c3fe5104cedd192ddf1f772a1f57d981"} err="failed to get container status \"5119d5a54c2f15e7ae6046da6bccb758c3fe5104cedd192ddf1f772a1f57d981\": rpc error: code = NotFound desc = could not find container \"5119d5a54c2f15e7ae6046da6bccb758c3fe5104cedd192ddf1f772a1f57d981\": container with ID starting with 5119d5a54c2f15e7ae6046da6bccb758c3fe5104cedd192ddf1f772a1f57d981 not found: ID does not exist" Apr 24 14:28:29.301911 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:29.301892 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59f697d754-wgwds"] Apr 24 14:28:29.305958 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:29.305934 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59f697d754-wgwds"] Apr 24 14:28:29.359173 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:29.359127 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d265cb03-77c2-4e91-ba30-777af584dfd1" path="/var/lib/kubelet/pods/d265cb03-77c2-4e91-ba30-777af584dfd1/volumes" Apr 24 14:28:35.305084 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:35.305048 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bxsfm" event={"ID":"5a7584ad-141f-4b72-9c3f-f44d38325431","Type":"ContainerStarted","Data":"0c2a8c909db5a490e563d29ddf1e483b7c3e671f85b7926f23e39dc17fde8432"} Apr 24 14:28:35.320603 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:35.320557 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bxsfm" podStartSLOduration=2.2552013779999998 podStartE2EDuration="8.320544864s" podCreationTimestamp="2026-04-24 14:28:27 +0000 UTC" firstStartedPulling="2026-04-24 14:28:28.399737467 +0000 UTC m=+271.611588820" 
lastFinishedPulling="2026-04-24 14:28:34.465080949 +0000 UTC m=+277.676932306" observedRunningTime="2026-04-24 14:28:35.318451777 +0000 UTC m=+278.530303149" watchObservedRunningTime="2026-04-24 14:28:35.320544864 +0000 UTC m=+278.532396238" Apr 24 14:28:57.245860 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:57.245827 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 14:28:57.249520 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:57.249489 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 14:28:57.256911 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:28:57.256882 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 14:29:11.328036 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.327999 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8"] Apr 24 14:29:11.331217 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.328308 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d265cb03-77c2-4e91-ba30-777af584dfd1" containerName="console" Apr 24 14:29:11.331217 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.328319 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d265cb03-77c2-4e91-ba30-777af584dfd1" containerName="console" Apr 24 14:29:11.331217 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.328371 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d265cb03-77c2-4e91-ba30-777af584dfd1" containerName="console" Apr 24 14:29:11.332292 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.332277 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" Apr 24 14:29:11.334503 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.334478 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 14:29:11.334898 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.334881 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 14:29:11.334947 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.334930 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rkrcv\"" Apr 24 14:29:11.338183 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.338160 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8"] Apr 24 14:29:11.387254 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.387224 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8\" (UID: \"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" Apr 24 14:29:11.387390 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.387345 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffck9\" (UniqueName: \"kubernetes.io/projected/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-kube-api-access-ffck9\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8\" (UID: \"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" Apr 24 14:29:11.387390 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.387390 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8\" (UID: \"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" Apr 24 14:29:11.488425 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.488380 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8\" (UID: \"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" Apr 24 14:29:11.488581 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.488478 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffck9\" (UniqueName: \"kubernetes.io/projected/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-kube-api-access-ffck9\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8\" (UID: \"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" Apr 24 14:29:11.488581 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.488512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8\" (UID: \"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" Apr 24 14:29:11.488822 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.488798 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8\" (UID: \"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" Apr 24 14:29:11.488822 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.488814 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8\" (UID: \"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" Apr 24 14:29:11.496597 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.496576 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffck9\" (UniqueName: \"kubernetes.io/projected/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-kube-api-access-ffck9\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8\" (UID: \"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" Apr 24 14:29:11.642805 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.642718 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" Apr 24 14:29:11.755279 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.755250 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8"] Apr 24 14:29:11.758333 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:29:11.758306 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25a1afd1_9ecc_46ce_b1b6_fbe353fa1770.slice/crio-487d8301ef0c3001d7e4cc6697be1d46ef7b10fc33f9e108fec23c04397bc227 WatchSource:0}: Error finding container 487d8301ef0c3001d7e4cc6697be1d46ef7b10fc33f9e108fec23c04397bc227: Status 404 returned error can't find the container with id 487d8301ef0c3001d7e4cc6697be1d46ef7b10fc33f9e108fec23c04397bc227 Apr 24 14:29:11.760157 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:11.760141 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:29:12.413566 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:12.413536 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" event={"ID":"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770","Type":"ContainerStarted","Data":"487d8301ef0c3001d7e4cc6697be1d46ef7b10fc33f9e108fec23c04397bc227"} Apr 24 14:29:17.431380 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:17.431347 2574 generic.go:358] "Generic (PLEG): container finished" podID="25a1afd1-9ecc-46ce-b1b6-fbe353fa1770" containerID="521b08c2ce3bb164baff2c8efd6bf5ec873b5a546f8b88ea6710c20fe8b14177" exitCode=0 Apr 24 14:29:17.431796 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:17.431428 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" 
event={"ID":"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770","Type":"ContainerDied","Data":"521b08c2ce3bb164baff2c8efd6bf5ec873b5a546f8b88ea6710c20fe8b14177"} Apr 24 14:29:20.443970 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:20.443930 2574 generic.go:358] "Generic (PLEG): container finished" podID="25a1afd1-9ecc-46ce-b1b6-fbe353fa1770" containerID="47cecdd71cacf72a5677d4f92abbcb887d945d1fa970d9f594a4b6001647c569" exitCode=0 Apr 24 14:29:20.444329 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:20.444021 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" event={"ID":"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770","Type":"ContainerDied","Data":"47cecdd71cacf72a5677d4f92abbcb887d945d1fa970d9f594a4b6001647c569"} Apr 24 14:29:27.466831 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:27.466796 2574 generic.go:358] "Generic (PLEG): container finished" podID="25a1afd1-9ecc-46ce-b1b6-fbe353fa1770" containerID="3adc1df3479ae7e1a6ea37d3f024afae8a6ef861f5183b87b21e006225524ad4" exitCode=0 Apr 24 14:29:27.467219 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:27.466860 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" event={"ID":"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770","Type":"ContainerDied","Data":"3adc1df3479ae7e1a6ea37d3f024afae8a6ef861f5183b87b21e006225524ad4"} Apr 24 14:29:28.589978 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:28.589956 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" Apr 24 14:29:28.638829 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:28.638795 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-bundle\") pod \"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770\" (UID: \"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770\") " Apr 24 14:29:28.638978 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:28.638849 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffck9\" (UniqueName: \"kubernetes.io/projected/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-kube-api-access-ffck9\") pod \"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770\" (UID: \"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770\") " Apr 24 14:29:28.638978 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:28.638919 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-util\") pod \"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770\" (UID: \"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770\") " Apr 24 14:29:28.639549 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:28.639518 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-bundle" (OuterVolumeSpecName: "bundle") pod "25a1afd1-9ecc-46ce-b1b6-fbe353fa1770" (UID: "25a1afd1-9ecc-46ce-b1b6-fbe353fa1770"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:29:28.641004 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:28.640974 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-kube-api-access-ffck9" (OuterVolumeSpecName: "kube-api-access-ffck9") pod "25a1afd1-9ecc-46ce-b1b6-fbe353fa1770" (UID: "25a1afd1-9ecc-46ce-b1b6-fbe353fa1770"). InnerVolumeSpecName "kube-api-access-ffck9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:29:28.643160 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:28.643139 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-util" (OuterVolumeSpecName: "util") pod "25a1afd1-9ecc-46ce-b1b6-fbe353fa1770" (UID: "25a1afd1-9ecc-46ce-b1b6-fbe353fa1770"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:29:28.740449 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:28.740360 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:29:28.740449 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:28.740384 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ffck9\" (UniqueName: \"kubernetes.io/projected/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-kube-api-access-ffck9\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:29:28.740449 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:28.740412 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25a1afd1-9ecc-46ce-b1b6-fbe353fa1770-util\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:29:29.474360 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:29.474331 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" event={"ID":"25a1afd1-9ecc-46ce-b1b6-fbe353fa1770","Type":"ContainerDied","Data":"487d8301ef0c3001d7e4cc6697be1d46ef7b10fc33f9e108fec23c04397bc227"} Apr 24 14:29:29.474360 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:29.474351 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c22pj8" Apr 24 14:29:29.474360 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:29.474364 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="487d8301ef0c3001d7e4cc6697be1d46ef7b10fc33f9e108fec23c04397bc227" Apr 24 14:29:32.962115 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:32.962083 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls"] Apr 24 14:29:32.962718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:32.962441 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25a1afd1-9ecc-46ce-b1b6-fbe353fa1770" containerName="pull" Apr 24 14:29:32.962718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:32.962454 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a1afd1-9ecc-46ce-b1b6-fbe353fa1770" containerName="pull" Apr 24 14:29:32.962718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:32.962472 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25a1afd1-9ecc-46ce-b1b6-fbe353fa1770" containerName="util" Apr 24 14:29:32.962718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:32.962477 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a1afd1-9ecc-46ce-b1b6-fbe353fa1770" containerName="util" Apr 24 14:29:32.962718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:32.962490 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="25a1afd1-9ecc-46ce-b1b6-fbe353fa1770" containerName="extract" Apr 24 14:29:32.962718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:32.962495 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a1afd1-9ecc-46ce-b1b6-fbe353fa1770" containerName="extract" Apr 24 14:29:32.962718 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:32.962542 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="25a1afd1-9ecc-46ce-b1b6-fbe353fa1770" containerName="extract" Apr 24 14:29:33.014323 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:33.014300 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls"] Apr 24 14:29:33.014506 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:33.014432 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls" Apr 24 14:29:33.016738 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:33.016717 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 14:29:33.016831 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:33.016817 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 14:29:33.016881 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:33.016835 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 14:29:33.016881 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:33.016872 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-dlxcz\"" Apr 24 14:29:33.075225 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:33.075199 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfxmv\" (UniqueName: 
\"kubernetes.io/projected/a6297980-689f-4254-aedb-beeef903833d-kube-api-access-kfxmv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-h8hls\" (UID: \"a6297980-689f-4254-aedb-beeef903833d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls" Apr 24 14:29:33.075343 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:33.075296 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a6297980-689f-4254-aedb-beeef903833d-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-h8hls\" (UID: \"a6297980-689f-4254-aedb-beeef903833d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls" Apr 24 14:29:33.176605 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:33.176576 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a6297980-689f-4254-aedb-beeef903833d-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-h8hls\" (UID: \"a6297980-689f-4254-aedb-beeef903833d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls" Apr 24 14:29:33.176745 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:33.176620 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfxmv\" (UniqueName: \"kubernetes.io/projected/a6297980-689f-4254-aedb-beeef903833d-kube-api-access-kfxmv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-h8hls\" (UID: \"a6297980-689f-4254-aedb-beeef903833d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls" Apr 24 14:29:33.178943 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:33.178922 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a6297980-689f-4254-aedb-beeef903833d-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-h8hls\" (UID: 
\"a6297980-689f-4254-aedb-beeef903833d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls" Apr 24 14:29:33.184161 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:33.184135 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfxmv\" (UniqueName: \"kubernetes.io/projected/a6297980-689f-4254-aedb-beeef903833d-kube-api-access-kfxmv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-h8hls\" (UID: \"a6297980-689f-4254-aedb-beeef903833d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls" Apr 24 14:29:33.324669 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:33.324643 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls" Apr 24 14:29:33.457260 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:33.457223 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls"] Apr 24 14:29:33.460389 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:29:33.460364 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6297980_689f_4254_aedb_beeef903833d.slice/crio-5a066b5bbf92ec5f1e9753acd67b5af7fe611bb64fed5173f24820faeab87c05 WatchSource:0}: Error finding container 5a066b5bbf92ec5f1e9753acd67b5af7fe611bb64fed5173f24820faeab87c05: Status 404 returned error can't find the container with id 5a066b5bbf92ec5f1e9753acd67b5af7fe611bb64fed5173f24820faeab87c05 Apr 24 14:29:33.487127 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:33.487100 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls" event={"ID":"a6297980-689f-4254-aedb-beeef903833d","Type":"ContainerStarted","Data":"5a066b5bbf92ec5f1e9753acd67b5af7fe611bb64fed5173f24820faeab87c05"} Apr 24 14:29:37.440136 ip-10-0-131-216 kubenswrapper[2574]: 
I0424 14:29:37.440106 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-rt7xr"] Apr 24 14:29:37.446727 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.446703 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:37.448624 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.448587 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 24 14:29:37.449505 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.449469 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-8cjc6\"" Apr 24 14:29:37.449645 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.449525 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 14:29:37.450450 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.450428 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-rt7xr"] Apr 24 14:29:37.502682 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.502648 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls" event={"ID":"a6297980-689f-4254-aedb-beeef903833d","Type":"ContainerStarted","Data":"547b0f41e3c794ecafc8cdf9e991abb37a2e1f7eccb41890cc8302ecbf9283de"} Apr 24 14:29:37.502850 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.502829 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls" Apr 24 14:29:37.517298 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.517272 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wrdq\" (UniqueName: 
\"kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-kube-api-access-7wrdq\") pod \"keda-operator-ffbb595cb-rt7xr\" (UID: \"309a50a5-4abe-44d5-9afa-e2e0395034d1\") " pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:37.517442 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.517311 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/309a50a5-4abe-44d5-9afa-e2e0395034d1-cabundle0\") pod \"keda-operator-ffbb595cb-rt7xr\" (UID: \"309a50a5-4abe-44d5-9afa-e2e0395034d1\") " pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:37.517442 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.517376 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-certificates\") pod \"keda-operator-ffbb595cb-rt7xr\" (UID: \"309a50a5-4abe-44d5-9afa-e2e0395034d1\") " pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:37.519008 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.518945 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls" podStartSLOduration=2.034412491 podStartE2EDuration="5.518929018s" podCreationTimestamp="2026-04-24 14:29:32 +0000 UTC" firstStartedPulling="2026-04-24 14:29:33.462167702 +0000 UTC m=+336.674019061" lastFinishedPulling="2026-04-24 14:29:36.946684224 +0000 UTC m=+340.158535588" observedRunningTime="2026-04-24 14:29:37.517847726 +0000 UTC m=+340.729699101" watchObservedRunningTime="2026-04-24 14:29:37.518929018 +0000 UTC m=+340.730780395" Apr 24 14:29:37.618385 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.618357 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wrdq\" (UniqueName: 
\"kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-kube-api-access-7wrdq\") pod \"keda-operator-ffbb595cb-rt7xr\" (UID: \"309a50a5-4abe-44d5-9afa-e2e0395034d1\") " pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:37.618540 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.618496 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/309a50a5-4abe-44d5-9afa-e2e0395034d1-cabundle0\") pod \"keda-operator-ffbb595cb-rt7xr\" (UID: \"309a50a5-4abe-44d5-9afa-e2e0395034d1\") " pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:37.618580 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.618561 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-certificates\") pod \"keda-operator-ffbb595cb-rt7xr\" (UID: \"309a50a5-4abe-44d5-9afa-e2e0395034d1\") " pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:37.618712 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:29:37.618697 2574 secret.go:281] references non-existent secret key: ca.crt Apr 24 14:29:37.618744 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:29:37.618719 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 14:29:37.618744 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:29:37.618730 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rt7xr: references non-existent secret key: ca.crt Apr 24 14:29:37.618808 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:29:37.618790 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-certificates podName:309a50a5-4abe-44d5-9afa-e2e0395034d1 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:29:38.118774562 +0000 UTC m=+341.330625921 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-certificates") pod "keda-operator-ffbb595cb-rt7xr" (UID: "309a50a5-4abe-44d5-9afa-e2e0395034d1") : references non-existent secret key: ca.crt Apr 24 14:29:37.619108 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.619091 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/309a50a5-4abe-44d5-9afa-e2e0395034d1-cabundle0\") pod \"keda-operator-ffbb595cb-rt7xr\" (UID: \"309a50a5-4abe-44d5-9afa-e2e0395034d1\") " pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:37.628222 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:37.628204 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wrdq\" (UniqueName: \"kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-kube-api-access-7wrdq\") pod \"keda-operator-ffbb595cb-rt7xr\" (UID: \"309a50a5-4abe-44d5-9afa-e2e0395034d1\") " pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:38.123750 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:38.123715 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-certificates\") pod \"keda-operator-ffbb595cb-rt7xr\" (UID: \"309a50a5-4abe-44d5-9afa-e2e0395034d1\") " pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:38.123916 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:29:38.123835 2574 secret.go:281] references non-existent secret key: ca.crt Apr 24 14:29:38.123916 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:29:38.123846 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 14:29:38.123916 
ip-10-0-131-216 kubenswrapper[2574]: E0424 14:29:38.123855 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rt7xr: references non-existent secret key: ca.crt Apr 24 14:29:38.123916 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:29:38.123909 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-certificates podName:309a50a5-4abe-44d5-9afa-e2e0395034d1 nodeName:}" failed. No retries permitted until 2026-04-24 14:29:39.123896307 +0000 UTC m=+342.335747660 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-certificates") pod "keda-operator-ffbb595cb-rt7xr" (UID: "309a50a5-4abe-44d5-9afa-e2e0395034d1") : references non-existent secret key: ca.crt Apr 24 14:29:39.133020 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:39.132978 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-certificates\") pod \"keda-operator-ffbb595cb-rt7xr\" (UID: \"309a50a5-4abe-44d5-9afa-e2e0395034d1\") " pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:39.133697 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:29:39.133133 2574 secret.go:281] references non-existent secret key: ca.crt Apr 24 14:29:39.133697 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:29:39.133154 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 14:29:39.133697 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:29:39.133167 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rt7xr: references non-existent secret key: ca.crt Apr 24 14:29:39.133697 ip-10-0-131-216 kubenswrapper[2574]: E0424 
14:29:39.133221 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-certificates podName:309a50a5-4abe-44d5-9afa-e2e0395034d1 nodeName:}" failed. No retries permitted until 2026-04-24 14:29:41.133205074 +0000 UTC m=+344.345056427 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-certificates") pod "keda-operator-ffbb595cb-rt7xr" (UID: "309a50a5-4abe-44d5-9afa-e2e0395034d1") : references non-existent secret key: ca.crt Apr 24 14:29:41.152220 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:41.152185 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-certificates\") pod \"keda-operator-ffbb595cb-rt7xr\" (UID: \"309a50a5-4abe-44d5-9afa-e2e0395034d1\") " pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:41.152730 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:29:41.152305 2574 secret.go:281] references non-existent secret key: ca.crt Apr 24 14:29:41.152730 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:29:41.152317 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 14:29:41.152730 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:29:41.152325 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rt7xr: references non-existent secret key: ca.crt Apr 24 14:29:41.152730 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:29:41.152385 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-certificates podName:309a50a5-4abe-44d5-9afa-e2e0395034d1 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:29:45.152367452 +0000 UTC m=+348.364218805 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-certificates") pod "keda-operator-ffbb595cb-rt7xr" (UID: "309a50a5-4abe-44d5-9afa-e2e0395034d1") : references non-existent secret key: ca.crt Apr 24 14:29:45.187916 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:45.187877 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-certificates\") pod \"keda-operator-ffbb595cb-rt7xr\" (UID: \"309a50a5-4abe-44d5-9afa-e2e0395034d1\") " pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:45.190197 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:45.190178 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/309a50a5-4abe-44d5-9afa-e2e0395034d1-certificates\") pod \"keda-operator-ffbb595cb-rt7xr\" (UID: \"309a50a5-4abe-44d5-9afa-e2e0395034d1\") " pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:45.273358 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:45.273328 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:45.388832 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:45.388801 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-rt7xr"] Apr 24 14:29:45.392078 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:29:45.392042 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod309a50a5_4abe_44d5_9afa_e2e0395034d1.slice/crio-a33ae5087e631a4f71e4174c26829aca1e9279ab79dc859abfd0be357add5254 WatchSource:0}: Error finding container a33ae5087e631a4f71e4174c26829aca1e9279ab79dc859abfd0be357add5254: Status 404 returned error can't find the container with id a33ae5087e631a4f71e4174c26829aca1e9279ab79dc859abfd0be357add5254 Apr 24 14:29:45.532690 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:45.532606 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" event={"ID":"309a50a5-4abe-44d5-9afa-e2e0395034d1","Type":"ContainerStarted","Data":"a33ae5087e631a4f71e4174c26829aca1e9279ab79dc859abfd0be357add5254"} Apr 24 14:29:48.544473 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:48.544371 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" event={"ID":"309a50a5-4abe-44d5-9afa-e2e0395034d1","Type":"ContainerStarted","Data":"0ad2236b088ec60cd38d575700f2038af5786506acfe10363733811441d1807b"} Apr 24 14:29:48.544824 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:48.544537 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:29:48.565785 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:48.565726 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" podStartSLOduration=8.84529256 podStartE2EDuration="11.565710242s" 
podCreationTimestamp="2026-04-24 14:29:37 +0000 UTC" firstStartedPulling="2026-04-24 14:29:45.393279614 +0000 UTC m=+348.605130967" lastFinishedPulling="2026-04-24 14:29:48.113697294 +0000 UTC m=+351.325548649" observedRunningTime="2026-04-24 14:29:48.564751435 +0000 UTC m=+351.776602809" watchObservedRunningTime="2026-04-24 14:29:48.565710242 +0000 UTC m=+351.777561618" Apr 24 14:29:58.507919 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:29:58.507886 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-h8hls" Apr 24 14:30:09.550007 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:09.549963 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-rt7xr" Apr 24 14:30:43.906529 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:43.906491 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d"] Apr 24 14:30:43.913439 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:43.913410 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d" Apr 24 14:30:43.915343 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:43.915321 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 14:30:43.915873 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:43.915857 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 14:30:43.915974 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:43.915899 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-ltx6x\"" Apr 24 14:30:43.916036 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:43.915994 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 14:30:43.918206 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:43.918188 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d"] Apr 24 14:30:44.073707 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:44.073675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29bfh\" (UniqueName: \"kubernetes.io/projected/b3645a36-7d16-485f-9a64-ff6b9ca03d7e-kube-api-access-29bfh\") pod \"llmisvc-controller-manager-68cc5db7c4-sdc5d\" (UID: \"b3645a36-7d16-485f-9a64-ff6b9ca03d7e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d" Apr 24 14:30:44.073707 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:44.073710 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3645a36-7d16-485f-9a64-ff6b9ca03d7e-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-sdc5d\" (UID: \"b3645a36-7d16-485f-9a64-ff6b9ca03d7e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d" 
Apr 24 14:30:44.174832 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:44.174755 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3645a36-7d16-485f-9a64-ff6b9ca03d7e-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-sdc5d\" (UID: \"b3645a36-7d16-485f-9a64-ff6b9ca03d7e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d" Apr 24 14:30:44.174964 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:44.174867 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29bfh\" (UniqueName: \"kubernetes.io/projected/b3645a36-7d16-485f-9a64-ff6b9ca03d7e-kube-api-access-29bfh\") pod \"llmisvc-controller-manager-68cc5db7c4-sdc5d\" (UID: \"b3645a36-7d16-485f-9a64-ff6b9ca03d7e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d" Apr 24 14:30:44.177296 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:44.177269 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3645a36-7d16-485f-9a64-ff6b9ca03d7e-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-sdc5d\" (UID: \"b3645a36-7d16-485f-9a64-ff6b9ca03d7e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d" Apr 24 14:30:44.181901 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:44.181875 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29bfh\" (UniqueName: \"kubernetes.io/projected/b3645a36-7d16-485f-9a64-ff6b9ca03d7e-kube-api-access-29bfh\") pod \"llmisvc-controller-manager-68cc5db7c4-sdc5d\" (UID: \"b3645a36-7d16-485f-9a64-ff6b9ca03d7e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d" Apr 24 14:30:44.224037 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:44.224015 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d" Apr 24 14:30:44.340493 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:44.340462 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d"] Apr 24 14:30:44.344667 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:30:44.344632 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb3645a36_7d16_485f_9a64_ff6b9ca03d7e.slice/crio-55377364364ed587ba5e4b1aa132d6fe7e6c9cc9c0f3162382995fdec1906bfd WatchSource:0}: Error finding container 55377364364ed587ba5e4b1aa132d6fe7e6c9cc9c0f3162382995fdec1906bfd: Status 404 returned error can't find the container with id 55377364364ed587ba5e4b1aa132d6fe7e6c9cc9c0f3162382995fdec1906bfd Apr 24 14:30:44.736828 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:44.736791 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d" event={"ID":"b3645a36-7d16-485f-9a64-ff6b9ca03d7e","Type":"ContainerStarted","Data":"55377364364ed587ba5e4b1aa132d6fe7e6c9cc9c0f3162382995fdec1906bfd"} Apr 24 14:30:46.744914 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:46.744881 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d" event={"ID":"b3645a36-7d16-485f-9a64-ff6b9ca03d7e","Type":"ContainerStarted","Data":"983981136ccbce87c42690e95b8f8673e2c83fe7583adf3838a885117ed7273d"} Apr 24 14:30:46.745294 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:46.745032 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d" Apr 24 14:30:46.762405 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:30:46.762348 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d" podStartSLOduration=1.892409612 podStartE2EDuration="3.762335093s" 
podCreationTimestamp="2026-04-24 14:30:43 +0000 UTC" firstStartedPulling="2026-04-24 14:30:44.345890264 +0000 UTC m=+407.557741618" lastFinishedPulling="2026-04-24 14:30:46.215815741 +0000 UTC m=+409.427667099" observedRunningTime="2026-04-24 14:30:46.759470427 +0000 UTC m=+409.971321799" watchObservedRunningTime="2026-04-24 14:30:46.762335093 +0000 UTC m=+409.974186467" Apr 24 14:31:17.754466 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:17.754368 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-sdc5d" Apr 24 14:31:19.141318 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:19.141284 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-vqgzh"] Apr 24 14:31:19.144896 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:19.144877 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b7dc77d59-vqgzh" Apr 24 14:31:19.146787 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:19.146767 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 14:31:19.147295 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:19.147281 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-qz5pf\"" Apr 24 14:31:19.154278 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:19.154253 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-vqgzh"] Apr 24 14:31:19.234905 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:19.234871 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px88k\" (UniqueName: \"kubernetes.io/projected/792f6b82-6c33-4584-8dbe-1c85ac9dae57-kube-api-access-px88k\") pod \"kserve-controller-manager-b7dc77d59-vqgzh\" (UID: 
\"792f6b82-6c33-4584-8dbe-1c85ac9dae57\") " pod="kserve/kserve-controller-manager-b7dc77d59-vqgzh" Apr 24 14:31:19.235087 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:19.234930 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/792f6b82-6c33-4584-8dbe-1c85ac9dae57-cert\") pod \"kserve-controller-manager-b7dc77d59-vqgzh\" (UID: \"792f6b82-6c33-4584-8dbe-1c85ac9dae57\") " pod="kserve/kserve-controller-manager-b7dc77d59-vqgzh" Apr 24 14:31:19.336175 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:19.336144 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/792f6b82-6c33-4584-8dbe-1c85ac9dae57-cert\") pod \"kserve-controller-manager-b7dc77d59-vqgzh\" (UID: \"792f6b82-6c33-4584-8dbe-1c85ac9dae57\") " pod="kserve/kserve-controller-manager-b7dc77d59-vqgzh" Apr 24 14:31:19.336339 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:19.336212 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-px88k\" (UniqueName: \"kubernetes.io/projected/792f6b82-6c33-4584-8dbe-1c85ac9dae57-kube-api-access-px88k\") pod \"kserve-controller-manager-b7dc77d59-vqgzh\" (UID: \"792f6b82-6c33-4584-8dbe-1c85ac9dae57\") " pod="kserve/kserve-controller-manager-b7dc77d59-vqgzh" Apr 24 14:31:19.338609 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:19.338588 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/792f6b82-6c33-4584-8dbe-1c85ac9dae57-cert\") pod \"kserve-controller-manager-b7dc77d59-vqgzh\" (UID: \"792f6b82-6c33-4584-8dbe-1c85ac9dae57\") " pod="kserve/kserve-controller-manager-b7dc77d59-vqgzh" Apr 24 14:31:19.344409 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:19.344371 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-px88k\" (UniqueName: 
\"kubernetes.io/projected/792f6b82-6c33-4584-8dbe-1c85ac9dae57-kube-api-access-px88k\") pod \"kserve-controller-manager-b7dc77d59-vqgzh\" (UID: \"792f6b82-6c33-4584-8dbe-1c85ac9dae57\") " pod="kserve/kserve-controller-manager-b7dc77d59-vqgzh" Apr 24 14:31:19.456369 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:19.456294 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b7dc77d59-vqgzh" Apr 24 14:31:19.576898 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:19.576876 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-vqgzh"] Apr 24 14:31:19.579613 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:31:19.579582 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod792f6b82_6c33_4584_8dbe_1c85ac9dae57.slice/crio-7f09bfae2c5eb266868fdd6d4f4bffc5406c2f0dd0d5095a71e70d621a903cac WatchSource:0}: Error finding container 7f09bfae2c5eb266868fdd6d4f4bffc5406c2f0dd0d5095a71e70d621a903cac: Status 404 returned error can't find the container with id 7f09bfae2c5eb266868fdd6d4f4bffc5406c2f0dd0d5095a71e70d621a903cac Apr 24 14:31:19.856296 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:19.856260 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b7dc77d59-vqgzh" event={"ID":"792f6b82-6c33-4584-8dbe-1c85ac9dae57","Type":"ContainerStarted","Data":"7f09bfae2c5eb266868fdd6d4f4bffc5406c2f0dd0d5095a71e70d621a903cac"} Apr 24 14:31:22.870403 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:22.870354 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b7dc77d59-vqgzh" event={"ID":"792f6b82-6c33-4584-8dbe-1c85ac9dae57","Type":"ContainerStarted","Data":"66e82237c90656bcf9c8e1e11b82a081ee94882f6e804618ef1cc7dffb158e5a"} Apr 24 14:31:22.870846 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:22.870469 2574 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-b7dc77d59-vqgzh" Apr 24 14:31:22.884943 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:22.884900 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-b7dc77d59-vqgzh" podStartSLOduration=1.4172279159999999 podStartE2EDuration="3.884888156s" podCreationTimestamp="2026-04-24 14:31:19 +0000 UTC" firstStartedPulling="2026-04-24 14:31:19.580978393 +0000 UTC m=+442.792829751" lastFinishedPulling="2026-04-24 14:31:22.048638638 +0000 UTC m=+445.260489991" observedRunningTime="2026-04-24 14:31:22.884358362 +0000 UTC m=+446.096209737" watchObservedRunningTime="2026-04-24 14:31:22.884888156 +0000 UTC m=+446.096739531" Apr 24 14:31:53.878526 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:53.878494 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-b7dc77d59-vqgzh" Apr 24 14:31:54.657435 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:54.657383 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-phfpn"] Apr 24 14:31:54.664759 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:54.664737 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-phfpn" Apr 24 14:31:54.667243 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:54.667223 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 24 14:31:54.667365 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:54.667231 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-4lrw8\"" Apr 24 14:31:54.668549 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:54.668500 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-phfpn"] Apr 24 14:31:54.837900 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:54.837863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/014c6264-9821-4227-a1a7-b8a5505a05e8-cert\") pod \"odh-model-controller-696fc77849-phfpn\" (UID: \"014c6264-9821-4227-a1a7-b8a5505a05e8\") " pod="kserve/odh-model-controller-696fc77849-phfpn" Apr 24 14:31:54.838087 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:54.837928 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk2md\" (UniqueName: \"kubernetes.io/projected/014c6264-9821-4227-a1a7-b8a5505a05e8-kube-api-access-rk2md\") pod \"odh-model-controller-696fc77849-phfpn\" (UID: \"014c6264-9821-4227-a1a7-b8a5505a05e8\") " pod="kserve/odh-model-controller-696fc77849-phfpn" Apr 24 14:31:54.939295 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:54.939198 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/014c6264-9821-4227-a1a7-b8a5505a05e8-cert\") pod \"odh-model-controller-696fc77849-phfpn\" (UID: \"014c6264-9821-4227-a1a7-b8a5505a05e8\") " pod="kserve/odh-model-controller-696fc77849-phfpn" Apr 24 14:31:54.939295 
ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:54.939253 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rk2md\" (UniqueName: \"kubernetes.io/projected/014c6264-9821-4227-a1a7-b8a5505a05e8-kube-api-access-rk2md\") pod \"odh-model-controller-696fc77849-phfpn\" (UID: \"014c6264-9821-4227-a1a7-b8a5505a05e8\") " pod="kserve/odh-model-controller-696fc77849-phfpn" Apr 24 14:31:54.939819 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:31:54.939366 2574 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 24 14:31:54.939819 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:31:54.939490 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/014c6264-9821-4227-a1a7-b8a5505a05e8-cert podName:014c6264-9821-4227-a1a7-b8a5505a05e8 nodeName:}" failed. No retries permitted until 2026-04-24 14:31:55.439461036 +0000 UTC m=+478.651312392 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/014c6264-9821-4227-a1a7-b8a5505a05e8-cert") pod "odh-model-controller-696fc77849-phfpn" (UID: "014c6264-9821-4227-a1a7-b8a5505a05e8") : secret "odh-model-controller-webhook-cert" not found Apr 24 14:31:54.947637 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:54.947614 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk2md\" (UniqueName: \"kubernetes.io/projected/014c6264-9821-4227-a1a7-b8a5505a05e8-kube-api-access-rk2md\") pod \"odh-model-controller-696fc77849-phfpn\" (UID: \"014c6264-9821-4227-a1a7-b8a5505a05e8\") " pod="kserve/odh-model-controller-696fc77849-phfpn" Apr 24 14:31:55.444988 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:55.444942 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/014c6264-9821-4227-a1a7-b8a5505a05e8-cert\") pod \"odh-model-controller-696fc77849-phfpn\" (UID: \"014c6264-9821-4227-a1a7-b8a5505a05e8\") " pod="kserve/odh-model-controller-696fc77849-phfpn" Apr 24 14:31:55.447339 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:55.447318 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/014c6264-9821-4227-a1a7-b8a5505a05e8-cert\") pod \"odh-model-controller-696fc77849-phfpn\" (UID: \"014c6264-9821-4227-a1a7-b8a5505a05e8\") " pod="kserve/odh-model-controller-696fc77849-phfpn" Apr 24 14:31:55.578093 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:55.578039 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-phfpn" Apr 24 14:31:55.700664 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:55.700640 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-phfpn"] Apr 24 14:31:55.703244 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:31:55.703206 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod014c6264_9821_4227_a1a7_b8a5505a05e8.slice/crio-f58ac5bb3e8718c861eca29823edbec3850906cac949e22eb446f5f52ba9a123 WatchSource:0}: Error finding container f58ac5bb3e8718c861eca29823edbec3850906cac949e22eb446f5f52ba9a123: Status 404 returned error can't find the container with id f58ac5bb3e8718c861eca29823edbec3850906cac949e22eb446f5f52ba9a123 Apr 24 14:31:55.980857 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:55.980778 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-phfpn" event={"ID":"014c6264-9821-4227-a1a7-b8a5505a05e8","Type":"ContainerStarted","Data":"f58ac5bb3e8718c861eca29823edbec3850906cac949e22eb446f5f52ba9a123"} Apr 24 14:31:57.767301 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.766518 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-75fdf897bc-4t9gp"] Apr 24 14:31:57.774145 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.774115 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75fdf897bc-4t9gp"] Apr 24 14:31:57.774952 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.774783 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.865752 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.865716 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce5e9ad3-fbed-43af-a12c-82685ad45427-console-config\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.865915 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.865826 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce5e9ad3-fbed-43af-a12c-82685ad45427-console-oauth-config\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.865915 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.865907 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce5e9ad3-fbed-43af-a12c-82685ad45427-trusted-ca-bundle\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.866007 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.865938 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7z8j\" (UniqueName: \"kubernetes.io/projected/ce5e9ad3-fbed-43af-a12c-82685ad45427-kube-api-access-x7z8j\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.866007 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.865978 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce5e9ad3-fbed-43af-a12c-82685ad45427-oauth-serving-cert\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.866098 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.866010 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce5e9ad3-fbed-43af-a12c-82685ad45427-console-serving-cert\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.866098 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.866054 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce5e9ad3-fbed-43af-a12c-82685ad45427-service-ca\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.966503 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.966459 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce5e9ad3-fbed-43af-a12c-82685ad45427-oauth-serving-cert\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.966695 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.966511 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce5e9ad3-fbed-43af-a12c-82685ad45427-console-serving-cert\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.966695 ip-10-0-131-216 
kubenswrapper[2574]: I0424 14:31:57.966549 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce5e9ad3-fbed-43af-a12c-82685ad45427-service-ca\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.966695 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.966583 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce5e9ad3-fbed-43af-a12c-82685ad45427-console-config\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.966695 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.966647 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce5e9ad3-fbed-43af-a12c-82685ad45427-console-oauth-config\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.966915 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.966718 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce5e9ad3-fbed-43af-a12c-82685ad45427-trusted-ca-bundle\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.966915 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.966745 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7z8j\" (UniqueName: \"kubernetes.io/projected/ce5e9ad3-fbed-43af-a12c-82685ad45427-kube-api-access-x7z8j\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " 
pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.967296 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.967232 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce5e9ad3-fbed-43af-a12c-82685ad45427-oauth-serving-cert\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.967484 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.967438 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce5e9ad3-fbed-43af-a12c-82685ad45427-service-ca\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.967743 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.967722 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce5e9ad3-fbed-43af-a12c-82685ad45427-console-config\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.967841 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.967780 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce5e9ad3-fbed-43af-a12c-82685ad45427-trusted-ca-bundle\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.969371 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.969350 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce5e9ad3-fbed-43af-a12c-82685ad45427-console-oauth-config\") pod \"console-75fdf897bc-4t9gp\" (UID: 
\"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.972236 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.972213 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce5e9ad3-fbed-43af-a12c-82685ad45427-console-serving-cert\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:57.974828 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:57.974807 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7z8j\" (UniqueName: \"kubernetes.io/projected/ce5e9ad3-fbed-43af-a12c-82685ad45427-kube-api-access-x7z8j\") pod \"console-75fdf897bc-4t9gp\" (UID: \"ce5e9ad3-fbed-43af-a12c-82685ad45427\") " pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:58.088224 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:58.088181 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:31:58.356722 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:58.356697 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75fdf897bc-4t9gp"] Apr 24 14:31:58.358983 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:31:58.358957 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce5e9ad3_fbed_43af_a12c_82685ad45427.slice/crio-8fa040efe777e35de4a105ce4bf111b85bec5db121f7cb3d5820174223477e1f WatchSource:0}: Error finding container 8fa040efe777e35de4a105ce4bf111b85bec5db121f7cb3d5820174223477e1f: Status 404 returned error can't find the container with id 8fa040efe777e35de4a105ce4bf111b85bec5db121f7cb3d5820174223477e1f Apr 24 14:31:58.993199 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:58.993162 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-phfpn" event={"ID":"014c6264-9821-4227-a1a7-b8a5505a05e8","Type":"ContainerStarted","Data":"b41101bd45aba18dd91ba774247e562604398fff644fa6fcce006f17cf9ed6db"} Apr 24 14:31:58.993650 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:58.993256 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-phfpn" Apr 24 14:31:58.994565 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:58.994526 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75fdf897bc-4t9gp" event={"ID":"ce5e9ad3-fbed-43af-a12c-82685ad45427","Type":"ContainerStarted","Data":"9c0f29acc5bf12fece6ef24a9e628c491f21f1e474aaa39ef5b1984dcd388273"} Apr 24 14:31:58.994565 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:58.994564 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75fdf897bc-4t9gp" 
event={"ID":"ce5e9ad3-fbed-43af-a12c-82685ad45427","Type":"ContainerStarted","Data":"8fa040efe777e35de4a105ce4bf111b85bec5db121f7cb3d5820174223477e1f"} Apr 24 14:31:59.011107 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:59.011061 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-phfpn" podStartSLOduration=2.427333721 podStartE2EDuration="5.011050063s" podCreationTimestamp="2026-04-24 14:31:54 +0000 UTC" firstStartedPulling="2026-04-24 14:31:55.704562328 +0000 UTC m=+478.916413684" lastFinishedPulling="2026-04-24 14:31:58.288278648 +0000 UTC m=+481.500130026" observedRunningTime="2026-04-24 14:31:59.008953225 +0000 UTC m=+482.220804600" watchObservedRunningTime="2026-04-24 14:31:59.011050063 +0000 UTC m=+482.222901504" Apr 24 14:31:59.026647 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:31:59.026608 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75fdf897bc-4t9gp" podStartSLOduration=2.026597721 podStartE2EDuration="2.026597721s" podCreationTimestamp="2026-04-24 14:31:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:31:59.024897945 +0000 UTC m=+482.236749321" watchObservedRunningTime="2026-04-24 14:31:59.026597721 +0000 UTC m=+482.238449096" Apr 24 14:32:08.088613 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:08.088581 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:32:08.088613 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:08.088619 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:32:08.092877 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:08.092857 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:32:09.033370 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:09.033339 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75fdf897bc-4t9gp" Apr 24 14:32:09.078799 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:09.078770 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ddf66748b-4gqzz"] Apr 24 14:32:10.000492 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:10.000463 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-phfpn" Apr 24 14:32:32.088499 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:32.088463 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd"] Apr 24 14:32:32.133234 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:32.133203 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd"] Apr 24 14:32:32.133379 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:32.133316 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" Apr 24 14:32:32.135347 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:32.135324 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-cnrls\"" Apr 24 14:32:32.239776 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:32.239744 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2ba4c0b-5c77-46ef-943c-b0439d16fb97-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd\" (UID: \"b2ba4c0b-5c77-46ef-943c-b0439d16fb97\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" Apr 24 14:32:32.341220 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:32.341136 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2ba4c0b-5c77-46ef-943c-b0439d16fb97-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd\" (UID: \"b2ba4c0b-5c77-46ef-943c-b0439d16fb97\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" Apr 24 14:32:32.341576 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:32.341552 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2ba4c0b-5c77-46ef-943c-b0439d16fb97-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd\" (UID: \"b2ba4c0b-5c77-46ef-943c-b0439d16fb97\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" Apr 24 14:32:32.445031 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:32.445002 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" Apr 24 14:32:32.577833 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:32.577793 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd"] Apr 24 14:32:32.581123 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:32:32.581090 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2ba4c0b_5c77_46ef_943c_b0439d16fb97.slice/crio-6f62fec93c86ad8420b1c90e4e44a3401fb1668f3c7243b00fd46ea7de60b892 WatchSource:0}: Error finding container 6f62fec93c86ad8420b1c90e4e44a3401fb1668f3c7243b00fd46ea7de60b892: Status 404 returned error can't find the container with id 6f62fec93c86ad8420b1c90e4e44a3401fb1668f3c7243b00fd46ea7de60b892 Apr 24 14:32:33.112062 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:33.112029 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" event={"ID":"b2ba4c0b-5c77-46ef-943c-b0439d16fb97","Type":"ContainerStarted","Data":"6f62fec93c86ad8420b1c90e4e44a3401fb1668f3c7243b00fd46ea7de60b892"} Apr 24 14:32:34.102829 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.102790 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-ddf66748b-4gqzz" podUID="c30c1dd8-3f26-4f33-a121-f2db2dad2baf" containerName="console" containerID="cri-o://dec79f24515085b4cf4e9ccc7f1b63cdb15888209e2c9fac304c59c584cf1b6a" gracePeriod=15 Apr 24 14:32:34.367470 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.367409 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ddf66748b-4gqzz_c30c1dd8-3f26-4f33-a121-f2db2dad2baf/console/0.log" Apr 24 14:32:34.367794 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.367473 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-ddf66748b-4gqzz" Apr 24 14:32:34.459667 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.459630 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-serving-cert\") pod \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " Apr 24 14:32:34.459841 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.459682 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-trusted-ca-bundle\") pod \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " Apr 24 14:32:34.459841 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.459720 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-oauth-config\") pod \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " Apr 24 14:32:34.459841 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.459755 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-service-ca\") pod \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " Apr 24 14:32:34.459841 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.459827 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m77c8\" (UniqueName: \"kubernetes.io/projected/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-kube-api-access-m77c8\") pod \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " Apr 24 14:32:34.460045 
ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.459864 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-config\") pod \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " Apr 24 14:32:34.460045 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.459909 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-oauth-serving-cert\") pod \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\" (UID: \"c30c1dd8-3f26-4f33-a121-f2db2dad2baf\") " Apr 24 14:32:34.460319 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.460283 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c30c1dd8-3f26-4f33-a121-f2db2dad2baf" (UID: "c30c1dd8-3f26-4f33-a121-f2db2dad2baf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:32:34.460510 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.460289 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-service-ca" (OuterVolumeSpecName: "service-ca") pod "c30c1dd8-3f26-4f33-a121-f2db2dad2baf" (UID: "c30c1dd8-3f26-4f33-a121-f2db2dad2baf"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:32:34.460510 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.460491 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c30c1dd8-3f26-4f33-a121-f2db2dad2baf" (UID: "c30c1dd8-3f26-4f33-a121-f2db2dad2baf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:32:34.460644 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.460506 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-config" (OuterVolumeSpecName: "console-config") pod "c30c1dd8-3f26-4f33-a121-f2db2dad2baf" (UID: "c30c1dd8-3f26-4f33-a121-f2db2dad2baf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:32:34.462102 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.462073 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-kube-api-access-m77c8" (OuterVolumeSpecName: "kube-api-access-m77c8") pod "c30c1dd8-3f26-4f33-a121-f2db2dad2baf" (UID: "c30c1dd8-3f26-4f33-a121-f2db2dad2baf"). InnerVolumeSpecName "kube-api-access-m77c8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:32:34.462102 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.462087 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c30c1dd8-3f26-4f33-a121-f2db2dad2baf" (UID: "c30c1dd8-3f26-4f33-a121-f2db2dad2baf"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:32:34.462233 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.462139 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c30c1dd8-3f26-4f33-a121-f2db2dad2baf" (UID: "c30c1dd8-3f26-4f33-a121-f2db2dad2baf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:32:34.560772 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.560735 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m77c8\" (UniqueName: \"kubernetes.io/projected/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-kube-api-access-m77c8\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:32:34.560772 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.560764 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-config\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:32:34.560772 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.560774 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-oauth-serving-cert\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:32:34.560989 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.560785 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-serving-cert\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:32:34.560989 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.560794 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-trusted-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:32:34.560989 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.560803 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-console-oauth-config\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:32:34.560989 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:34.560812 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c30c1dd8-3f26-4f33-a121-f2db2dad2baf-service-ca\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:32:35.122577 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:35.122542 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ddf66748b-4gqzz_c30c1dd8-3f26-4f33-a121-f2db2dad2baf/console/0.log" Apr 24 14:32:35.122763 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:35.122587 2574 generic.go:358] "Generic (PLEG): container finished" podID="c30c1dd8-3f26-4f33-a121-f2db2dad2baf" containerID="dec79f24515085b4cf4e9ccc7f1b63cdb15888209e2c9fac304c59c584cf1b6a" exitCode=2 Apr 24 14:32:35.122763 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:35.122640 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ddf66748b-4gqzz" event={"ID":"c30c1dd8-3f26-4f33-a121-f2db2dad2baf","Type":"ContainerDied","Data":"dec79f24515085b4cf4e9ccc7f1b63cdb15888209e2c9fac304c59c584cf1b6a"} Apr 24 14:32:35.122763 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:35.122652 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-ddf66748b-4gqzz" Apr 24 14:32:35.122763 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:35.122674 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ddf66748b-4gqzz" event={"ID":"c30c1dd8-3f26-4f33-a121-f2db2dad2baf","Type":"ContainerDied","Data":"92698f95121f0320cd662a46211d5489009a27475252c310897c1a2568334838"} Apr 24 14:32:35.122763 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:35.122694 2574 scope.go:117] "RemoveContainer" containerID="dec79f24515085b4cf4e9ccc7f1b63cdb15888209e2c9fac304c59c584cf1b6a" Apr 24 14:32:35.131095 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:35.131080 2574 scope.go:117] "RemoveContainer" containerID="dec79f24515085b4cf4e9ccc7f1b63cdb15888209e2c9fac304c59c584cf1b6a" Apr 24 14:32:35.131337 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:32:35.131317 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dec79f24515085b4cf4e9ccc7f1b63cdb15888209e2c9fac304c59c584cf1b6a\": container with ID starting with dec79f24515085b4cf4e9ccc7f1b63cdb15888209e2c9fac304c59c584cf1b6a not found: ID does not exist" containerID="dec79f24515085b4cf4e9ccc7f1b63cdb15888209e2c9fac304c59c584cf1b6a" Apr 24 14:32:35.131424 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:35.131349 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dec79f24515085b4cf4e9ccc7f1b63cdb15888209e2c9fac304c59c584cf1b6a"} err="failed to get container status \"dec79f24515085b4cf4e9ccc7f1b63cdb15888209e2c9fac304c59c584cf1b6a\": rpc error: code = NotFound desc = could not find container \"dec79f24515085b4cf4e9ccc7f1b63cdb15888209e2c9fac304c59c584cf1b6a\": container with ID starting with dec79f24515085b4cf4e9ccc7f1b63cdb15888209e2c9fac304c59c584cf1b6a not found: ID does not exist" Apr 24 14:32:35.142581 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:35.142551 2574 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ddf66748b-4gqzz"] Apr 24 14:32:35.146428 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:35.146387 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-ddf66748b-4gqzz"] Apr 24 14:32:35.360848 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:35.360817 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c30c1dd8-3f26-4f33-a121-f2db2dad2baf" path="/var/lib/kubelet/pods/c30c1dd8-3f26-4f33-a121-f2db2dad2baf/volumes" Apr 24 14:32:37.140744 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:37.140700 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" event={"ID":"b2ba4c0b-5c77-46ef-943c-b0439d16fb97","Type":"ContainerStarted","Data":"08a37308093d791d67bd273db12e413e8f8b47cfb73a3f82bd31943288a919d5"} Apr 24 14:32:41.155546 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:41.155514 2574 generic.go:358] "Generic (PLEG): container finished" podID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerID="08a37308093d791d67bd273db12e413e8f8b47cfb73a3f82bd31943288a919d5" exitCode=0 Apr 24 14:32:41.155895 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:41.155595 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" event={"ID":"b2ba4c0b-5c77-46ef-943c-b0439d16fb97","Type":"ContainerDied","Data":"08a37308093d791d67bd273db12e413e8f8b47cfb73a3f82bd31943288a919d5"} Apr 24 14:32:55.211822 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:55.211781 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" event={"ID":"b2ba4c0b-5c77-46ef-943c-b0439d16fb97","Type":"ContainerStarted","Data":"4fbfb674752d81754a802ea92838b0e041acfe6e6b0fbb8ef4d2b78713917fef"} Apr 24 14:32:55.212201 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:55.212064 2574 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" Apr 24 14:32:55.213119 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:55.213076 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" podUID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 14:32:55.227278 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:55.227234 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" podStartSLOduration=1.511382057 podStartE2EDuration="23.227221533s" podCreationTimestamp="2026-04-24 14:32:32 +0000 UTC" firstStartedPulling="2026-04-24 14:32:32.58303458 +0000 UTC m=+515.794885934" lastFinishedPulling="2026-04-24 14:32:54.29887405 +0000 UTC m=+537.510725410" observedRunningTime="2026-04-24 14:32:55.226167281 +0000 UTC m=+538.438018662" watchObservedRunningTime="2026-04-24 14:32:55.227221533 +0000 UTC m=+538.439072908" Apr 24 14:32:56.214976 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:32:56.214940 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" podUID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 14:33:06.215527 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:33:06.215485 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" podUID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 14:33:16.215676 ip-10-0-131-216 
kubenswrapper[2574]: I0424 14:33:16.215632 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" podUID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 14:33:26.215078 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:33:26.215038 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" podUID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 14:33:36.215360 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:33:36.215318 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" podUID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 14:33:46.214995 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:33:46.214956 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" podUID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 14:33:56.215597 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:33:56.215552 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" podUID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 14:33:57.272122 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:33:57.272095 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 14:33:57.273905 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:33:57.273881 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 14:34:01.358832 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:01.358806 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" Apr 24 14:34:01.901325 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:01.901290 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9"] Apr 24 14:34:01.901692 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:01.901675 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c30c1dd8-3f26-4f33-a121-f2db2dad2baf" containerName="console" Apr 24 14:34:01.901692 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:01.901693 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30c1dd8-3f26-4f33-a121-f2db2dad2baf" containerName="console" Apr 24 14:34:01.901850 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:01.901765 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c30c1dd8-3f26-4f33-a121-f2db2dad2baf" containerName="console" Apr 24 14:34:01.904638 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:01.904620 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" Apr 24 14:34:01.906531 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:01.906509 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-f7779-kube-rbac-proxy-sar-config\"" Apr 24 14:34:01.906628 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:01.906600 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-f7779-serving-cert\"" Apr 24 14:34:01.906674 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:01.906643 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 14:34:01.914130 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:01.914101 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9"] Apr 24 14:34:02.072234 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:02.072194 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5b7ae6c-0e26-4b00-8332-5f0bb85963d5-openshift-service-ca-bundle\") pod \"switch-graph-f7779-7d4b88fb5-rlzs9\" (UID: \"e5b7ae6c-0e26-4b00-8332-5f0bb85963d5\") " pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" Apr 24 14:34:02.072436 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:02.072329 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5b7ae6c-0e26-4b00-8332-5f0bb85963d5-proxy-tls\") pod \"switch-graph-f7779-7d4b88fb5-rlzs9\" (UID: \"e5b7ae6c-0e26-4b00-8332-5f0bb85963d5\") " pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" Apr 24 14:34:02.172885 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:02.172801 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5b7ae6c-0e26-4b00-8332-5f0bb85963d5-proxy-tls\") pod \"switch-graph-f7779-7d4b88fb5-rlzs9\" (UID: \"e5b7ae6c-0e26-4b00-8332-5f0bb85963d5\") " pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" Apr 24 14:34:02.172885 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:02.172849 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5b7ae6c-0e26-4b00-8332-5f0bb85963d5-openshift-service-ca-bundle\") pod \"switch-graph-f7779-7d4b88fb5-rlzs9\" (UID: \"e5b7ae6c-0e26-4b00-8332-5f0bb85963d5\") " pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" Apr 24 14:34:02.173481 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:02.173452 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5b7ae6c-0e26-4b00-8332-5f0bb85963d5-openshift-service-ca-bundle\") pod \"switch-graph-f7779-7d4b88fb5-rlzs9\" (UID: \"e5b7ae6c-0e26-4b00-8332-5f0bb85963d5\") " pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" Apr 24 14:34:02.175247 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:02.175229 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5b7ae6c-0e26-4b00-8332-5f0bb85963d5-proxy-tls\") pod \"switch-graph-f7779-7d4b88fb5-rlzs9\" (UID: \"e5b7ae6c-0e26-4b00-8332-5f0bb85963d5\") " pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" Apr 24 14:34:02.215846 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:02.215816 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" Apr 24 14:34:02.335796 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:02.335763 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9"] Apr 24 14:34:02.338951 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:34:02.338924 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5b7ae6c_0e26_4b00_8332_5f0bb85963d5.slice/crio-477e9be6d9b975f3e4ede6f5c2d3a19711063a2934f4e875f58c2b78cc5cb411 WatchSource:0}: Error finding container 477e9be6d9b975f3e4ede6f5c2d3a19711063a2934f4e875f58c2b78cc5cb411: Status 404 returned error can't find the container with id 477e9be6d9b975f3e4ede6f5c2d3a19711063a2934f4e875f58c2b78cc5cb411 Apr 24 14:34:02.450060 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:02.449970 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" event={"ID":"e5b7ae6c-0e26-4b00-8332-5f0bb85963d5","Type":"ContainerStarted","Data":"477e9be6d9b975f3e4ede6f5c2d3a19711063a2934f4e875f58c2b78cc5cb411"} Apr 24 14:34:05.463753 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:05.463708 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" event={"ID":"e5b7ae6c-0e26-4b00-8332-5f0bb85963d5","Type":"ContainerStarted","Data":"82985007f586054502e07e6fc911eedf616ff442d5009cf04a87d78161687045"} Apr 24 14:34:05.464111 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:05.463766 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" Apr 24 14:34:05.480657 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:05.480564 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" 
podStartSLOduration=1.5940294609999999 podStartE2EDuration="4.480548935s" podCreationTimestamp="2026-04-24 14:34:01 +0000 UTC" firstStartedPulling="2026-04-24 14:34:02.340663697 +0000 UTC m=+605.552515050" lastFinishedPulling="2026-04-24 14:34:05.227183167 +0000 UTC m=+608.439034524" observedRunningTime="2026-04-24 14:34:05.479138723 +0000 UTC m=+608.690990080" watchObservedRunningTime="2026-04-24 14:34:05.480548935 +0000 UTC m=+608.692400346" Apr 24 14:34:11.473386 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:11.473295 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" Apr 24 14:34:12.063159 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:12.063122 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9"] Apr 24 14:34:12.063369 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:12.063338 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" podUID="e5b7ae6c-0e26-4b00-8332-5f0bb85963d5" containerName="switch-graph-f7779" containerID="cri-o://82985007f586054502e07e6fc911eedf616ff442d5009cf04a87d78161687045" gracePeriod=30 Apr 24 14:34:16.472176 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:16.472095 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" podUID="e5b7ae6c-0e26-4b00-8332-5f0bb85963d5" containerName="switch-graph-f7779" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:21.471239 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:21.471196 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" podUID="e5b7ae6c-0e26-4b00-8332-5f0bb85963d5" containerName="switch-graph-f7779" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:26.471820 
ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:26.471784 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" podUID="e5b7ae6c-0e26-4b00-8332-5f0bb85963d5" containerName="switch-graph-f7779" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:26.472278 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:26.471906 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" Apr 24 14:34:31.472471 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:31.472419 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" podUID="e5b7ae6c-0e26-4b00-8332-5f0bb85963d5" containerName="switch-graph-f7779" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:36.472191 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:36.472145 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" podUID="e5b7ae6c-0e26-4b00-8332-5f0bb85963d5" containerName="switch-graph-f7779" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:41.471959 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:41.471923 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" podUID="e5b7ae6c-0e26-4b00-8332-5f0bb85963d5" containerName="switch-graph-f7779" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:41.937633 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:41.937596 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-577dc58474-plfqd"] Apr 24 14:34:41.942525 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:41.942502 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" Apr 24 14:34:41.944427 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:41.944388 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 24 14:34:41.944549 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:41.944411 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 24 14:34:41.951797 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:41.951771 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-577dc58474-plfqd"] Apr 24 14:34:41.992643 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:41.992605 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28aa4eb4-9a9f-4ff3-b891-d7349362a921-openshift-service-ca-bundle\") pod \"model-chainer-577dc58474-plfqd\" (UID: \"28aa4eb4-9a9f-4ff3-b891-d7349362a921\") " pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" Apr 24 14:34:41.992794 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:41.992715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28aa4eb4-9a9f-4ff3-b891-d7349362a921-proxy-tls\") pod \"model-chainer-577dc58474-plfqd\" (UID: \"28aa4eb4-9a9f-4ff3-b891-d7349362a921\") " pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" Apr 24 14:34:42.094245 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.094205 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28aa4eb4-9a9f-4ff3-b891-d7349362a921-openshift-service-ca-bundle\") pod \"model-chainer-577dc58474-plfqd\" (UID: 
\"28aa4eb4-9a9f-4ff3-b891-d7349362a921\") " pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" Apr 24 14:34:42.094413 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.094298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28aa4eb4-9a9f-4ff3-b891-d7349362a921-proxy-tls\") pod \"model-chainer-577dc58474-plfqd\" (UID: \"28aa4eb4-9a9f-4ff3-b891-d7349362a921\") " pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" Apr 24 14:34:42.094972 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.094947 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28aa4eb4-9a9f-4ff3-b891-d7349362a921-openshift-service-ca-bundle\") pod \"model-chainer-577dc58474-plfqd\" (UID: \"28aa4eb4-9a9f-4ff3-b891-d7349362a921\") " pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" Apr 24 14:34:42.096919 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.096894 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28aa4eb4-9a9f-4ff3-b891-d7349362a921-proxy-tls\") pod \"model-chainer-577dc58474-plfqd\" (UID: \"28aa4eb4-9a9f-4ff3-b891-d7349362a921\") " pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" Apr 24 14:34:42.209818 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.209796 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" Apr 24 14:34:42.253751 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.253722 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" Apr 24 14:34:42.296735 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.296700 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5b7ae6c-0e26-4b00-8332-5f0bb85963d5-openshift-service-ca-bundle\") pod \"e5b7ae6c-0e26-4b00-8332-5f0bb85963d5\" (UID: \"e5b7ae6c-0e26-4b00-8332-5f0bb85963d5\") " Apr 24 14:34:42.297096 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.296797 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5b7ae6c-0e26-4b00-8332-5f0bb85963d5-proxy-tls\") pod \"e5b7ae6c-0e26-4b00-8332-5f0bb85963d5\" (UID: \"e5b7ae6c-0e26-4b00-8332-5f0bb85963d5\") " Apr 24 14:34:42.297096 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.297039 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5b7ae6c-0e26-4b00-8332-5f0bb85963d5-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "e5b7ae6c-0e26-4b00-8332-5f0bb85963d5" (UID: "e5b7ae6c-0e26-4b00-8332-5f0bb85963d5"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:34:42.299025 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.298998 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b7ae6c-0e26-4b00-8332-5f0bb85963d5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e5b7ae6c-0e26-4b00-8332-5f0bb85963d5" (UID: "e5b7ae6c-0e26-4b00-8332-5f0bb85963d5"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:34:42.374725 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.374644 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-577dc58474-plfqd"] Apr 24 14:34:42.376995 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:34:42.376964 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28aa4eb4_9a9f_4ff3_b891_d7349362a921.slice/crio-45c1e8daa92abc851fbf32232c5989de8a32d3ef6d885a912f1338cf48e46f3d WatchSource:0}: Error finding container 45c1e8daa92abc851fbf32232c5989de8a32d3ef6d885a912f1338cf48e46f3d: Status 404 returned error can't find the container with id 45c1e8daa92abc851fbf32232c5989de8a32d3ef6d885a912f1338cf48e46f3d Apr 24 14:34:42.378684 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.378668 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:34:42.397547 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.397522 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5b7ae6c-0e26-4b00-8332-5f0bb85963d5-openshift-service-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:34:42.397627 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.397548 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5b7ae6c-0e26-4b00-8332-5f0bb85963d5-proxy-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:34:42.585369 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.585340 2574 generic.go:358] "Generic (PLEG): container finished" podID="e5b7ae6c-0e26-4b00-8332-5f0bb85963d5" containerID="82985007f586054502e07e6fc911eedf616ff442d5009cf04a87d78161687045" exitCode=0 Apr 24 14:34:42.585835 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.585421 2574 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" Apr 24 14:34:42.585835 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.585440 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" event={"ID":"e5b7ae6c-0e26-4b00-8332-5f0bb85963d5","Type":"ContainerDied","Data":"82985007f586054502e07e6fc911eedf616ff442d5009cf04a87d78161687045"} Apr 24 14:34:42.585835 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.585484 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9" event={"ID":"e5b7ae6c-0e26-4b00-8332-5f0bb85963d5","Type":"ContainerDied","Data":"477e9be6d9b975f3e4ede6f5c2d3a19711063a2934f4e875f58c2b78cc5cb411"} Apr 24 14:34:42.585835 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.585500 2574 scope.go:117] "RemoveContainer" containerID="82985007f586054502e07e6fc911eedf616ff442d5009cf04a87d78161687045" Apr 24 14:34:42.586965 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.586939 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" event={"ID":"28aa4eb4-9a9f-4ff3-b891-d7349362a921","Type":"ContainerStarted","Data":"53ce4fdff474c4441805e62d57f264fbb51d75f1dd64ff181fd18d75e9d8e190"} Apr 24 14:34:42.587062 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.586983 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" Apr 24 14:34:42.587062 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.586998 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" event={"ID":"28aa4eb4-9a9f-4ff3-b891-d7349362a921","Type":"ContainerStarted","Data":"45c1e8daa92abc851fbf32232c5989de8a32d3ef6d885a912f1338cf48e46f3d"} Apr 24 14:34:42.594105 ip-10-0-131-216 kubenswrapper[2574]: I0424 
14:34:42.594087 2574 scope.go:117] "RemoveContainer" containerID="82985007f586054502e07e6fc911eedf616ff442d5009cf04a87d78161687045" Apr 24 14:34:42.594348 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:34:42.594330 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82985007f586054502e07e6fc911eedf616ff442d5009cf04a87d78161687045\": container with ID starting with 82985007f586054502e07e6fc911eedf616ff442d5009cf04a87d78161687045 not found: ID does not exist" containerID="82985007f586054502e07e6fc911eedf616ff442d5009cf04a87d78161687045" Apr 24 14:34:42.594420 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.594359 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82985007f586054502e07e6fc911eedf616ff442d5009cf04a87d78161687045"} err="failed to get container status \"82985007f586054502e07e6fc911eedf616ff442d5009cf04a87d78161687045\": rpc error: code = NotFound desc = could not find container \"82985007f586054502e07e6fc911eedf616ff442d5009cf04a87d78161687045\": container with ID starting with 82985007f586054502e07e6fc911eedf616ff442d5009cf04a87d78161687045 not found: ID does not exist" Apr 24 14:34:42.603209 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.603170 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" podStartSLOduration=1.6031331770000001 podStartE2EDuration="1.603133177s" podCreationTimestamp="2026-04-24 14:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:34:42.601434866 +0000 UTC m=+645.813286242" watchObservedRunningTime="2026-04-24 14:34:42.603133177 +0000 UTC m=+645.814984552" Apr 24 14:34:42.616855 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.616825 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9"] Apr 24 14:34:42.617033 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:42.617021 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-f7779-7d4b88fb5-rlzs9"] Apr 24 14:34:43.359287 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:43.359208 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5b7ae6c-0e26-4b00-8332-5f0bb85963d5" path="/var/lib/kubelet/pods/e5b7ae6c-0e26-4b00-8332-5f0bb85963d5/volumes" Apr 24 14:34:48.598076 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:48.598045 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" Apr 24 14:34:52.029577 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:52.029542 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-577dc58474-plfqd"] Apr 24 14:34:52.029981 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:52.029740 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" podUID="28aa4eb4-9a9f-4ff3-b891-d7349362a921" containerName="model-chainer" containerID="cri-o://53ce4fdff474c4441805e62d57f264fbb51d75f1dd64ff181fd18d75e9d8e190" gracePeriod=30 Apr 24 14:34:52.188674 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:52.188639 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd"] Apr 24 14:34:52.188946 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:52.188924 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" podUID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerName="kserve-container" containerID="cri-o://4fbfb674752d81754a802ea92838b0e041acfe6e6b0fbb8ef4d2b78713917fef" gracePeriod=30 Apr 24 14:34:53.596535 ip-10-0-131-216 
kubenswrapper[2574]: I0424 14:34:53.596497 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" podUID="28aa4eb4-9a9f-4ff3-b891-d7349362a921" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:34:56.636634 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.636609 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" Apr 24 14:34:56.639037 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.639009 2574 generic.go:358] "Generic (PLEG): container finished" podID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerID="4fbfb674752d81754a802ea92838b0e041acfe6e6b0fbb8ef4d2b78713917fef" exitCode=0 Apr 24 14:34:56.639144 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.639067 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" Apr 24 14:34:56.639144 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.639091 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" event={"ID":"b2ba4c0b-5c77-46ef-943c-b0439d16fb97","Type":"ContainerDied","Data":"4fbfb674752d81754a802ea92838b0e041acfe6e6b0fbb8ef4d2b78713917fef"} Apr 24 14:34:56.639144 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.639130 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd" event={"ID":"b2ba4c0b-5c77-46ef-943c-b0439d16fb97","Type":"ContainerDied","Data":"6f62fec93c86ad8420b1c90e4e44a3401fb1668f3c7243b00fd46ea7de60b892"} Apr 24 14:34:56.639283 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.639151 2574 scope.go:117] "RemoveContainer" containerID="4fbfb674752d81754a802ea92838b0e041acfe6e6b0fbb8ef4d2b78713917fef" Apr 24 14:34:56.649138 
ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.649111 2574 scope.go:117] "RemoveContainer" containerID="08a37308093d791d67bd273db12e413e8f8b47cfb73a3f82bd31943288a919d5" Apr 24 14:34:56.657255 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.657230 2574 scope.go:117] "RemoveContainer" containerID="4fbfb674752d81754a802ea92838b0e041acfe6e6b0fbb8ef4d2b78713917fef" Apr 24 14:34:56.657514 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:34:56.657493 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fbfb674752d81754a802ea92838b0e041acfe6e6b0fbb8ef4d2b78713917fef\": container with ID starting with 4fbfb674752d81754a802ea92838b0e041acfe6e6b0fbb8ef4d2b78713917fef not found: ID does not exist" containerID="4fbfb674752d81754a802ea92838b0e041acfe6e6b0fbb8ef4d2b78713917fef" Apr 24 14:34:56.657585 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.657523 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fbfb674752d81754a802ea92838b0e041acfe6e6b0fbb8ef4d2b78713917fef"} err="failed to get container status \"4fbfb674752d81754a802ea92838b0e041acfe6e6b0fbb8ef4d2b78713917fef\": rpc error: code = NotFound desc = could not find container \"4fbfb674752d81754a802ea92838b0e041acfe6e6b0fbb8ef4d2b78713917fef\": container with ID starting with 4fbfb674752d81754a802ea92838b0e041acfe6e6b0fbb8ef4d2b78713917fef not found: ID does not exist" Apr 24 14:34:56.657585 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.657542 2574 scope.go:117] "RemoveContainer" containerID="08a37308093d791d67bd273db12e413e8f8b47cfb73a3f82bd31943288a919d5" Apr 24 14:34:56.657829 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:34:56.657812 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08a37308093d791d67bd273db12e413e8f8b47cfb73a3f82bd31943288a919d5\": container with ID starting with 
08a37308093d791d67bd273db12e413e8f8b47cfb73a3f82bd31943288a919d5 not found: ID does not exist" containerID="08a37308093d791d67bd273db12e413e8f8b47cfb73a3f82bd31943288a919d5" Apr 24 14:34:56.657876 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.657837 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08a37308093d791d67bd273db12e413e8f8b47cfb73a3f82bd31943288a919d5"} err="failed to get container status \"08a37308093d791d67bd273db12e413e8f8b47cfb73a3f82bd31943288a919d5\": rpc error: code = NotFound desc = could not find container \"08a37308093d791d67bd273db12e413e8f8b47cfb73a3f82bd31943288a919d5\": container with ID starting with 08a37308093d791d67bd273db12e413e8f8b47cfb73a3f82bd31943288a919d5 not found: ID does not exist" Apr 24 14:34:56.719574 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.719488 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2ba4c0b-5c77-46ef-943c-b0439d16fb97-kserve-provision-location\") pod \"b2ba4c0b-5c77-46ef-943c-b0439d16fb97\" (UID: \"b2ba4c0b-5c77-46ef-943c-b0439d16fb97\") " Apr 24 14:34:56.719865 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.719842 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2ba4c0b-5c77-46ef-943c-b0439d16fb97-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b2ba4c0b-5c77-46ef-943c-b0439d16fb97" (UID: "b2ba4c0b-5c77-46ef-943c-b0439d16fb97"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:34:56.821056 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.821019 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2ba4c0b-5c77-46ef-943c-b0439d16fb97-kserve-provision-location\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:34:56.959607 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.959577 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd"] Apr 24 14:34:56.963006 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:56.962979 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7cfc94c67b-rdvnd"] Apr 24 14:34:57.358772 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:57.358741 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" path="/var/lib/kubelet/pods/b2ba4c0b-5c77-46ef-943c-b0439d16fb97/volumes" Apr 24 14:34:58.596409 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:34:58.596361 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" podUID="28aa4eb4-9a9f-4ff3-b891-d7349362a921" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:35:03.596283 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:03.596240 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" podUID="28aa4eb4-9a9f-4ff3-b891-d7349362a921" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:35:03.596680 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:03.596348 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" Apr 24 
14:35:08.596513 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:08.596473 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" podUID="28aa4eb4-9a9f-4ff3-b891-d7349362a921" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:35:13.595956 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:13.595910 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" podUID="28aa4eb4-9a9f-4ff3-b891-d7349362a921" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:35:18.595838 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:18.595796 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" podUID="28aa4eb4-9a9f-4ff3-b891-d7349362a921" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:35:22.059793 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:35:22.059757 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28aa4eb4_9a9f_4ff3_b891_d7349362a921.slice/crio-conmon-53ce4fdff474c4441805e62d57f264fbb51d75f1dd64ff181fd18d75e9d8e190.scope\": RecentStats: unable to find data in memory cache]" Apr 24 14:35:22.060754 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:35:22.059831 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28aa4eb4_9a9f_4ff3_b891_d7349362a921.slice/crio-conmon-53ce4fdff474c4441805e62d57f264fbb51d75f1dd64ff181fd18d75e9d8e190.scope\": RecentStats: unable to find data in memory cache]" Apr 24 14:35:22.179054 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.179031 2574 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" Apr 24 14:35:22.233445 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.233411 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28aa4eb4-9a9f-4ff3-b891-d7349362a921-proxy-tls\") pod \"28aa4eb4-9a9f-4ff3-b891-d7349362a921\" (UID: \"28aa4eb4-9a9f-4ff3-b891-d7349362a921\") " Apr 24 14:35:22.233587 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.233464 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28aa4eb4-9a9f-4ff3-b891-d7349362a921-openshift-service-ca-bundle\") pod \"28aa4eb4-9a9f-4ff3-b891-d7349362a921\" (UID: \"28aa4eb4-9a9f-4ff3-b891-d7349362a921\") " Apr 24 14:35:22.233870 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.233844 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28aa4eb4-9a9f-4ff3-b891-d7349362a921-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "28aa4eb4-9a9f-4ff3-b891-d7349362a921" (UID: "28aa4eb4-9a9f-4ff3-b891-d7349362a921"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:35:22.235603 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.235572 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28aa4eb4-9a9f-4ff3-b891-d7349362a921-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "28aa4eb4-9a9f-4ff3-b891-d7349362a921" (UID: "28aa4eb4-9a9f-4ff3-b891-d7349362a921"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:35:22.311831 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.311804 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82"] Apr 24 14:35:22.312155 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.312144 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5b7ae6c-0e26-4b00-8332-5f0bb85963d5" containerName="switch-graph-f7779" Apr 24 14:35:22.312208 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.312157 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b7ae6c-0e26-4b00-8332-5f0bb85963d5" containerName="switch-graph-f7779" Apr 24 14:35:22.312208 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.312167 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerName="kserve-container" Apr 24 14:35:22.312208 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.312172 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerName="kserve-container" Apr 24 14:35:22.312208 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.312188 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerName="storage-initializer" Apr 24 14:35:22.312208 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.312194 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerName="storage-initializer" Apr 24 14:35:22.312208 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.312207 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28aa4eb4-9a9f-4ff3-b891-d7349362a921" containerName="model-chainer" Apr 24 14:35:22.312383 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.312213 2574 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="28aa4eb4-9a9f-4ff3-b891-d7349362a921" containerName="model-chainer" Apr 24 14:35:22.312383 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.312260 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2ba4c0b-5c77-46ef-943c-b0439d16fb97" containerName="kserve-container" Apr 24 14:35:22.312383 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.312271 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5b7ae6c-0e26-4b00-8332-5f0bb85963d5" containerName="switch-graph-f7779" Apr 24 14:35:22.312383 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.312279 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="28aa4eb4-9a9f-4ff3-b891-d7349362a921" containerName="model-chainer" Apr 24 14:35:22.315442 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.315422 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" Apr 24 14:35:22.317415 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.317383 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-96b39-kube-rbac-proxy-sar-config\"" Apr 24 14:35:22.317511 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.317423 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-96b39-serving-cert\"" Apr 24 14:35:22.323271 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.323245 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82"] Apr 24 14:35:22.335020 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.334996 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/540c364b-d84c-46e4-b95b-9f454a7072aa-openshift-service-ca-bundle\") pod \"switch-graph-96b39-559b467444-k8w82\" (UID: 
\"540c364b-d84c-46e4-b95b-9f454a7072aa\") " pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" Apr 24 14:35:22.335124 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.335106 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/540c364b-d84c-46e4-b95b-9f454a7072aa-proxy-tls\") pod \"switch-graph-96b39-559b467444-k8w82\" (UID: \"540c364b-d84c-46e4-b95b-9f454a7072aa\") " pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" Apr 24 14:35:22.335190 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.335178 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28aa4eb4-9a9f-4ff3-b891-d7349362a921-proxy-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:35:22.335233 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.335198 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28aa4eb4-9a9f-4ff3-b891-d7349362a921-openshift-service-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:35:22.435614 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.435578 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/540c364b-d84c-46e4-b95b-9f454a7072aa-proxy-tls\") pod \"switch-graph-96b39-559b467444-k8w82\" (UID: \"540c364b-d84c-46e4-b95b-9f454a7072aa\") " pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" Apr 24 14:35:22.435803 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.435641 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/540c364b-d84c-46e4-b95b-9f454a7072aa-openshift-service-ca-bundle\") pod \"switch-graph-96b39-559b467444-k8w82\" (UID: \"540c364b-d84c-46e4-b95b-9f454a7072aa\") " 
pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" Apr 24 14:35:22.435803 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:35:22.435714 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-96b39-serving-cert: secret "switch-graph-96b39-serving-cert" not found Apr 24 14:35:22.435803 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:35:22.435780 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/540c364b-d84c-46e4-b95b-9f454a7072aa-proxy-tls podName:540c364b-d84c-46e4-b95b-9f454a7072aa nodeName:}" failed. No retries permitted until 2026-04-24 14:35:22.935760982 +0000 UTC m=+686.147612335 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/540c364b-d84c-46e4-b95b-9f454a7072aa-proxy-tls") pod "switch-graph-96b39-559b467444-k8w82" (UID: "540c364b-d84c-46e4-b95b-9f454a7072aa") : secret "switch-graph-96b39-serving-cert" not found Apr 24 14:35:22.436296 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.436276 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/540c364b-d84c-46e4-b95b-9f454a7072aa-openshift-service-ca-bundle\") pod \"switch-graph-96b39-559b467444-k8w82\" (UID: \"540c364b-d84c-46e4-b95b-9f454a7072aa\") " pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" Apr 24 14:35:22.724510 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.724420 2574 generic.go:358] "Generic (PLEG): container finished" podID="28aa4eb4-9a9f-4ff3-b891-d7349362a921" containerID="53ce4fdff474c4441805e62d57f264fbb51d75f1dd64ff181fd18d75e9d8e190" exitCode=0 Apr 24 14:35:22.724510 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.724479 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" 
event={"ID":"28aa4eb4-9a9f-4ff3-b891-d7349362a921","Type":"ContainerDied","Data":"53ce4fdff474c4441805e62d57f264fbb51d75f1dd64ff181fd18d75e9d8e190"} Apr 24 14:35:22.724701 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.724521 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" event={"ID":"28aa4eb4-9a9f-4ff3-b891-d7349362a921","Type":"ContainerDied","Data":"45c1e8daa92abc851fbf32232c5989de8a32d3ef6d885a912f1338cf48e46f3d"} Apr 24 14:35:22.724701 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.724533 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-577dc58474-plfqd" Apr 24 14:35:22.724701 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.724537 2574 scope.go:117] "RemoveContainer" containerID="53ce4fdff474c4441805e62d57f264fbb51d75f1dd64ff181fd18d75e9d8e190" Apr 24 14:35:22.732945 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.732927 2574 scope.go:117] "RemoveContainer" containerID="53ce4fdff474c4441805e62d57f264fbb51d75f1dd64ff181fd18d75e9d8e190" Apr 24 14:35:22.733209 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:35:22.733188 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53ce4fdff474c4441805e62d57f264fbb51d75f1dd64ff181fd18d75e9d8e190\": container with ID starting with 53ce4fdff474c4441805e62d57f264fbb51d75f1dd64ff181fd18d75e9d8e190 not found: ID does not exist" containerID="53ce4fdff474c4441805e62d57f264fbb51d75f1dd64ff181fd18d75e9d8e190" Apr 24 14:35:22.733263 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.733216 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53ce4fdff474c4441805e62d57f264fbb51d75f1dd64ff181fd18d75e9d8e190"} err="failed to get container status \"53ce4fdff474c4441805e62d57f264fbb51d75f1dd64ff181fd18d75e9d8e190\": rpc error: code = NotFound desc = could not find 
container \"53ce4fdff474c4441805e62d57f264fbb51d75f1dd64ff181fd18d75e9d8e190\": container with ID starting with 53ce4fdff474c4441805e62d57f264fbb51d75f1dd64ff181fd18d75e9d8e190 not found: ID does not exist" Apr 24 14:35:22.744376 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.744347 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-577dc58474-plfqd"] Apr 24 14:35:22.748294 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.748274 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-577dc58474-plfqd"] Apr 24 14:35:22.939812 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.939773 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/540c364b-d84c-46e4-b95b-9f454a7072aa-proxy-tls\") pod \"switch-graph-96b39-559b467444-k8w82\" (UID: \"540c364b-d84c-46e4-b95b-9f454a7072aa\") " pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" Apr 24 14:35:22.942202 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:22.942179 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/540c364b-d84c-46e4-b95b-9f454a7072aa-proxy-tls\") pod \"switch-graph-96b39-559b467444-k8w82\" (UID: \"540c364b-d84c-46e4-b95b-9f454a7072aa\") " pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" Apr 24 14:35:23.225832 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:23.225800 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" Apr 24 14:35:23.345519 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:23.345443 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82"] Apr 24 14:35:23.348044 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:35:23.348015 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod540c364b_d84c_46e4_b95b_9f454a7072aa.slice/crio-d16dbb94b7252332d14215c5b1f919ed4cb29db0089748e1366e25cba926e89f WatchSource:0}: Error finding container d16dbb94b7252332d14215c5b1f919ed4cb29db0089748e1366e25cba926e89f: Status 404 returned error can't find the container with id d16dbb94b7252332d14215c5b1f919ed4cb29db0089748e1366e25cba926e89f Apr 24 14:35:23.362957 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:23.362933 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28aa4eb4-9a9f-4ff3-b891-d7349362a921" path="/var/lib/kubelet/pods/28aa4eb4-9a9f-4ff3-b891-d7349362a921/volumes" Apr 24 14:35:23.729556 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:23.729522 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" event={"ID":"540c364b-d84c-46e4-b95b-9f454a7072aa","Type":"ContainerStarted","Data":"599876595c226285b486e97ead1f623160f2d0c626c9dbbaa95f3425913f1c22"} Apr 24 14:35:23.729556 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:23.729557 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" event={"ID":"540c364b-d84c-46e4-b95b-9f454a7072aa","Type":"ContainerStarted","Data":"d16dbb94b7252332d14215c5b1f919ed4cb29db0089748e1366e25cba926e89f"} Apr 24 14:35:23.729799 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:23.729645 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" Apr 24 14:35:23.744739 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:23.744699 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" podStartSLOduration=1.7446857489999998 podStartE2EDuration="1.744685749s" podCreationTimestamp="2026-04-24 14:35:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:35:23.743391579 +0000 UTC m=+686.955242953" watchObservedRunningTime="2026-04-24 14:35:23.744685749 +0000 UTC m=+686.956537124" Apr 24 14:35:29.738810 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:35:29.738783 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" Apr 24 14:36:02.196325 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:02.196294 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr"] Apr 24 14:36:02.200900 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:02.200872 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" Apr 24 14:36:02.203007 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:02.202985 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-71827-kube-rbac-proxy-sar-config\"" Apr 24 14:36:02.203116 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:02.203043 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-71827-serving-cert\"" Apr 24 14:36:02.210057 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:02.210034 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr"] Apr 24 14:36:02.370891 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:02.370852 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/886c7da9-fb4d-422f-ab80-c6ed7359528c-openshift-service-ca-bundle\") pod \"sequence-graph-71827-6d595bf6fb-cdhrr\" (UID: \"886c7da9-fb4d-422f-ab80-c6ed7359528c\") " pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" Apr 24 14:36:02.370891 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:02.370902 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/886c7da9-fb4d-422f-ab80-c6ed7359528c-proxy-tls\") pod \"sequence-graph-71827-6d595bf6fb-cdhrr\" (UID: \"886c7da9-fb4d-422f-ab80-c6ed7359528c\") " pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" Apr 24 14:36:02.471912 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:02.471820 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/886c7da9-fb4d-422f-ab80-c6ed7359528c-openshift-service-ca-bundle\") pod 
\"sequence-graph-71827-6d595bf6fb-cdhrr\" (UID: \"886c7da9-fb4d-422f-ab80-c6ed7359528c\") " pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" Apr 24 14:36:02.471912 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:02.471864 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/886c7da9-fb4d-422f-ab80-c6ed7359528c-proxy-tls\") pod \"sequence-graph-71827-6d595bf6fb-cdhrr\" (UID: \"886c7da9-fb4d-422f-ab80-c6ed7359528c\") " pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" Apr 24 14:36:02.472488 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:02.472458 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/886c7da9-fb4d-422f-ab80-c6ed7359528c-openshift-service-ca-bundle\") pod \"sequence-graph-71827-6d595bf6fb-cdhrr\" (UID: \"886c7da9-fb4d-422f-ab80-c6ed7359528c\") " pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" Apr 24 14:36:02.474428 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:02.474391 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/886c7da9-fb4d-422f-ab80-c6ed7359528c-proxy-tls\") pod \"sequence-graph-71827-6d595bf6fb-cdhrr\" (UID: \"886c7da9-fb4d-422f-ab80-c6ed7359528c\") " pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" Apr 24 14:36:02.512084 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:02.512053 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" Apr 24 14:36:02.836661 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:02.836543 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr"] Apr 24 14:36:02.839506 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:36:02.839479 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod886c7da9_fb4d_422f_ab80_c6ed7359528c.slice/crio-44aa1dd647b7e3b8887663c1962df7e2842e2a156377cf258174de257bd0c0c4 WatchSource:0}: Error finding container 44aa1dd647b7e3b8887663c1962df7e2842e2a156377cf258174de257bd0c0c4: Status 404 returned error can't find the container with id 44aa1dd647b7e3b8887663c1962df7e2842e2a156377cf258174de257bd0c0c4 Apr 24 14:36:02.862327 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:02.862290 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" event={"ID":"886c7da9-fb4d-422f-ab80-c6ed7359528c","Type":"ContainerStarted","Data":"44aa1dd647b7e3b8887663c1962df7e2842e2a156377cf258174de257bd0c0c4"} Apr 24 14:36:03.866650 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:03.866610 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" event={"ID":"886c7da9-fb4d-422f-ab80-c6ed7359528c","Type":"ContainerStarted","Data":"e9af46313b4eaa2e5686d81c7e4503c78aa64edb67e453f6862fb74d108e86dc"} Apr 24 14:36:03.867134 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:03.866738 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" Apr 24 14:36:03.883071 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:03.883008 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" 
podStartSLOduration=1.8829914429999999 podStartE2EDuration="1.882991443s" podCreationTimestamp="2026-04-24 14:36:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:36:03.880938011 +0000 UTC m=+727.092789398" watchObservedRunningTime="2026-04-24 14:36:03.882991443 +0000 UTC m=+727.094842820" Apr 24 14:36:09.875089 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:36:09.875060 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" Apr 24 14:38:57.295708 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:38:57.295636 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 14:38:57.298645 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:38:57.298622 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 14:43:36.874099 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:43:36.874064 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82"] Apr 24 14:43:36.874630 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:43:36.874384 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" podUID="540c364b-d84c-46e4-b95b-9f454a7072aa" containerName="switch-graph-96b39" containerID="cri-o://599876595c226285b486e97ead1f623160f2d0c626c9dbbaa95f3425913f1c22" gracePeriod=30 Apr 24 14:43:39.737300 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:43:39.737257 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" podUID="540c364b-d84c-46e4-b95b-9f454a7072aa" 
containerName="switch-graph-96b39" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:43:44.737699 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:43:44.737661 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" podUID="540c364b-d84c-46e4-b95b-9f454a7072aa" containerName="switch-graph-96b39" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:43:49.737350 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:43:49.737307 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" podUID="540c364b-d84c-46e4-b95b-9f454a7072aa" containerName="switch-graph-96b39" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:43:49.737886 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:43:49.737434 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" Apr 24 14:43:54.737324 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:43:54.737273 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" podUID="540c364b-d84c-46e4-b95b-9f454a7072aa" containerName="switch-graph-96b39" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:43:57.320107 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:43:57.320080 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 14:43:57.324542 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:43:57.324515 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 14:43:59.737493 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:43:59.737457 2574 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" podUID="540c364b-d84c-46e4-b95b-9f454a7072aa" containerName="switch-graph-96b39" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:44:04.737909 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:04.737866 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" podUID="540c364b-d84c-46e4-b95b-9f454a7072aa" containerName="switch-graph-96b39" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:44:07.022908 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:07.022884 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" Apr 24 14:44:07.067558 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:07.067529 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/540c364b-d84c-46e4-b95b-9f454a7072aa-openshift-service-ca-bundle\") pod \"540c364b-d84c-46e4-b95b-9f454a7072aa\" (UID: \"540c364b-d84c-46e4-b95b-9f454a7072aa\") " Apr 24 14:44:07.067672 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:07.067586 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/540c364b-d84c-46e4-b95b-9f454a7072aa-proxy-tls\") pod \"540c364b-d84c-46e4-b95b-9f454a7072aa\" (UID: \"540c364b-d84c-46e4-b95b-9f454a7072aa\") " Apr 24 14:44:07.067862 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:07.067841 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/540c364b-d84c-46e4-b95b-9f454a7072aa-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "540c364b-d84c-46e4-b95b-9f454a7072aa" (UID: "540c364b-d84c-46e4-b95b-9f454a7072aa"). 
InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:44:07.069539 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:07.069516 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/540c364b-d84c-46e4-b95b-9f454a7072aa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "540c364b-d84c-46e4-b95b-9f454a7072aa" (UID: "540c364b-d84c-46e4-b95b-9f454a7072aa"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:44:07.168812 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:07.168758 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/540c364b-d84c-46e4-b95b-9f454a7072aa-proxy-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:44:07.168812 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:07.168780 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/540c364b-d84c-46e4-b95b-9f454a7072aa-openshift-service-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:44:07.478000 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:07.477908 2574 generic.go:358] "Generic (PLEG): container finished" podID="540c364b-d84c-46e4-b95b-9f454a7072aa" containerID="599876595c226285b486e97ead1f623160f2d0c626c9dbbaa95f3425913f1c22" exitCode=0 Apr 24 14:44:07.478000 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:07.477983 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" event={"ID":"540c364b-d84c-46e4-b95b-9f454a7072aa","Type":"ContainerDied","Data":"599876595c226285b486e97ead1f623160f2d0c626c9dbbaa95f3425913f1c22"} Apr 24 14:44:07.478217 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:07.478001 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" Apr 24 14:44:07.478217 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:07.478020 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82" event={"ID":"540c364b-d84c-46e4-b95b-9f454a7072aa","Type":"ContainerDied","Data":"d16dbb94b7252332d14215c5b1f919ed4cb29db0089748e1366e25cba926e89f"} Apr 24 14:44:07.478217 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:07.478042 2574 scope.go:117] "RemoveContainer" containerID="599876595c226285b486e97ead1f623160f2d0c626c9dbbaa95f3425913f1c22" Apr 24 14:44:07.486206 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:07.486188 2574 scope.go:117] "RemoveContainer" containerID="599876595c226285b486e97ead1f623160f2d0c626c9dbbaa95f3425913f1c22" Apr 24 14:44:07.486458 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:44:07.486436 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"599876595c226285b486e97ead1f623160f2d0c626c9dbbaa95f3425913f1c22\": container with ID starting with 599876595c226285b486e97ead1f623160f2d0c626c9dbbaa95f3425913f1c22 not found: ID does not exist" containerID="599876595c226285b486e97ead1f623160f2d0c626c9dbbaa95f3425913f1c22" Apr 24 14:44:07.486558 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:07.486464 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599876595c226285b486e97ead1f623160f2d0c626c9dbbaa95f3425913f1c22"} err="failed to get container status \"599876595c226285b486e97ead1f623160f2d0c626c9dbbaa95f3425913f1c22\": rpc error: code = NotFound desc = could not find container \"599876595c226285b486e97ead1f623160f2d0c626c9dbbaa95f3425913f1c22\": container with ID starting with 599876595c226285b486e97ead1f623160f2d0c626c9dbbaa95f3425913f1c22 not found: ID does not exist" Apr 24 14:44:07.492426 ip-10-0-131-216 kubenswrapper[2574]: I0424 
14:44:07.492380 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82"] Apr 24 14:44:07.495727 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:07.495708 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-96b39-559b467444-k8w82"] Apr 24 14:44:09.359105 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:09.359069 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="540c364b-d84c-46e4-b95b-9f454a7072aa" path="/var/lib/kubelet/pods/540c364b-d84c-46e4-b95b-9f454a7072aa/volumes" Apr 24 14:44:16.891128 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:16.891092 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr"] Apr 24 14:44:16.891605 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:16.891410 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" podUID="886c7da9-fb4d-422f-ab80-c6ed7359528c" containerName="sequence-graph-71827" containerID="cri-o://e9af46313b4eaa2e5686d81c7e4503c78aa64edb67e453f6862fb74d108e86dc" gracePeriod=30 Apr 24 14:44:19.873737 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:19.873696 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" podUID="886c7da9-fb4d-422f-ab80-c6ed7359528c" containerName="sequence-graph-71827" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:44:24.874478 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:24.874432 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" podUID="886c7da9-fb4d-422f-ab80-c6ed7359528c" containerName="sequence-graph-71827" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:44:29.874360 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:29.874318 
2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" podUID="886c7da9-fb4d-422f-ab80-c6ed7359528c" containerName="sequence-graph-71827" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:44:29.874757 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:29.874445 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" Apr 24 14:44:34.873948 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:34.873911 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" podUID="886c7da9-fb4d-422f-ab80-c6ed7359528c" containerName="sequence-graph-71827" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:44:39.874180 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:39.874133 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" podUID="886c7da9-fb4d-422f-ab80-c6ed7359528c" containerName="sequence-graph-71827" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:44:44.873594 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:44.873554 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" podUID="886c7da9-fb4d-422f-ab80-c6ed7359528c" containerName="sequence-graph-71827" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:44:47.079769 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.079736 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8"] Apr 24 14:44:47.080147 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.080100 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="540c364b-d84c-46e4-b95b-9f454a7072aa" containerName="switch-graph-96b39" 
Apr 24 14:44:47.080147 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.080110 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="540c364b-d84c-46e4-b95b-9f454a7072aa" containerName="switch-graph-96b39" Apr 24 14:44:47.080220 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.080163 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="540c364b-d84c-46e4-b95b-9f454a7072aa" containerName="switch-graph-96b39" Apr 24 14:44:47.083328 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.083309 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" Apr 24 14:44:47.085354 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.085329 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-0cfc3-serving-cert\"" Apr 24 14:44:47.085461 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.085358 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-0cfc3-kube-rbac-proxy-sar-config\"" Apr 24 14:44:47.090056 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.090032 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8"] Apr 24 14:44:47.190844 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.190763 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca5d7317-a0a4-49fd-80e6-3ba1a237fc41-openshift-service-ca-bundle\") pod \"ensemble-graph-0cfc3-6c948f8f4d-9c9p8\" (UID: \"ca5d7317-a0a4-49fd-80e6-3ba1a237fc41\") " pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" Apr 24 14:44:47.190844 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.190806 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/ca5d7317-a0a4-49fd-80e6-3ba1a237fc41-proxy-tls\") pod \"ensemble-graph-0cfc3-6c948f8f4d-9c9p8\" (UID: \"ca5d7317-a0a4-49fd-80e6-3ba1a237fc41\") " pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" Apr 24 14:44:47.291836 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.291802 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca5d7317-a0a4-49fd-80e6-3ba1a237fc41-openshift-service-ca-bundle\") pod \"ensemble-graph-0cfc3-6c948f8f4d-9c9p8\" (UID: \"ca5d7317-a0a4-49fd-80e6-3ba1a237fc41\") " pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" Apr 24 14:44:47.292076 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.291858 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca5d7317-a0a4-49fd-80e6-3ba1a237fc41-proxy-tls\") pod \"ensemble-graph-0cfc3-6c948f8f4d-9c9p8\" (UID: \"ca5d7317-a0a4-49fd-80e6-3ba1a237fc41\") " pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" Apr 24 14:44:47.292516 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.292493 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca5d7317-a0a4-49fd-80e6-3ba1a237fc41-openshift-service-ca-bundle\") pod \"ensemble-graph-0cfc3-6c948f8f4d-9c9p8\" (UID: \"ca5d7317-a0a4-49fd-80e6-3ba1a237fc41\") " pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" Apr 24 14:44:47.294194 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.294176 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca5d7317-a0a4-49fd-80e6-3ba1a237fc41-proxy-tls\") pod \"ensemble-graph-0cfc3-6c948f8f4d-9c9p8\" (UID: \"ca5d7317-a0a4-49fd-80e6-3ba1a237fc41\") " pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" Apr 
24 14:44:47.393934 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.393901 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" Apr 24 14:44:47.539709 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.539683 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8"] Apr 24 14:44:47.542065 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:44:47.542038 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca5d7317_a0a4_49fd_80e6_3ba1a237fc41.slice/crio-b077d9353c5dd8e236f977294f403b27a91fdff40913b06e83c6af8f86099f75 WatchSource:0}: Error finding container b077d9353c5dd8e236f977294f403b27a91fdff40913b06e83c6af8f86099f75: Status 404 returned error can't find the container with id b077d9353c5dd8e236f977294f403b27a91fdff40913b06e83c6af8f86099f75 Apr 24 14:44:47.543820 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.543803 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:44:47.551121 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.551105 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" Apr 24 14:44:47.612429 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.612375 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" event={"ID":"ca5d7317-a0a4-49fd-80e6-3ba1a237fc41","Type":"ContainerStarted","Data":"b077d9353c5dd8e236f977294f403b27a91fdff40913b06e83c6af8f86099f75"} Apr 24 14:44:47.613492 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.613466 2574 generic.go:358] "Generic (PLEG): container finished" podID="886c7da9-fb4d-422f-ab80-c6ed7359528c" containerID="e9af46313b4eaa2e5686d81c7e4503c78aa64edb67e453f6862fb74d108e86dc" exitCode=0 Apr 24 14:44:47.613597 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.613582 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" event={"ID":"886c7da9-fb4d-422f-ab80-c6ed7359528c","Type":"ContainerDied","Data":"e9af46313b4eaa2e5686d81c7e4503c78aa64edb67e453f6862fb74d108e86dc"} Apr 24 14:44:47.613664 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.613592 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" Apr 24 14:44:47.613664 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.613616 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr" event={"ID":"886c7da9-fb4d-422f-ab80-c6ed7359528c","Type":"ContainerDied","Data":"44aa1dd647b7e3b8887663c1962df7e2842e2a156377cf258174de257bd0c0c4"} Apr 24 14:44:47.613664 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.613637 2574 scope.go:117] "RemoveContainer" containerID="e9af46313b4eaa2e5686d81c7e4503c78aa64edb67e453f6862fb74d108e86dc" Apr 24 14:44:47.623124 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.623097 2574 scope.go:117] "RemoveContainer" containerID="e9af46313b4eaa2e5686d81c7e4503c78aa64edb67e453f6862fb74d108e86dc" Apr 24 14:44:47.623361 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:44:47.623338 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9af46313b4eaa2e5686d81c7e4503c78aa64edb67e453f6862fb74d108e86dc\": container with ID starting with e9af46313b4eaa2e5686d81c7e4503c78aa64edb67e453f6862fb74d108e86dc not found: ID does not exist" containerID="e9af46313b4eaa2e5686d81c7e4503c78aa64edb67e453f6862fb74d108e86dc" Apr 24 14:44:47.623468 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.623370 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9af46313b4eaa2e5686d81c7e4503c78aa64edb67e453f6862fb74d108e86dc"} err="failed to get container status \"e9af46313b4eaa2e5686d81c7e4503c78aa64edb67e453f6862fb74d108e86dc\": rpc error: code = NotFound desc = could not find container \"e9af46313b4eaa2e5686d81c7e4503c78aa64edb67e453f6862fb74d108e86dc\": container with ID starting with e9af46313b4eaa2e5686d81c7e4503c78aa64edb67e453f6862fb74d108e86dc not found: ID does not exist" Apr 24 14:44:47.694560 ip-10-0-131-216 kubenswrapper[2574]: I0424 
14:44:47.694519 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/886c7da9-fb4d-422f-ab80-c6ed7359528c-openshift-service-ca-bundle\") pod \"886c7da9-fb4d-422f-ab80-c6ed7359528c\" (UID: \"886c7da9-fb4d-422f-ab80-c6ed7359528c\") " Apr 24 14:44:47.694736 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.694573 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/886c7da9-fb4d-422f-ab80-c6ed7359528c-proxy-tls\") pod \"886c7da9-fb4d-422f-ab80-c6ed7359528c\" (UID: \"886c7da9-fb4d-422f-ab80-c6ed7359528c\") " Apr 24 14:44:47.694955 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.694927 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/886c7da9-fb4d-422f-ab80-c6ed7359528c-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "886c7da9-fb4d-422f-ab80-c6ed7359528c" (UID: "886c7da9-fb4d-422f-ab80-c6ed7359528c"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:44:47.696651 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.696602 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886c7da9-fb4d-422f-ab80-c6ed7359528c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "886c7da9-fb4d-422f-ab80-c6ed7359528c" (UID: "886c7da9-fb4d-422f-ab80-c6ed7359528c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:44:47.795287 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.795248 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/886c7da9-fb4d-422f-ab80-c6ed7359528c-openshift-service-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:44:47.795287 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.795275 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/886c7da9-fb4d-422f-ab80-c6ed7359528c-proxy-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:44:47.933768 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.933742 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr"] Apr 24 14:44:47.936837 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:47.936811 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-71827-6d595bf6fb-cdhrr"] Apr 24 14:44:48.618381 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:48.618333 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" event={"ID":"ca5d7317-a0a4-49fd-80e6-3ba1a237fc41","Type":"ContainerStarted","Data":"ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb"} Apr 24 14:44:48.633951 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:48.633890 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" podStartSLOduration=1.633875319 podStartE2EDuration="1.633875319s" podCreationTimestamp="2026-04-24 14:44:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:44:48.632554533 +0000 UTC m=+1251.844405908" 
watchObservedRunningTime="2026-04-24 14:44:48.633875319 +0000 UTC m=+1251.845726694" Apr 24 14:44:49.359127 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:49.359093 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="886c7da9-fb4d-422f-ab80-c6ed7359528c" path="/var/lib/kubelet/pods/886c7da9-fb4d-422f-ab80-c6ed7359528c/volumes" Apr 24 14:44:49.623442 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:49.623347 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" Apr 24 14:44:55.637120 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:55.637092 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" Apr 24 14:44:57.138019 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:57.137978 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8"] Apr 24 14:44:57.138502 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:44:57.138214 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" podUID="ca5d7317-a0a4-49fd-80e6-3ba1a237fc41" containerName="ensemble-graph-0cfc3" containerID="cri-o://ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb" gracePeriod=30 Apr 24 14:45:00.635194 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:00.635157 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" podUID="ca5d7317-a0a4-49fd-80e6-3ba1a237fc41" containerName="ensemble-graph-0cfc3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:45:05.634171 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:05.634131 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" 
podUID="ca5d7317-a0a4-49fd-80e6-3ba1a237fc41" containerName="ensemble-graph-0cfc3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:45:10.634493 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:10.634447 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" podUID="ca5d7317-a0a4-49fd-80e6-3ba1a237fc41" containerName="ensemble-graph-0cfc3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:45:10.634864 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:10.634590 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" Apr 24 14:45:15.634812 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:15.634778 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" podUID="ca5d7317-a0a4-49fd-80e6-3ba1a237fc41" containerName="ensemble-graph-0cfc3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:45:20.634990 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:20.634948 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" podUID="ca5d7317-a0a4-49fd-80e6-3ba1a237fc41" containerName="ensemble-graph-0cfc3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:45:25.634582 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:25.634548 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" podUID="ca5d7317-a0a4-49fd-80e6-3ba1a237fc41" containerName="ensemble-graph-0cfc3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:45:27.052112 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.052075 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh"] Apr 
24 14:45:27.052571 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.052476 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="886c7da9-fb4d-422f-ab80-c6ed7359528c" containerName="sequence-graph-71827" Apr 24 14:45:27.052571 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.052488 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="886c7da9-fb4d-422f-ab80-c6ed7359528c" containerName="sequence-graph-71827" Apr 24 14:45:27.052571 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.052548 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="886c7da9-fb4d-422f-ab80-c6ed7359528c" containerName="sequence-graph-71827" Apr 24 14:45:27.055358 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.055342 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" Apr 24 14:45:27.057414 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.057378 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-5e366-kube-rbac-proxy-sar-config\"" Apr 24 14:45:27.057523 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.057439 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-5e366-serving-cert\"" Apr 24 14:45:27.063512 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.063492 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh"] Apr 24 14:45:27.108909 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.108880 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4a2af3b-8106-4f21-8808-342a33e535f6-openshift-service-ca-bundle\") pod \"sequence-graph-5e366-554dbc9dc6-g4svh\" (UID: \"b4a2af3b-8106-4f21-8808-342a33e535f6\") " 
pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" Apr 24 14:45:27.109072 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.108928 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4a2af3b-8106-4f21-8808-342a33e535f6-proxy-tls\") pod \"sequence-graph-5e366-554dbc9dc6-g4svh\" (UID: \"b4a2af3b-8106-4f21-8808-342a33e535f6\") " pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" Apr 24 14:45:27.165849 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:45:27.165816 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca5d7317_a0a4_49fd_80e6_3ba1a237fc41.slice/crio-conmon-ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca5d7317_a0a4_49fd_80e6_3ba1a237fc41.slice/crio-ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb.scope\": RecentStats: unable to find data in memory cache]" Apr 24 14:45:27.166005 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:45:27.165846 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca5d7317_a0a4_49fd_80e6_3ba1a237fc41.slice/crio-ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb.scope\": RecentStats: unable to find data in memory cache]" Apr 24 14:45:27.209992 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.209961 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4a2af3b-8106-4f21-8808-342a33e535f6-openshift-service-ca-bundle\") pod \"sequence-graph-5e366-554dbc9dc6-g4svh\" (UID: 
\"b4a2af3b-8106-4f21-8808-342a33e535f6\") " pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" Apr 24 14:45:27.210137 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.210020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4a2af3b-8106-4f21-8808-342a33e535f6-proxy-tls\") pod \"sequence-graph-5e366-554dbc9dc6-g4svh\" (UID: \"b4a2af3b-8106-4f21-8808-342a33e535f6\") " pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" Apr 24 14:45:27.210223 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:45:27.210203 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-5e366-serving-cert: secret "sequence-graph-5e366-serving-cert" not found Apr 24 14:45:27.210295 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:45:27.210283 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4a2af3b-8106-4f21-8808-342a33e535f6-proxy-tls podName:b4a2af3b-8106-4f21-8808-342a33e535f6 nodeName:}" failed. No retries permitted until 2026-04-24 14:45:27.710259825 +0000 UTC m=+1290.922111189 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b4a2af3b-8106-4f21-8808-342a33e535f6-proxy-tls") pod "sequence-graph-5e366-554dbc9dc6-g4svh" (UID: "b4a2af3b-8106-4f21-8808-342a33e535f6") : secret "sequence-graph-5e366-serving-cert" not found Apr 24 14:45:27.210668 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.210648 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4a2af3b-8106-4f21-8808-342a33e535f6-openshift-service-ca-bundle\") pod \"sequence-graph-5e366-554dbc9dc6-g4svh\" (UID: \"b4a2af3b-8106-4f21-8808-342a33e535f6\") " pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" Apr 24 14:45:27.287253 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.287227 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" Apr 24 14:45:27.411197 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.411168 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca5d7317-a0a4-49fd-80e6-3ba1a237fc41-proxy-tls\") pod \"ca5d7317-a0a4-49fd-80e6-3ba1a237fc41\" (UID: \"ca5d7317-a0a4-49fd-80e6-3ba1a237fc41\") " Apr 24 14:45:27.411368 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.411272 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca5d7317-a0a4-49fd-80e6-3ba1a237fc41-openshift-service-ca-bundle\") pod \"ca5d7317-a0a4-49fd-80e6-3ba1a237fc41\" (UID: \"ca5d7317-a0a4-49fd-80e6-3ba1a237fc41\") " Apr 24 14:45:27.411661 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.411641 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca5d7317-a0a4-49fd-80e6-3ba1a237fc41-openshift-service-ca-bundle" (OuterVolumeSpecName: 
"openshift-service-ca-bundle") pod "ca5d7317-a0a4-49fd-80e6-3ba1a237fc41" (UID: "ca5d7317-a0a4-49fd-80e6-3ba1a237fc41"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:45:27.413253 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.413224 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5d7317-a0a4-49fd-80e6-3ba1a237fc41-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ca5d7317-a0a4-49fd-80e6-3ba1a237fc41" (UID: "ca5d7317-a0a4-49fd-80e6-3ba1a237fc41"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:45:27.511998 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.511968 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca5d7317-a0a4-49fd-80e6-3ba1a237fc41-proxy-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:45:27.511998 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.511999 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca5d7317-a0a4-49fd-80e6-3ba1a237fc41-openshift-service-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:45:27.713294 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.713190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4a2af3b-8106-4f21-8808-342a33e535f6-proxy-tls\") pod \"sequence-graph-5e366-554dbc9dc6-g4svh\" (UID: \"b4a2af3b-8106-4f21-8808-342a33e535f6\") " pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" Apr 24 14:45:27.715768 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.715745 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4a2af3b-8106-4f21-8808-342a33e535f6-proxy-tls\") pod 
\"sequence-graph-5e366-554dbc9dc6-g4svh\" (UID: \"b4a2af3b-8106-4f21-8808-342a33e535f6\") " pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" Apr 24 14:45:27.756454 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.756420 2574 generic.go:358] "Generic (PLEG): container finished" podID="ca5d7317-a0a4-49fd-80e6-3ba1a237fc41" containerID="ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb" exitCode=0 Apr 24 14:45:27.756600 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.756466 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" event={"ID":"ca5d7317-a0a4-49fd-80e6-3ba1a237fc41","Type":"ContainerDied","Data":"ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb"} Apr 24 14:45:27.756600 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.756488 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" event={"ID":"ca5d7317-a0a4-49fd-80e6-3ba1a237fc41","Type":"ContainerDied","Data":"b077d9353c5dd8e236f977294f403b27a91fdff40913b06e83c6af8f86099f75"} Apr 24 14:45:27.756600 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.756487 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8" Apr 24 14:45:27.756600 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.756508 2574 scope.go:117] "RemoveContainer" containerID="ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb" Apr 24 14:45:27.764430 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.764386 2574 scope.go:117] "RemoveContainer" containerID="ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb" Apr 24 14:45:27.764739 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:45:27.764711 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb\": container with ID starting with ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb not found: ID does not exist" containerID="ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb" Apr 24 14:45:27.764819 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.764747 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb"} err="failed to get container status \"ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb\": rpc error: code = NotFound desc = could not find container \"ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb\": container with ID starting with ea7b6d22e7d52004ef8d64f72274fc726d58be6ef48690de6368abdfd48568fb not found: ID does not exist" Apr 24 14:45:27.776000 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.775977 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8"] Apr 24 14:45:27.779177 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.779154 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0cfc3-6c948f8f4d-9c9p8"] Apr 24 
14:45:27.967113 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:27.967029 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" Apr 24 14:45:28.085377 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:28.085345 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh"] Apr 24 14:45:28.088261 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:45:28.088228 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4a2af3b_8106_4f21_8808_342a33e535f6.slice/crio-88dfa9533fc237ddbc59f38ccd09dba6fe02ef10d17c9e4519dfc1c11d848c4f WatchSource:0}: Error finding container 88dfa9533fc237ddbc59f38ccd09dba6fe02ef10d17c9e4519dfc1c11d848c4f: Status 404 returned error can't find the container with id 88dfa9533fc237ddbc59f38ccd09dba6fe02ef10d17c9e4519dfc1c11d848c4f Apr 24 14:45:28.761655 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:28.761613 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" event={"ID":"b4a2af3b-8106-4f21-8808-342a33e535f6","Type":"ContainerStarted","Data":"1e618dac05b8c27252a4e5d090b87e1270ffe41e696bfb23a874d33504b0d282"} Apr 24 14:45:28.761655 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:28.761657 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" event={"ID":"b4a2af3b-8106-4f21-8808-342a33e535f6","Type":"ContainerStarted","Data":"88dfa9533fc237ddbc59f38ccd09dba6fe02ef10d17c9e4519dfc1c11d848c4f"} Apr 24 14:45:28.761912 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:28.761761 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" Apr 24 14:45:28.777336 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:28.777295 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" podStartSLOduration=1.777284181 podStartE2EDuration="1.777284181s" podCreationTimestamp="2026-04-24 14:45:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:45:28.774811786 +0000 UTC m=+1291.986663161" watchObservedRunningTime="2026-04-24 14:45:28.777284181 +0000 UTC m=+1291.989135599" Apr 24 14:45:29.359448 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:29.359415 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca5d7317-a0a4-49fd-80e6-3ba1a237fc41" path="/var/lib/kubelet/pods/ca5d7317-a0a4-49fd-80e6-3ba1a237fc41/volumes" Apr 24 14:45:34.771832 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:34.771794 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" Apr 24 14:45:37.118852 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:37.118816 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh"] Apr 24 14:45:37.119258 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:37.119010 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" podUID="b4a2af3b-8106-4f21-8808-342a33e535f6" containerName="sequence-graph-5e366" containerID="cri-o://1e618dac05b8c27252a4e5d090b87e1270ffe41e696bfb23a874d33504b0d282" gracePeriod=30 Apr 24 14:45:39.770240 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:39.770203 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" podUID="b4a2af3b-8106-4f21-8808-342a33e535f6" containerName="sequence-graph-5e366" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:45:44.769641 
ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:44.769603 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" podUID="b4a2af3b-8106-4f21-8808-342a33e535f6" containerName="sequence-graph-5e366" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:45:49.769702 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:49.769664 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" podUID="b4a2af3b-8106-4f21-8808-342a33e535f6" containerName="sequence-graph-5e366" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:45:49.770154 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:49.769775 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" Apr 24 14:45:54.770424 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:54.770369 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" podUID="b4a2af3b-8106-4f21-8808-342a33e535f6" containerName="sequence-graph-5e366" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:45:59.769863 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:45:59.769819 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" podUID="b4a2af3b-8106-4f21-8808-342a33e535f6" containerName="sequence-graph-5e366" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:46:04.769572 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:04.769532 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" podUID="b4a2af3b-8106-4f21-8808-342a33e535f6" containerName="sequence-graph-5e366" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:46:07.303688 
ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.303667 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" Apr 24 14:46:07.327541 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.327496 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4a2af3b-8106-4f21-8808-342a33e535f6-proxy-tls\") pod \"b4a2af3b-8106-4f21-8808-342a33e535f6\" (UID: \"b4a2af3b-8106-4f21-8808-342a33e535f6\") " Apr 24 14:46:07.327675 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.327627 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4a2af3b-8106-4f21-8808-342a33e535f6-openshift-service-ca-bundle\") pod \"b4a2af3b-8106-4f21-8808-342a33e535f6\" (UID: \"b4a2af3b-8106-4f21-8808-342a33e535f6\") " Apr 24 14:46:07.328029 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.328005 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4a2af3b-8106-4f21-8808-342a33e535f6-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b4a2af3b-8106-4f21-8808-342a33e535f6" (UID: "b4a2af3b-8106-4f21-8808-342a33e535f6"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:46:07.329609 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.329587 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a2af3b-8106-4f21-8808-342a33e535f6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b4a2af3b-8106-4f21-8808-342a33e535f6" (UID: "b4a2af3b-8106-4f21-8808-342a33e535f6"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:46:07.359604 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.359576 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv"] Apr 24 14:46:07.359919 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.359906 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4a2af3b-8106-4f21-8808-342a33e535f6" containerName="sequence-graph-5e366" Apr 24 14:46:07.359919 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.359919 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a2af3b-8106-4f21-8808-342a33e535f6" containerName="sequence-graph-5e366" Apr 24 14:46:07.360006 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.359933 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca5d7317-a0a4-49fd-80e6-3ba1a237fc41" containerName="ensemble-graph-0cfc3" Apr 24 14:46:07.360006 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.359940 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5d7317-a0a4-49fd-80e6-3ba1a237fc41" containerName="ensemble-graph-0cfc3" Apr 24 14:46:07.360006 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.360002 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca5d7317-a0a4-49fd-80e6-3ba1a237fc41" containerName="ensemble-graph-0cfc3" Apr 24 14:46:07.360098 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.360013 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4a2af3b-8106-4f21-8808-342a33e535f6" containerName="sequence-graph-5e366" Apr 24 14:46:07.363046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.363031 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" Apr 24 14:46:07.365031 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.365007 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-47293-kube-rbac-proxy-sar-config\"" Apr 24 14:46:07.365031 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.365007 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-47293-serving-cert\"" Apr 24 14:46:07.369016 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.368999 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv"] Apr 24 14:46:07.428204 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.428113 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01c955f6-67b9-443e-b41d-f525cf09c69d-openshift-service-ca-bundle\") pod \"ensemble-graph-47293-5d9c7b45db-4knmv\" (UID: \"01c955f6-67b9-443e-b41d-f525cf09c69d\") " pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" Apr 24 14:46:07.428204 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.428167 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01c955f6-67b9-443e-b41d-f525cf09c69d-proxy-tls\") pod \"ensemble-graph-47293-5d9c7b45db-4knmv\" (UID: \"01c955f6-67b9-443e-b41d-f525cf09c69d\") " pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" Apr 24 14:46:07.428431 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.428298 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4a2af3b-8106-4f21-8808-342a33e535f6-openshift-service-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" 
DevicePath \"\"" Apr 24 14:46:07.428431 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.428326 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4a2af3b-8106-4f21-8808-342a33e535f6-proxy-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:46:07.529142 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.529116 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01c955f6-67b9-443e-b41d-f525cf09c69d-openshift-service-ca-bundle\") pod \"ensemble-graph-47293-5d9c7b45db-4knmv\" (UID: \"01c955f6-67b9-443e-b41d-f525cf09c69d\") " pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" Apr 24 14:46:07.529320 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.529147 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01c955f6-67b9-443e-b41d-f525cf09c69d-proxy-tls\") pod \"ensemble-graph-47293-5d9c7b45db-4knmv\" (UID: \"01c955f6-67b9-443e-b41d-f525cf09c69d\") " pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" Apr 24 14:46:07.529320 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:46:07.529258 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-47293-serving-cert: secret "ensemble-graph-47293-serving-cert" not found Apr 24 14:46:07.529320 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:46:07.529309 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01c955f6-67b9-443e-b41d-f525cf09c69d-proxy-tls podName:01c955f6-67b9-443e-b41d-f525cf09c69d nodeName:}" failed. No retries permitted until 2026-04-24 14:46:08.029293676 +0000 UTC m=+1331.241145033 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/01c955f6-67b9-443e-b41d-f525cf09c69d-proxy-tls") pod "ensemble-graph-47293-5d9c7b45db-4knmv" (UID: "01c955f6-67b9-443e-b41d-f525cf09c69d") : secret "ensemble-graph-47293-serving-cert" not found Apr 24 14:46:07.529808 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.529785 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01c955f6-67b9-443e-b41d-f525cf09c69d-openshift-service-ca-bundle\") pod \"ensemble-graph-47293-5d9c7b45db-4knmv\" (UID: \"01c955f6-67b9-443e-b41d-f525cf09c69d\") " pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" Apr 24 14:46:07.899509 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.899471 2574 generic.go:358] "Generic (PLEG): container finished" podID="b4a2af3b-8106-4f21-8808-342a33e535f6" containerID="1e618dac05b8c27252a4e5d090b87e1270ffe41e696bfb23a874d33504b0d282" exitCode=137 Apr 24 14:46:07.899700 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.899534 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" event={"ID":"b4a2af3b-8106-4f21-8808-342a33e535f6","Type":"ContainerDied","Data":"1e618dac05b8c27252a4e5d090b87e1270ffe41e696bfb23a874d33504b0d282"} Apr 24 14:46:07.899700 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.899540 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" Apr 24 14:46:07.899700 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.899567 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh" event={"ID":"b4a2af3b-8106-4f21-8808-342a33e535f6","Type":"ContainerDied","Data":"88dfa9533fc237ddbc59f38ccd09dba6fe02ef10d17c9e4519dfc1c11d848c4f"} Apr 24 14:46:07.899700 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.899587 2574 scope.go:117] "RemoveContainer" containerID="1e618dac05b8c27252a4e5d090b87e1270ffe41e696bfb23a874d33504b0d282" Apr 24 14:46:07.908106 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.908086 2574 scope.go:117] "RemoveContainer" containerID="1e618dac05b8c27252a4e5d090b87e1270ffe41e696bfb23a874d33504b0d282" Apr 24 14:46:07.908352 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:46:07.908330 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e618dac05b8c27252a4e5d090b87e1270ffe41e696bfb23a874d33504b0d282\": container with ID starting with 1e618dac05b8c27252a4e5d090b87e1270ffe41e696bfb23a874d33504b0d282 not found: ID does not exist" containerID="1e618dac05b8c27252a4e5d090b87e1270ffe41e696bfb23a874d33504b0d282" Apr 24 14:46:07.908473 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.908361 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e618dac05b8c27252a4e5d090b87e1270ffe41e696bfb23a874d33504b0d282"} err="failed to get container status \"1e618dac05b8c27252a4e5d090b87e1270ffe41e696bfb23a874d33504b0d282\": rpc error: code = NotFound desc = could not find container \"1e618dac05b8c27252a4e5d090b87e1270ffe41e696bfb23a874d33504b0d282\": container with ID starting with 1e618dac05b8c27252a4e5d090b87e1270ffe41e696bfb23a874d33504b0d282 not found: ID does not exist" Apr 24 14:46:07.915362 ip-10-0-131-216 kubenswrapper[2574]: I0424 
14:46:07.915318 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh"]
Apr 24 14:46:07.916916 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:07.916896 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-5e366-554dbc9dc6-g4svh"]
Apr 24 14:46:08.033269 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:08.033237 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01c955f6-67b9-443e-b41d-f525cf09c69d-proxy-tls\") pod \"ensemble-graph-47293-5d9c7b45db-4knmv\" (UID: \"01c955f6-67b9-443e-b41d-f525cf09c69d\") " pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv"
Apr 24 14:46:08.035751 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:08.035727 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01c955f6-67b9-443e-b41d-f525cf09c69d-proxy-tls\") pod \"ensemble-graph-47293-5d9c7b45db-4knmv\" (UID: \"01c955f6-67b9-443e-b41d-f525cf09c69d\") " pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv"
Apr 24 14:46:08.275451 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:08.275330 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv"
Apr 24 14:46:08.600487 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:08.600464 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv"]
Apr 24 14:46:08.602981 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:46:08.602950 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01c955f6_67b9_443e_b41d_f525cf09c69d.slice/crio-84936c200415d1026d69bdee5c1ad26308b6951a92eee8f7656908c8ed38a5d9 WatchSource:0}: Error finding container 84936c200415d1026d69bdee5c1ad26308b6951a92eee8f7656908c8ed38a5d9: Status 404 returned error can't find the container with id 84936c200415d1026d69bdee5c1ad26308b6951a92eee8f7656908c8ed38a5d9
Apr 24 14:46:08.904659 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:08.904562 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" event={"ID":"01c955f6-67b9-443e-b41d-f525cf09c69d","Type":"ContainerStarted","Data":"e5c3eee5cec300d7ccf0de08b42d6365728e885d155d10fc1d527df3304f5614"}
Apr 24 14:46:08.904659 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:08.904601 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" event={"ID":"01c955f6-67b9-443e-b41d-f525cf09c69d","Type":"ContainerStarted","Data":"84936c200415d1026d69bdee5c1ad26308b6951a92eee8f7656908c8ed38a5d9"}
Apr 24 14:46:08.904849 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:08.904736 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv"
Apr 24 14:46:08.919935 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:08.919892 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" podStartSLOduration=1.919879811 podStartE2EDuration="1.919879811s" podCreationTimestamp="2026-04-24 14:46:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:46:08.918129183 +0000 UTC m=+1332.129980558" watchObservedRunningTime="2026-04-24 14:46:08.919879811 +0000 UTC m=+1332.131731186"
Apr 24 14:46:09.362581 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:09.360586 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a2af3b-8106-4f21-8808-342a33e535f6" path="/var/lib/kubelet/pods/b4a2af3b-8106-4f21-8808-342a33e535f6/volumes"
Apr 24 14:46:14.912658 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:14.912629 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv"
Apr 24 14:46:47.327007 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:47.326971 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"]
Apr 24 14:46:47.330354 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:47.330337 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"
Apr 24 14:46:47.332072 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:47.332050 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-d5b84-serving-cert\""
Apr 24 14:46:47.332177 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:47.332131 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-d5b84-kube-rbac-proxy-sar-config\""
Apr 24 14:46:47.339457 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:47.339434 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"]
Apr 24 14:46:47.474355 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:47.474326 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/929d313a-04fb-4610-b558-5000fa032769-proxy-tls\") pod \"sequence-graph-d5b84-69cb89b9ff-82nhw\" (UID: \"929d313a-04fb-4610-b558-5000fa032769\") " pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"
Apr 24 14:46:47.474544 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:47.474374 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/929d313a-04fb-4610-b558-5000fa032769-openshift-service-ca-bundle\") pod \"sequence-graph-d5b84-69cb89b9ff-82nhw\" (UID: \"929d313a-04fb-4610-b558-5000fa032769\") " pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"
Apr 24 14:46:47.574887 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:47.574858 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/929d313a-04fb-4610-b558-5000fa032769-proxy-tls\") pod \"sequence-graph-d5b84-69cb89b9ff-82nhw\" (UID: \"929d313a-04fb-4610-b558-5000fa032769\") " pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"
Apr 24 14:46:47.575069 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:47.574910 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/929d313a-04fb-4610-b558-5000fa032769-openshift-service-ca-bundle\") pod \"sequence-graph-d5b84-69cb89b9ff-82nhw\" (UID: \"929d313a-04fb-4610-b558-5000fa032769\") " pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"
Apr 24 14:46:47.575608 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:47.575575 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/929d313a-04fb-4610-b558-5000fa032769-openshift-service-ca-bundle\") pod \"sequence-graph-d5b84-69cb89b9ff-82nhw\" (UID: \"929d313a-04fb-4610-b558-5000fa032769\") " pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"
Apr 24 14:46:47.577157 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:47.577109 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/929d313a-04fb-4610-b558-5000fa032769-proxy-tls\") pod \"sequence-graph-d5b84-69cb89b9ff-82nhw\" (UID: \"929d313a-04fb-4610-b558-5000fa032769\") " pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"
Apr 24 14:46:47.640706 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:47.640675 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"
Apr 24 14:46:47.974570 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:47.974492 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"]
Apr 24 14:46:47.977823 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:46:47.977798 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod929d313a_04fb_4610_b558_5000fa032769.slice/crio-5aafdaf453703810be47ba7d3672fbeba1cf5efea7c09630283b068b8161e4d9 WatchSource:0}: Error finding container 5aafdaf453703810be47ba7d3672fbeba1cf5efea7c09630283b068b8161e4d9: Status 404 returned error can't find the container with id 5aafdaf453703810be47ba7d3672fbeba1cf5efea7c09630283b068b8161e4d9
Apr 24 14:46:48.037716 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:48.037690 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw" event={"ID":"929d313a-04fb-4610-b558-5000fa032769","Type":"ContainerStarted","Data":"5aafdaf453703810be47ba7d3672fbeba1cf5efea7c09630283b068b8161e4d9"}
Apr 24 14:46:49.042764 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:49.042730 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw" event={"ID":"929d313a-04fb-4610-b558-5000fa032769","Type":"ContainerStarted","Data":"7408913db075ab69a68e8af03b8fc42de7b096c4b51548187c07482f5fd31eb3"}
Apr 24 14:46:49.043221 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:49.042853 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"
Apr 24 14:46:49.058432 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:49.058373 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw" podStartSLOduration=2.058361829 podStartE2EDuration="2.058361829s" podCreationTimestamp="2026-04-24 14:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:46:49.056342108 +0000 UTC m=+1372.268193483" watchObservedRunningTime="2026-04-24 14:46:49.058361829 +0000 UTC m=+1372.270213264"
Apr 24 14:46:55.051495 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:46:55.051468 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"
Apr 24 14:48:57.346046 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:48:57.346017 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log"
Apr 24 14:48:57.349430 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:48:57.349391 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log"
Apr 24 14:53:57.370111 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:53:57.370075 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log"
Apr 24 14:53:57.374128 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:53:57.374102 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log"
Apr 24 14:54:22.026625 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:22.026580 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv"]
Apr 24 14:54:22.027092 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:22.026871 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" podUID="01c955f6-67b9-443e-b41d-f525cf09c69d" containerName="ensemble-graph-47293" containerID="cri-o://e5c3eee5cec300d7ccf0de08b42d6365728e885d155d10fc1d527df3304f5614" gracePeriod=30
Apr 24 14:54:24.911777 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:24.911731 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" podUID="01c955f6-67b9-443e-b41d-f525cf09c69d" containerName="ensemble-graph-47293" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:54:29.911900 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:29.911860 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" podUID="01c955f6-67b9-443e-b41d-f525cf09c69d" containerName="ensemble-graph-47293" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:54:34.912313 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:34.912275 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" podUID="01c955f6-67b9-443e-b41d-f525cf09c69d" containerName="ensemble-graph-47293" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:54:34.912733 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:34.912373 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv"
Apr 24 14:54:39.911842 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:39.911794 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" podUID="01c955f6-67b9-443e-b41d-f525cf09c69d" containerName="ensemble-graph-47293" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:54:44.911523 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:44.911475 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" podUID="01c955f6-67b9-443e-b41d-f525cf09c69d" containerName="ensemble-graph-47293" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:54:49.911592 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:49.911543 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" podUID="01c955f6-67b9-443e-b41d-f525cf09c69d" containerName="ensemble-graph-47293" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:54:52.663245 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:52.663210 2574 generic.go:358] "Generic (PLEG): container finished" podID="01c955f6-67b9-443e-b41d-f525cf09c69d" containerID="e5c3eee5cec300d7ccf0de08b42d6365728e885d155d10fc1d527df3304f5614" exitCode=0
Apr 24 14:54:52.663579 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:52.663271 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" event={"ID":"01c955f6-67b9-443e-b41d-f525cf09c69d","Type":"ContainerDied","Data":"e5c3eee5cec300d7ccf0de08b42d6365728e885d155d10fc1d527df3304f5614"}
Apr 24 14:54:52.663579 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:52.663295 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv" event={"ID":"01c955f6-67b9-443e-b41d-f525cf09c69d","Type":"ContainerDied","Data":"84936c200415d1026d69bdee5c1ad26308b6951a92eee8f7656908c8ed38a5d9"}
Apr 24 14:54:52.663579 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:52.663305 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84936c200415d1026d69bdee5c1ad26308b6951a92eee8f7656908c8ed38a5d9"
Apr 24 14:54:52.672919 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:52.672898 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv"
Apr 24 14:54:52.701968 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:52.701940 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01c955f6-67b9-443e-b41d-f525cf09c69d-openshift-service-ca-bundle\") pod \"01c955f6-67b9-443e-b41d-f525cf09c69d\" (UID: \"01c955f6-67b9-443e-b41d-f525cf09c69d\") "
Apr 24 14:54:52.702117 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:52.701996 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01c955f6-67b9-443e-b41d-f525cf09c69d-proxy-tls\") pod \"01c955f6-67b9-443e-b41d-f525cf09c69d\" (UID: \"01c955f6-67b9-443e-b41d-f525cf09c69d\") "
Apr 24 14:54:52.702315 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:52.702295 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01c955f6-67b9-443e-b41d-f525cf09c69d-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "01c955f6-67b9-443e-b41d-f525cf09c69d" (UID: "01c955f6-67b9-443e-b41d-f525cf09c69d"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:54:52.704111 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:52.704090 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01c955f6-67b9-443e-b41d-f525cf09c69d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "01c955f6-67b9-443e-b41d-f525cf09c69d" (UID: "01c955f6-67b9-443e-b41d-f525cf09c69d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:54:52.803560 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:52.803533 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01c955f6-67b9-443e-b41d-f525cf09c69d-openshift-service-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:54:52.803560 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:52.803561 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01c955f6-67b9-443e-b41d-f525cf09c69d-proxy-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:54:53.666241 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:53.666166 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv"
Apr 24 14:54:53.679950 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:53.679922 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv"]
Apr 24 14:54:53.682282 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:53.682257 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-47293-5d9c7b45db-4knmv"]
Apr 24 14:54:55.359657 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:55.359621 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01c955f6-67b9-443e-b41d-f525cf09c69d" path="/var/lib/kubelet/pods/01c955f6-67b9-443e-b41d-f525cf09c69d/volumes"
Apr 24 14:54:57.422247 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:54:57.422216 2574 scope.go:117] "RemoveContainer" containerID="e5c3eee5cec300d7ccf0de08b42d6365728e885d155d10fc1d527df3304f5614"
Apr 24 14:55:01.995659 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:01.995623 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"]
Apr 24 14:55:01.996136 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:01.995863 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw" podUID="929d313a-04fb-4610-b558-5000fa032769" containerName="sequence-graph-d5b84" containerID="cri-o://7408913db075ab69a68e8af03b8fc42de7b096c4b51548187c07482f5fd31eb3" gracePeriod=30
Apr 24 14:55:05.050790 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:05.050747 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw" podUID="929d313a-04fb-4610-b558-5000fa032769" containerName="sequence-graph-d5b84" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:55:10.050857 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:10.050809 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw" podUID="929d313a-04fb-4610-b558-5000fa032769" containerName="sequence-graph-d5b84" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:55:15.051047 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:15.051010 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw" podUID="929d313a-04fb-4610-b558-5000fa032769" containerName="sequence-graph-d5b84" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:55:15.051456 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:15.051122 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"
Apr 24 14:55:20.052348 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:20.052252 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw" podUID="929d313a-04fb-4610-b558-5000fa032769" containerName="sequence-graph-d5b84" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:55:25.049935 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:25.049890 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw" podUID="929d313a-04fb-4610-b558-5000fa032769" containerName="sequence-graph-d5b84" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:55:30.050676 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:30.050637 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw" podUID="929d313a-04fb-4610-b558-5000fa032769" containerName="sequence-graph-d5b84" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:55:32.143895 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.143871 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"
Apr 24 14:55:32.241966 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.241934 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/929d313a-04fb-4610-b558-5000fa032769-openshift-service-ca-bundle\") pod \"929d313a-04fb-4610-b558-5000fa032769\" (UID: \"929d313a-04fb-4610-b558-5000fa032769\") "
Apr 24 14:55:32.242144 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.242026 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/929d313a-04fb-4610-b558-5000fa032769-proxy-tls\") pod \"929d313a-04fb-4610-b558-5000fa032769\" (UID: \"929d313a-04fb-4610-b558-5000fa032769\") "
Apr 24 14:55:32.242279 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.242255 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/929d313a-04fb-4610-b558-5000fa032769-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "929d313a-04fb-4610-b558-5000fa032769" (UID: "929d313a-04fb-4610-b558-5000fa032769"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:55:32.244585 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.244559 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/929d313a-04fb-4610-b558-5000fa032769-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "929d313a-04fb-4610-b558-5000fa032769" (UID: "929d313a-04fb-4610-b558-5000fa032769"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:55:32.245238 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.245219 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"]
Apr 24 14:55:32.245631 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.245612 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="929d313a-04fb-4610-b558-5000fa032769" containerName="sequence-graph-d5b84"
Apr 24 14:55:32.245631 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.245630 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="929d313a-04fb-4610-b558-5000fa032769" containerName="sequence-graph-d5b84"
Apr 24 14:55:32.245805 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.245645 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01c955f6-67b9-443e-b41d-f525cf09c69d" containerName="ensemble-graph-47293"
Apr 24 14:55:32.245805 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.245651 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="01c955f6-67b9-443e-b41d-f525cf09c69d" containerName="ensemble-graph-47293"
Apr 24 14:55:32.245805 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.245738 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="01c955f6-67b9-443e-b41d-f525cf09c69d" containerName="ensemble-graph-47293"
Apr 24 14:55:32.245805 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.245748 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="929d313a-04fb-4610-b558-5000fa032769" containerName="sequence-graph-d5b84"
Apr 24 14:55:32.252317 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.252271 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"
Apr 24 14:55:32.254377 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.254354 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-b7286-serving-cert\""
Apr 24 14:55:32.254490 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.254354 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-b7286-kube-rbac-proxy-sar-config\""
Apr 24 14:55:32.254490 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.254461 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"]
Apr 24 14:55:32.343324 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.343293 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/929d313a-04fb-4610-b558-5000fa032769-openshift-service-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:55:32.343324 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.343320 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/929d313a-04fb-4610-b558-5000fa032769-proxy-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\""
Apr 24 14:55:32.444465 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.444434 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6389fa6-a79a-447d-88d0-2e5f87941fa3-openshift-service-ca-bundle\") pod \"splitter-graph-b7286-654f5769bf-b9kf6\" (UID: \"b6389fa6-a79a-447d-88d0-2e5f87941fa3\") " pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"
Apr 24 14:55:32.444465 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.444467 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6389fa6-a79a-447d-88d0-2e5f87941fa3-proxy-tls\") pod \"splitter-graph-b7286-654f5769bf-b9kf6\" (UID: \"b6389fa6-a79a-447d-88d0-2e5f87941fa3\") " pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"
Apr 24 14:55:32.545923 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.545816 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6389fa6-a79a-447d-88d0-2e5f87941fa3-openshift-service-ca-bundle\") pod \"splitter-graph-b7286-654f5769bf-b9kf6\" (UID: \"b6389fa6-a79a-447d-88d0-2e5f87941fa3\") " pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"
Apr 24 14:55:32.546100 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.546035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6389fa6-a79a-447d-88d0-2e5f87941fa3-proxy-tls\") pod \"splitter-graph-b7286-654f5769bf-b9kf6\" (UID: \"b6389fa6-a79a-447d-88d0-2e5f87941fa3\") " pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"
Apr 24 14:55:32.546211 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:55:32.546187 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-b7286-serving-cert: secret "splitter-graph-b7286-serving-cert" not found
Apr 24 14:55:32.546304 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:55:32.546288 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6389fa6-a79a-447d-88d0-2e5f87941fa3-proxy-tls podName:b6389fa6-a79a-447d-88d0-2e5f87941fa3 nodeName:}" failed. No retries permitted until 2026-04-24 14:55:33.046266631 +0000 UTC m=+1896.258117994 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b6389fa6-a79a-447d-88d0-2e5f87941fa3-proxy-tls") pod "splitter-graph-b7286-654f5769bf-b9kf6" (UID: "b6389fa6-a79a-447d-88d0-2e5f87941fa3") : secret "splitter-graph-b7286-serving-cert" not found
Apr 24 14:55:32.546531 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.546514 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6389fa6-a79a-447d-88d0-2e5f87941fa3-openshift-service-ca-bundle\") pod \"splitter-graph-b7286-654f5769bf-b9kf6\" (UID: \"b6389fa6-a79a-447d-88d0-2e5f87941fa3\") " pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"
Apr 24 14:55:32.797198 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.797107 2574 generic.go:358] "Generic (PLEG): container finished" podID="929d313a-04fb-4610-b558-5000fa032769" containerID="7408913db075ab69a68e8af03b8fc42de7b096c4b51548187c07482f5fd31eb3" exitCode=0
Apr 24 14:55:32.797198 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.797169 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"
Apr 24 14:55:32.797198 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.797191 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw" event={"ID":"929d313a-04fb-4610-b558-5000fa032769","Type":"ContainerDied","Data":"7408913db075ab69a68e8af03b8fc42de7b096c4b51548187c07482f5fd31eb3"}
Apr 24 14:55:32.797491 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.797225 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw" event={"ID":"929d313a-04fb-4610-b558-5000fa032769","Type":"ContainerDied","Data":"5aafdaf453703810be47ba7d3672fbeba1cf5efea7c09630283b068b8161e4d9"}
Apr 24 14:55:32.797491 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.797243 2574 scope.go:117] "RemoveContainer" containerID="7408913db075ab69a68e8af03b8fc42de7b096c4b51548187c07482f5fd31eb3"
Apr 24 14:55:32.806200 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.806179 2574 scope.go:117] "RemoveContainer" containerID="7408913db075ab69a68e8af03b8fc42de7b096c4b51548187c07482f5fd31eb3"
Apr 24 14:55:32.806790 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:55:32.806767 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7408913db075ab69a68e8af03b8fc42de7b096c4b51548187c07482f5fd31eb3\": container with ID starting with 7408913db075ab69a68e8af03b8fc42de7b096c4b51548187c07482f5fd31eb3 not found: ID does not exist" containerID="7408913db075ab69a68e8af03b8fc42de7b096c4b51548187c07482f5fd31eb3"
Apr 24 14:55:32.806988 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.806800 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7408913db075ab69a68e8af03b8fc42de7b096c4b51548187c07482f5fd31eb3"} err="failed to get container status \"7408913db075ab69a68e8af03b8fc42de7b096c4b51548187c07482f5fd31eb3\": rpc error: code = NotFound desc = could not find container \"7408913db075ab69a68e8af03b8fc42de7b096c4b51548187c07482f5fd31eb3\": container with ID starting with 7408913db075ab69a68e8af03b8fc42de7b096c4b51548187c07482f5fd31eb3 not found: ID does not exist"
Apr 24 14:55:32.816776 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.816753 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"]
Apr 24 14:55:32.819003 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:32.818983 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d5b84-69cb89b9ff-82nhw"]
Apr 24 14:55:33.050758 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:33.050665 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6389fa6-a79a-447d-88d0-2e5f87941fa3-proxy-tls\") pod \"splitter-graph-b7286-654f5769bf-b9kf6\" (UID: \"b6389fa6-a79a-447d-88d0-2e5f87941fa3\") " pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"
Apr 24 14:55:33.053038 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:33.053016 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6389fa6-a79a-447d-88d0-2e5f87941fa3-proxy-tls\") pod \"splitter-graph-b7286-654f5769bf-b9kf6\" (UID: \"b6389fa6-a79a-447d-88d0-2e5f87941fa3\") " pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"
Apr 24 14:55:33.163174 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:33.163139 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"
Apr 24 14:55:33.282054 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:33.282018 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"]
Apr 24 14:55:33.285127 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:55:33.285101 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6389fa6_a79a_447d_88d0_2e5f87941fa3.slice/crio-5b87ece7b6e9b7b0585c95346d02f64dc222161a43c285131ce3105fb3f09294 WatchSource:0}: Error finding container 5b87ece7b6e9b7b0585c95346d02f64dc222161a43c285131ce3105fb3f09294: Status 404 returned error can't find the container with id 5b87ece7b6e9b7b0585c95346d02f64dc222161a43c285131ce3105fb3f09294
Apr 24 14:55:33.286865 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:33.286848 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 14:55:33.359184 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:33.359156 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="929d313a-04fb-4610-b558-5000fa032769" path="/var/lib/kubelet/pods/929d313a-04fb-4610-b558-5000fa032769/volumes"
Apr 24 14:55:33.801427 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:33.801310 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6" event={"ID":"b6389fa6-a79a-447d-88d0-2e5f87941fa3","Type":"ContainerStarted","Data":"b8194f698b41c2937e61f24e503b2543f30cb8924b90445aa81293df8f867d42"}
Apr 24 14:55:33.801427 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:33.801349 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6" event={"ID":"b6389fa6-a79a-447d-88d0-2e5f87941fa3","Type":"ContainerStarted","Data":"5b87ece7b6e9b7b0585c95346d02f64dc222161a43c285131ce3105fb3f09294"}
Apr 24 14:55:33.801662 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:33.801431 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"
Apr 24 14:55:33.815675 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:33.815623 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6" podStartSLOduration=1.81560888 podStartE2EDuration="1.81560888s" podCreationTimestamp="2026-04-24 14:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:55:33.815131198 +0000 UTC m=+1897.026982573" watchObservedRunningTime="2026-04-24 14:55:33.81560888 +0000 UTC m=+1897.027460268"
Apr 24 14:55:39.810987 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:39.810951 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"
Apr 24 14:55:42.302730 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:42.302701 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"]
Apr 24 14:55:42.303144 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:42.302941 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6" podUID="b6389fa6-a79a-447d-88d0-2e5f87941fa3" containerName="splitter-graph-b7286" containerID="cri-o://b8194f698b41c2937e61f24e503b2543f30cb8924b90445aa81293df8f867d42" gracePeriod=30
Apr 24 14:55:44.809929 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:44.809881 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6" podUID="b6389fa6-a79a-447d-88d0-2e5f87941fa3" containerName="splitter-graph-b7286" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:55:49.810313 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:49.810271 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6" podUID="b6389fa6-a79a-447d-88d0-2e5f87941fa3" containerName="splitter-graph-b7286" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:55:54.810189 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:54.810153 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6" podUID="b6389fa6-a79a-447d-88d0-2e5f87941fa3" containerName="splitter-graph-b7286" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:55:54.810554 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:54.810263 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"
Apr 24 14:55:59.809640 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:55:59.809594 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6" podUID="b6389fa6-a79a-447d-88d0-2e5f87941fa3" containerName="splitter-graph-b7286" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:56:04.809894 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:04.809857 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6" podUID="b6389fa6-a79a-447d-88d0-2e5f87941fa3" containerName="splitter-graph-b7286" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 14:56:09.809852 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:09.809809 2574 prober.go:120] "Probe failed" probeType="Readiness"
pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6" podUID="b6389fa6-a79a-447d-88d0-2e5f87941fa3" containerName="splitter-graph-b7286" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:56:12.202349 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.202312 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr"] Apr 24 14:56:12.206806 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.206788 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" Apr 24 14:56:12.208820 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.208800 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-67b80-kube-rbac-proxy-sar-config\"" Apr 24 14:56:12.208883 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.208799 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-67b80-serving-cert\"" Apr 24 14:56:12.214688 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.214545 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr"] Apr 24 14:56:12.261167 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.261134 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20-proxy-tls\") pod \"switch-graph-67b80-76b69489d8-fzvmr\" (UID: \"9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20\") " pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" Apr 24 14:56:12.261313 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.261215 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20-openshift-service-ca-bundle\") pod \"switch-graph-67b80-76b69489d8-fzvmr\" (UID: \"9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20\") " pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" Apr 24 14:56:12.334899 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:56:12.334866 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6389fa6_a79a_447d_88d0_2e5f87941fa3.slice/crio-conmon-b8194f698b41c2937e61f24e503b2543f30cb8924b90445aa81293df8f867d42.scope\": RecentStats: unable to find data in memory cache]" Apr 24 14:56:12.335047 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:56:12.335024 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6389fa6_a79a_447d_88d0_2e5f87941fa3.slice/crio-b8194f698b41c2937e61f24e503b2543f30cb8924b90445aa81293df8f867d42.scope\": RecentStats: unable to find data in memory cache]" Apr 24 14:56:12.362003 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.361967 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20-proxy-tls\") pod \"switch-graph-67b80-76b69489d8-fzvmr\" (UID: \"9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20\") " pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" Apr 24 14:56:12.362133 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.362051 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20-openshift-service-ca-bundle\") pod \"switch-graph-67b80-76b69489d8-fzvmr\" (UID: \"9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20\") " pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" Apr 
24 14:56:12.362133 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:56:12.362094 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-67b80-serving-cert: secret "switch-graph-67b80-serving-cert" not found Apr 24 14:56:12.362218 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:56:12.362176 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20-proxy-tls podName:9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20 nodeName:}" failed. No retries permitted until 2026-04-24 14:56:12.862157765 +0000 UTC m=+1936.074009118 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20-proxy-tls") pod "switch-graph-67b80-76b69489d8-fzvmr" (UID: "9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20") : secret "switch-graph-67b80-serving-cert" not found Apr 24 14:56:12.362726 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.362702 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20-openshift-service-ca-bundle\") pod \"switch-graph-67b80-76b69489d8-fzvmr\" (UID: \"9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20\") " pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" Apr 24 14:56:12.454727 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.454673 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6" Apr 24 14:56:12.563519 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.563487 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6389fa6-a79a-447d-88d0-2e5f87941fa3-proxy-tls\") pod \"b6389fa6-a79a-447d-88d0-2e5f87941fa3\" (UID: \"b6389fa6-a79a-447d-88d0-2e5f87941fa3\") " Apr 24 14:56:12.563666 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.563535 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6389fa6-a79a-447d-88d0-2e5f87941fa3-openshift-service-ca-bundle\") pod \"b6389fa6-a79a-447d-88d0-2e5f87941fa3\" (UID: \"b6389fa6-a79a-447d-88d0-2e5f87941fa3\") " Apr 24 14:56:12.563915 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.563893 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6389fa6-a79a-447d-88d0-2e5f87941fa3-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b6389fa6-a79a-447d-88d0-2e5f87941fa3" (UID: "b6389fa6-a79a-447d-88d0-2e5f87941fa3"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:56:12.565372 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.565349 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6389fa6-a79a-447d-88d0-2e5f87941fa3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b6389fa6-a79a-447d-88d0-2e5f87941fa3" (UID: "b6389fa6-a79a-447d-88d0-2e5f87941fa3"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:56:12.665121 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.665097 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6389fa6-a79a-447d-88d0-2e5f87941fa3-proxy-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:56:12.665121 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.665119 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6389fa6-a79a-447d-88d0-2e5f87941fa3-openshift-service-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 14:56:12.867479 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.867445 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20-proxy-tls\") pod \"switch-graph-67b80-76b69489d8-fzvmr\" (UID: \"9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20\") " pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" Apr 24 14:56:12.869753 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.869728 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20-proxy-tls\") pod \"switch-graph-67b80-76b69489d8-fzvmr\" (UID: \"9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20\") " pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" Apr 24 14:56:12.947695 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.947658 2574 generic.go:358] "Generic (PLEG): container finished" podID="b6389fa6-a79a-447d-88d0-2e5f87941fa3" containerID="b8194f698b41c2937e61f24e503b2543f30cb8924b90445aa81293df8f867d42" exitCode=0 Apr 24 14:56:12.947868 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.947730 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6" Apr 24 14:56:12.947868 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.947739 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6" event={"ID":"b6389fa6-a79a-447d-88d0-2e5f87941fa3","Type":"ContainerDied","Data":"b8194f698b41c2937e61f24e503b2543f30cb8924b90445aa81293df8f867d42"} Apr 24 14:56:12.947868 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.947773 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6" event={"ID":"b6389fa6-a79a-447d-88d0-2e5f87941fa3","Type":"ContainerDied","Data":"5b87ece7b6e9b7b0585c95346d02f64dc222161a43c285131ce3105fb3f09294"} Apr 24 14:56:12.947868 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.947789 2574 scope.go:117] "RemoveContainer" containerID="b8194f698b41c2937e61f24e503b2543f30cb8924b90445aa81293df8f867d42" Apr 24 14:56:12.956378 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.956357 2574 scope.go:117] "RemoveContainer" containerID="b8194f698b41c2937e61f24e503b2543f30cb8924b90445aa81293df8f867d42" Apr 24 14:56:12.956647 ip-10-0-131-216 kubenswrapper[2574]: E0424 14:56:12.956626 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8194f698b41c2937e61f24e503b2543f30cb8924b90445aa81293df8f867d42\": container with ID starting with b8194f698b41c2937e61f24e503b2543f30cb8924b90445aa81293df8f867d42 not found: ID does not exist" containerID="b8194f698b41c2937e61f24e503b2543f30cb8924b90445aa81293df8f867d42" Apr 24 14:56:12.956734 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.956653 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8194f698b41c2937e61f24e503b2543f30cb8924b90445aa81293df8f867d42"} err="failed to get container status 
\"b8194f698b41c2937e61f24e503b2543f30cb8924b90445aa81293df8f867d42\": rpc error: code = NotFound desc = could not find container \"b8194f698b41c2937e61f24e503b2543f30cb8924b90445aa81293df8f867d42\": container with ID starting with b8194f698b41c2937e61f24e503b2543f30cb8924b90445aa81293df8f867d42 not found: ID does not exist" Apr 24 14:56:12.967489 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.967430 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"] Apr 24 14:56:12.971198 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:12.971179 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-b7286-654f5769bf-b9kf6"] Apr 24 14:56:13.118871 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:13.118789 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" Apr 24 14:56:13.236811 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:13.236785 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr"] Apr 24 14:56:13.239097 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:56:13.239073 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ba5a709_c03c_4b4a_b2aa_92bf08bf3e20.slice/crio-035daf5ca617ebce12cd708fc92f356db4b0f61cea7a41ae548e43786bd48538 WatchSource:0}: Error finding container 035daf5ca617ebce12cd708fc92f356db4b0f61cea7a41ae548e43786bd48538: Status 404 returned error can't find the container with id 035daf5ca617ebce12cd708fc92f356db4b0f61cea7a41ae548e43786bd48538 Apr 24 14:56:13.358653 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:13.358620 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6389fa6-a79a-447d-88d0-2e5f87941fa3" path="/var/lib/kubelet/pods/b6389fa6-a79a-447d-88d0-2e5f87941fa3/volumes" Apr 24 14:56:13.955899 
ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:13.955863 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" event={"ID":"9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20","Type":"ContainerStarted","Data":"e1a28f7a2c23671067344c92b215bf259a30a3d617d7d92a691884c33c02a492"} Apr 24 14:56:13.955899 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:13.955897 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" event={"ID":"9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20","Type":"ContainerStarted","Data":"035daf5ca617ebce12cd708fc92f356db4b0f61cea7a41ae548e43786bd48538"} Apr 24 14:56:13.956158 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:13.955972 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" Apr 24 14:56:13.971054 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:13.971011 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" podStartSLOduration=1.9709993369999999 podStartE2EDuration="1.970999337s" podCreationTimestamp="2026-04-24 14:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:56:13.969338397 +0000 UTC m=+1937.181189772" watchObservedRunningTime="2026-04-24 14:56:13.970999337 +0000 UTC m=+1937.182850712" Apr 24 14:56:19.965270 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:19.965239 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" Apr 24 14:56:52.509950 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:52.509862 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7"] Apr 24 14:56:52.510368 ip-10-0-131-216 kubenswrapper[2574]: I0424 
14:56:52.510258 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6389fa6-a79a-447d-88d0-2e5f87941fa3" containerName="splitter-graph-b7286" Apr 24 14:56:52.510368 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:52.510272 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6389fa6-a79a-447d-88d0-2e5f87941fa3" containerName="splitter-graph-b7286" Apr 24 14:56:52.510368 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:52.510337 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6389fa6-a79a-447d-88d0-2e5f87941fa3" containerName="splitter-graph-b7286" Apr 24 14:56:52.513881 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:52.513854 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" Apr 24 14:56:52.515757 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:52.515703 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-9e32c-kube-rbac-proxy-sar-config\"" Apr 24 14:56:52.515896 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:52.515851 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-9e32c-serving-cert\"" Apr 24 14:56:52.523816 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:52.523787 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7"] Apr 24 14:56:52.587432 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:52.587382 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de515517-c96a-4511-ae1b-8f5dda616c37-openshift-service-ca-bundle\") pod \"splitter-graph-9e32c-776f64f998-nnrn7\" (UID: \"de515517-c96a-4511-ae1b-8f5dda616c37\") " pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" Apr 24 
14:56:52.587597 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:52.587466 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de515517-c96a-4511-ae1b-8f5dda616c37-proxy-tls\") pod \"splitter-graph-9e32c-776f64f998-nnrn7\" (UID: \"de515517-c96a-4511-ae1b-8f5dda616c37\") " pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" Apr 24 14:56:52.688807 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:52.688772 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de515517-c96a-4511-ae1b-8f5dda616c37-openshift-service-ca-bundle\") pod \"splitter-graph-9e32c-776f64f998-nnrn7\" (UID: \"de515517-c96a-4511-ae1b-8f5dda616c37\") " pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" Apr 24 14:56:52.688978 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:52.688834 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de515517-c96a-4511-ae1b-8f5dda616c37-proxy-tls\") pod \"splitter-graph-9e32c-776f64f998-nnrn7\" (UID: \"de515517-c96a-4511-ae1b-8f5dda616c37\") " pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" Apr 24 14:56:52.689413 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:52.689374 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de515517-c96a-4511-ae1b-8f5dda616c37-openshift-service-ca-bundle\") pod \"splitter-graph-9e32c-776f64f998-nnrn7\" (UID: \"de515517-c96a-4511-ae1b-8f5dda616c37\") " pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" Apr 24 14:56:52.691133 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:52.691116 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/de515517-c96a-4511-ae1b-8f5dda616c37-proxy-tls\") pod \"splitter-graph-9e32c-776f64f998-nnrn7\" (UID: \"de515517-c96a-4511-ae1b-8f5dda616c37\") " pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" Apr 24 14:56:52.829200 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:52.829168 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" Apr 24 14:56:52.951190 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:52.950904 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7"] Apr 24 14:56:52.953879 ip-10-0-131-216 kubenswrapper[2574]: W0424 14:56:52.953849 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde515517_c96a_4511_ae1b_8f5dda616c37.slice/crio-f850c04e5729106326b819cd629094bd5c22d22c04d607b9cb23ab6b4a45b636 WatchSource:0}: Error finding container f850c04e5729106326b819cd629094bd5c22d22c04d607b9cb23ab6b4a45b636: Status 404 returned error can't find the container with id f850c04e5729106326b819cd629094bd5c22d22c04d607b9cb23ab6b4a45b636 Apr 24 14:56:53.087109 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:53.087017 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" event={"ID":"de515517-c96a-4511-ae1b-8f5dda616c37","Type":"ContainerStarted","Data":"7004ce92beb8a46e94e0f485b28cc0a8da6f2019ff1ec52bb0139dfc8d761b24"} Apr 24 14:56:53.087109 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:53.087055 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" event={"ID":"de515517-c96a-4511-ae1b-8f5dda616c37","Type":"ContainerStarted","Data":"f850c04e5729106326b819cd629094bd5c22d22c04d607b9cb23ab6b4a45b636"} Apr 24 14:56:53.087290 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:53.087148 
2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" Apr 24 14:56:53.101747 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:53.101698 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" podStartSLOduration=1.101682547 podStartE2EDuration="1.101682547s" podCreationTimestamp="2026-04-24 14:56:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:56:53.100037878 +0000 UTC m=+1976.311889253" watchObservedRunningTime="2026-04-24 14:56:53.101682547 +0000 UTC m=+1976.313533922" Apr 24 14:56:59.096075 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:56:59.096046 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" Apr 24 14:58:57.393741 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:58:57.393703 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 14:58:57.398367 ip-10-0-131-216 kubenswrapper[2574]: I0424 14:58:57.398343 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 15:03:57.417989 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:03:57.417873 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 15:03:57.423374 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:03:57.423354 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 15:05:07.194510 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:07.194477 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7"] Apr 24 15:05:07.195096 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:07.194721 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" podUID="de515517-c96a-4511-ae1b-8f5dda616c37" containerName="splitter-graph-9e32c" containerID="cri-o://7004ce92beb8a46e94e0f485b28cc0a8da6f2019ff1ec52bb0139dfc8d761b24" gracePeriod=30 Apr 24 15:05:09.094254 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:09.094217 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" podUID="de515517-c96a-4511-ae1b-8f5dda616c37" containerName="splitter-graph-9e32c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 15:05:14.093858 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:14.093819 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" podUID="de515517-c96a-4511-ae1b-8f5dda616c37" containerName="splitter-graph-9e32c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 15:05:19.094448 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:19.094388 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" podUID="de515517-c96a-4511-ae1b-8f5dda616c37" containerName="splitter-graph-9e32c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 15:05:19.094831 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:19.094515 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" Apr 24 15:05:24.093804 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:24.093763 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" podUID="de515517-c96a-4511-ae1b-8f5dda616c37" containerName="splitter-graph-9e32c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 15:05:29.093607 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:29.093567 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" podUID="de515517-c96a-4511-ae1b-8f5dda616c37" containerName="splitter-graph-9e32c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 15:05:34.093829 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:34.093783 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" podUID="de515517-c96a-4511-ae1b-8f5dda616c37" containerName="splitter-graph-9e32c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 15:05:37.335165 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.335137 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" Apr 24 15:05:37.407358 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.407320 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de515517-c96a-4511-ae1b-8f5dda616c37-openshift-service-ca-bundle\") pod \"de515517-c96a-4511-ae1b-8f5dda616c37\" (UID: \"de515517-c96a-4511-ae1b-8f5dda616c37\") " Apr 24 15:05:37.407358 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.407362 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de515517-c96a-4511-ae1b-8f5dda616c37-proxy-tls\") pod \"de515517-c96a-4511-ae1b-8f5dda616c37\" (UID: \"de515517-c96a-4511-ae1b-8f5dda616c37\") " Apr 24 15:05:37.407978 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.407948 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de515517-c96a-4511-ae1b-8f5dda616c37-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "de515517-c96a-4511-ae1b-8f5dda616c37" (UID: "de515517-c96a-4511-ae1b-8f5dda616c37"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 15:05:37.409724 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.409699 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de515517-c96a-4511-ae1b-8f5dda616c37-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "de515517-c96a-4511-ae1b-8f5dda616c37" (UID: "de515517-c96a-4511-ae1b-8f5dda616c37"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 15:05:37.508917 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.508832 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de515517-c96a-4511-ae1b-8f5dda616c37-openshift-service-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 15:05:37.508917 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.508873 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de515517-c96a-4511-ae1b-8f5dda616c37-proxy-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 15:05:37.870271 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.870233 2574 generic.go:358] "Generic (PLEG): container finished" podID="de515517-c96a-4511-ae1b-8f5dda616c37" containerID="7004ce92beb8a46e94e0f485b28cc0a8da6f2019ff1ec52bb0139dfc8d761b24" exitCode=0 Apr 24 15:05:37.870472 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.870305 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" Apr 24 15:05:37.870472 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.870324 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" event={"ID":"de515517-c96a-4511-ae1b-8f5dda616c37","Type":"ContainerDied","Data":"7004ce92beb8a46e94e0f485b28cc0a8da6f2019ff1ec52bb0139dfc8d761b24"} Apr 24 15:05:37.870472 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.870366 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7" event={"ID":"de515517-c96a-4511-ae1b-8f5dda616c37","Type":"ContainerDied","Data":"f850c04e5729106326b819cd629094bd5c22d22c04d607b9cb23ab6b4a45b636"} Apr 24 15:05:37.870472 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.870386 2574 scope.go:117] "RemoveContainer" containerID="7004ce92beb8a46e94e0f485b28cc0a8da6f2019ff1ec52bb0139dfc8d761b24" Apr 24 15:05:37.881483 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.879365 2574 scope.go:117] "RemoveContainer" containerID="7004ce92beb8a46e94e0f485b28cc0a8da6f2019ff1ec52bb0139dfc8d761b24" Apr 24 15:05:37.881483 ip-10-0-131-216 kubenswrapper[2574]: E0424 15:05:37.881444 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7004ce92beb8a46e94e0f485b28cc0a8da6f2019ff1ec52bb0139dfc8d761b24\": container with ID starting with 7004ce92beb8a46e94e0f485b28cc0a8da6f2019ff1ec52bb0139dfc8d761b24 not found: ID does not exist" containerID="7004ce92beb8a46e94e0f485b28cc0a8da6f2019ff1ec52bb0139dfc8d761b24" Apr 24 15:05:37.881483 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.881472 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7004ce92beb8a46e94e0f485b28cc0a8da6f2019ff1ec52bb0139dfc8d761b24"} err="failed to get container status 
\"7004ce92beb8a46e94e0f485b28cc0a8da6f2019ff1ec52bb0139dfc8d761b24\": rpc error: code = NotFound desc = could not find container \"7004ce92beb8a46e94e0f485b28cc0a8da6f2019ff1ec52bb0139dfc8d761b24\": container with ID starting with 7004ce92beb8a46e94e0f485b28cc0a8da6f2019ff1ec52bb0139dfc8d761b24 not found: ID does not exist" Apr 24 15:05:37.889574 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.889555 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7"] Apr 24 15:05:37.893515 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:37.893494 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9e32c-776f64f998-nnrn7"] Apr 24 15:05:39.359781 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:05:39.359750 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de515517-c96a-4511-ae1b-8f5dda616c37" path="/var/lib/kubelet/pods/de515517-c96a-4511-ae1b-8f5dda616c37/volumes" Apr 24 15:08:57.444318 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:08:57.444192 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 15:08:57.449163 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:08:57.449143 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 15:12:31.464583 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:31.464549 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr"] Apr 24 15:12:31.465137 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:31.464849 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" podUID="9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20" 
containerName="switch-graph-67b80" containerID="cri-o://e1a28f7a2c23671067344c92b215bf259a30a3d617d7d92a691884c33c02a492" gracePeriod=30 Apr 24 15:12:34.963618 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:34.963577 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" podUID="9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20" containerName="switch-graph-67b80" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 15:12:39.964122 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:39.964083 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" podUID="9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20" containerName="switch-graph-67b80" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 15:12:44.963858 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:44.963818 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" podUID="9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20" containerName="switch-graph-67b80" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 15:12:44.964224 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:44.963917 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" Apr 24 15:12:46.936323 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:46.936294 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-67b80-76b69489d8-fzvmr_9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20/switch-graph-67b80/0.log" Apr 24 15:12:47.696323 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:47.696287 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-67b80-76b69489d8-fzvmr_9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20/switch-graph-67b80/0.log" Apr 24 15:12:48.442039 ip-10-0-131-216 kubenswrapper[2574]: 
I0424 15:12:48.442008 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-67b80-76b69489d8-fzvmr_9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20/switch-graph-67b80/0.log" Apr 24 15:12:49.170723 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:49.170697 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-67b80-76b69489d8-fzvmr_9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20/switch-graph-67b80/0.log" Apr 24 15:12:49.887875 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:49.887848 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-67b80-76b69489d8-fzvmr_9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20/switch-graph-67b80/0.log" Apr 24 15:12:49.963703 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:49.963670 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" podUID="9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20" containerName="switch-graph-67b80" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 15:12:50.611534 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:50.611501 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-67b80-76b69489d8-fzvmr_9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20/switch-graph-67b80/0.log" Apr 24 15:12:51.310142 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:51.310114 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-67b80-76b69489d8-fzvmr_9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20/switch-graph-67b80/0.log" Apr 24 15:12:52.016439 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:52.016392 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-67b80-76b69489d8-fzvmr_9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20/switch-graph-67b80/0.log" Apr 24 15:12:52.739752 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:52.739724 2574 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-67b80-76b69489d8-fzvmr_9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20/switch-graph-67b80/0.log" Apr 24 15:12:53.463053 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:53.463025 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-67b80-76b69489d8-fzvmr_9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20/switch-graph-67b80/0.log" Apr 24 15:12:54.206563 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:54.206527 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-67b80-76b69489d8-fzvmr_9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20/switch-graph-67b80/0.log" Apr 24 15:12:54.964005 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:54.963968 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" podUID="9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20" containerName="switch-graph-67b80" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 15:12:54.973952 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:54.973927 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-67b80-76b69489d8-fzvmr_9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20/switch-graph-67b80/0.log" Apr 24 15:12:59.901440 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:59.901412 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bxsfm_5a7584ad-141f-4b72-9c3f-f44d38325431/global-pull-secret-syncer/0.log" Apr 24 15:12:59.963559 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:12:59.963518 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" podUID="9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20" containerName="switch-graph-67b80" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 15:13:00.023902 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:00.023869 2574 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-kdc8j_0693b486-a773-4145-885d-daf067f39c8c/konnectivity-agent/0.log" Apr 24 15:13:00.117155 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:00.117129 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-216.ec2.internal_d10d73caf323405e00defaf97a76c78f/haproxy/0.log" Apr 24 15:13:01.626474 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:01.626451 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" Apr 24 15:13:01.719459 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:01.719359 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20-proxy-tls\") pod \"9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20\" (UID: \"9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20\") " Apr 24 15:13:01.719616 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:01.719503 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20-openshift-service-ca-bundle\") pod \"9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20\" (UID: \"9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20\") " Apr 24 15:13:01.719860 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:01.719834 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20" (UID: "9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 15:13:01.721311 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:01.721294 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20" (UID: "9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 15:13:01.819965 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:01.819926 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20-openshift-service-ca-bundle\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 15:13:01.819965 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:01.819960 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20-proxy-tls\") on node \"ip-10-0-131-216.ec2.internal\" DevicePath \"\"" Apr 24 15:13:02.330958 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:02.330920 2574 generic.go:358] "Generic (PLEG): container finished" podID="9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20" containerID="e1a28f7a2c23671067344c92b215bf259a30a3d617d7d92a691884c33c02a492" exitCode=0 Apr 24 15:13:02.331170 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:02.331005 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" Apr 24 15:13:02.331170 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:02.331018 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" event={"ID":"9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20","Type":"ContainerDied","Data":"e1a28f7a2c23671067344c92b215bf259a30a3d617d7d92a691884c33c02a492"} Apr 24 15:13:02.331170 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:02.331066 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr" event={"ID":"9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20","Type":"ContainerDied","Data":"035daf5ca617ebce12cd708fc92f356db4b0f61cea7a41ae548e43786bd48538"} Apr 24 15:13:02.331170 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:02.331086 2574 scope.go:117] "RemoveContainer" containerID="e1a28f7a2c23671067344c92b215bf259a30a3d617d7d92a691884c33c02a492" Apr 24 15:13:02.340044 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:02.340028 2574 scope.go:117] "RemoveContainer" containerID="e1a28f7a2c23671067344c92b215bf259a30a3d617d7d92a691884c33c02a492" Apr 24 15:13:02.340295 ip-10-0-131-216 kubenswrapper[2574]: E0424 15:13:02.340269 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1a28f7a2c23671067344c92b215bf259a30a3d617d7d92a691884c33c02a492\": container with ID starting with e1a28f7a2c23671067344c92b215bf259a30a3d617d7d92a691884c33c02a492 not found: ID does not exist" containerID="e1a28f7a2c23671067344c92b215bf259a30a3d617d7d92a691884c33c02a492" Apr 24 15:13:02.340381 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:02.340300 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a28f7a2c23671067344c92b215bf259a30a3d617d7d92a691884c33c02a492"} err="failed to get container status 
\"e1a28f7a2c23671067344c92b215bf259a30a3d617d7d92a691884c33c02a492\": rpc error: code = NotFound desc = could not find container \"e1a28f7a2c23671067344c92b215bf259a30a3d617d7d92a691884c33c02a492\": container with ID starting with e1a28f7a2c23671067344c92b215bf259a30a3d617d7d92a691884c33c02a492 not found: ID does not exist" Apr 24 15:13:02.351335 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:02.351308 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr"] Apr 24 15:13:02.352809 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:02.352786 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-67b80-76b69489d8-fzvmr"] Apr 24 15:13:03.359888 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:03.359856 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20" path="/var/lib/kubelet/pods/9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20/volumes" Apr 24 15:13:03.504310 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:03.504287 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8715f464-8cc2-459a-8616-97623080dd16/alertmanager/0.log" Apr 24 15:13:03.529049 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:03.529023 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8715f464-8cc2-459a-8616-97623080dd16/config-reloader/0.log" Apr 24 15:13:03.555601 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:03.555573 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8715f464-8cc2-459a-8616-97623080dd16/kube-rbac-proxy-web/0.log" Apr 24 15:13:03.579526 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:03.579496 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8715f464-8cc2-459a-8616-97623080dd16/kube-rbac-proxy/0.log" Apr 24 
15:13:03.602092 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:03.602069 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8715f464-8cc2-459a-8616-97623080dd16/kube-rbac-proxy-metric/0.log" Apr 24 15:13:03.628383 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:03.628317 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8715f464-8cc2-459a-8616-97623080dd16/prom-label-proxy/0.log" Apr 24 15:13:03.652705 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:03.652681 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8715f464-8cc2-459a-8616-97623080dd16/init-config-reloader/0.log" Apr 24 15:13:03.729008 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:03.728978 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-drv2m_7d61f999-ef5b-4a64-b56f-54f94755779c/kube-state-metrics/0.log" Apr 24 15:13:03.752527 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:03.752500 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-drv2m_7d61f999-ef5b-4a64-b56f-54f94755779c/kube-rbac-proxy-main/0.log" Apr 24 15:13:03.775907 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:03.775884 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-drv2m_7d61f999-ef5b-4a64-b56f-54f94755779c/kube-rbac-proxy-self/0.log" Apr 24 15:13:03.826295 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:03.826269 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-cltlb_2539fa5c-3160-43bd-a351-0184602b72e3/monitoring-plugin/0.log" Apr 24 15:13:04.016256 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:04.016181 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-xrfbk_990c1e6d-4603-492a-b0d1-b0d498ef3c6e/node-exporter/0.log" Apr 24 15:13:04.035376 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:04.035354 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xrfbk_990c1e6d-4603-492a-b0d1-b0d498ef3c6e/kube-rbac-proxy/0.log" Apr 24 15:13:04.054932 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:04.054906 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xrfbk_990c1e6d-4603-492a-b0d1-b0d498ef3c6e/init-textfile/0.log" Apr 24 15:13:04.084701 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:04.084682 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t8fcs_ca3432a7-7fcd-4793-933f-b84d886dc761/kube-rbac-proxy-main/0.log" Apr 24 15:13:04.105324 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:04.105302 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t8fcs_ca3432a7-7fcd-4793-933f-b84d886dc761/kube-rbac-proxy-self/0.log" Apr 24 15:13:04.129324 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:04.129303 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t8fcs_ca3432a7-7fcd-4793-933f-b84d886dc761/openshift-state-metrics/0.log" Apr 24 15:13:04.337943 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:04.337916 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ls5jw_9374d6dc-31b7-464b-a614-4cd5ce83fdbb/prometheus-operator/0.log" Apr 24 15:13:04.355381 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:04.355353 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ls5jw_9374d6dc-31b7-464b-a614-4cd5ce83fdbb/kube-rbac-proxy/0.log" Apr 24 15:13:04.397373 
ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:04.397347 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-z2grl_2a817927-9d20-4e56-a0bf-0223603b5b85/prometheus-operator-admission-webhook/0.log" Apr 24 15:13:05.821809 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:05.821769 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-hwdq2_0ad0da81-ab22-438d-911a-36e1a74dba1f/networking-console-plugin/0.log" Apr 24 15:13:06.261809 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:06.261778 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/2.log" Apr 24 15:13:06.265841 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:06.265817 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v7trz_ff3b99d4-3afa-4687-b6b7-7d3526edbcf4/console-operator/3.log" Apr 24 15:13:06.646148 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:06.646120 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75fdf897bc-4t9gp_ce5e9ad3-fbed-43af-a12c-82685ad45427/console/0.log" Apr 24 15:13:06.676238 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:06.676213 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-xcrgl_149cb833-9aef-4e87-9532-449279ed8f7e/download-server/0.log" Apr 24 15:13:07.062979 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.062950 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-nx84z_c32378d6-79f4-4462-a6bc-310eaafe2cac/volume-data-source-validator/0.log" Apr 24 15:13:07.682079 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.682048 2574 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-dns_dns-default-2ggts_3f0062e0-6c81-4d0d-a829-f8f572d6038e/dns/0.log" Apr 24 15:13:07.709577 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.709538 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2ggts_3f0062e0-6c81-4d0d-a829-f8f572d6038e/kube-rbac-proxy/0.log" Apr 24 15:13:07.727065 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.727037 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck"] Apr 24 15:13:07.727381 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.727369 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20" containerName="switch-graph-67b80" Apr 24 15:13:07.727457 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.727382 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20" containerName="switch-graph-67b80" Apr 24 15:13:07.727457 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.727420 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de515517-c96a-4511-ae1b-8f5dda616c37" containerName="splitter-graph-9e32c" Apr 24 15:13:07.727457 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.727426 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="de515517-c96a-4511-ae1b-8f5dda616c37" containerName="splitter-graph-9e32c" Apr 24 15:13:07.727568 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.727482 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ba5a709-c03c-4b4a-b2aa-92bf08bf3e20" containerName="switch-graph-67b80" Apr 24 15:13:07.727568 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.727492 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="de515517-c96a-4511-ae1b-8f5dda616c37" containerName="splitter-graph-9e32c" Apr 24 15:13:07.730525 ip-10-0-131-216 kubenswrapper[2574]: I0424 
15:13:07.730506 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck" Apr 24 15:13:07.732384 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.732363 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8f8q8\"/\"default-dockercfg-p8tcf\"" Apr 24 15:13:07.732501 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.732384 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8f8q8\"/\"openshift-service-ca.crt\"" Apr 24 15:13:07.732913 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.732898 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8f8q8\"/\"kube-root-ca.crt\"" Apr 24 15:13:07.738764 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.738742 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck"] Apr 24 15:13:07.867932 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.867894 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/606a75d8-5a68-47b4-99a1-b62b5a7d6f70-podres\") pod \"perf-node-gather-daemonset-sllck\" (UID: \"606a75d8-5a68-47b4-99a1-b62b5a7d6f70\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck" Apr 24 15:13:07.867932 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.867934 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/606a75d8-5a68-47b4-99a1-b62b5a7d6f70-sys\") pod \"perf-node-gather-daemonset-sllck\" (UID: \"606a75d8-5a68-47b4-99a1-b62b5a7d6f70\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck" Apr 24 15:13:07.868163 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.867953 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/606a75d8-5a68-47b4-99a1-b62b5a7d6f70-proc\") pod \"perf-node-gather-daemonset-sllck\" (UID: \"606a75d8-5a68-47b4-99a1-b62b5a7d6f70\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck" Apr 24 15:13:07.868163 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.868051 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/606a75d8-5a68-47b4-99a1-b62b5a7d6f70-lib-modules\") pod \"perf-node-gather-daemonset-sllck\" (UID: \"606a75d8-5a68-47b4-99a1-b62b5a7d6f70\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck" Apr 24 15:13:07.868163 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.868105 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twmrj\" (UniqueName: \"kubernetes.io/projected/606a75d8-5a68-47b4-99a1-b62b5a7d6f70-kube-api-access-twmrj\") pod \"perf-node-gather-daemonset-sllck\" (UID: \"606a75d8-5a68-47b4-99a1-b62b5a7d6f70\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck" Apr 24 15:13:07.880880 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.880849 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p8qh5_44a82b31-abfc-4f70-a1e3-54ed41d48cf7/dns-node-resolver/0.log" Apr 24 15:13:07.969085 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.968999 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/606a75d8-5a68-47b4-99a1-b62b5a7d6f70-podres\") pod \"perf-node-gather-daemonset-sllck\" (UID: \"606a75d8-5a68-47b4-99a1-b62b5a7d6f70\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck" Apr 24 15:13:07.969085 ip-10-0-131-216 kubenswrapper[2574]: I0424 
15:13:07.969037 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/606a75d8-5a68-47b4-99a1-b62b5a7d6f70-sys\") pod \"perf-node-gather-daemonset-sllck\" (UID: \"606a75d8-5a68-47b4-99a1-b62b5a7d6f70\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck"
Apr 24 15:13:07.969085 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.969056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/606a75d8-5a68-47b4-99a1-b62b5a7d6f70-proc\") pod \"perf-node-gather-daemonset-sllck\" (UID: \"606a75d8-5a68-47b4-99a1-b62b5a7d6f70\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck"
Apr 24 15:13:07.969318 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.969105 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/606a75d8-5a68-47b4-99a1-b62b5a7d6f70-lib-modules\") pod \"perf-node-gather-daemonset-sllck\" (UID: \"606a75d8-5a68-47b4-99a1-b62b5a7d6f70\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck"
Apr 24 15:13:07.969318 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.969130 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/606a75d8-5a68-47b4-99a1-b62b5a7d6f70-sys\") pod \"perf-node-gather-daemonset-sllck\" (UID: \"606a75d8-5a68-47b4-99a1-b62b5a7d6f70\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck"
Apr 24 15:13:07.969318 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.969146 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twmrj\" (UniqueName: \"kubernetes.io/projected/606a75d8-5a68-47b4-99a1-b62b5a7d6f70-kube-api-access-twmrj\") pod \"perf-node-gather-daemonset-sllck\" (UID: \"606a75d8-5a68-47b4-99a1-b62b5a7d6f70\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck"
Apr 24 15:13:07.969318 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.969154 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/606a75d8-5a68-47b4-99a1-b62b5a7d6f70-podres\") pod \"perf-node-gather-daemonset-sllck\" (UID: \"606a75d8-5a68-47b4-99a1-b62b5a7d6f70\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck"
Apr 24 15:13:07.969318 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.969205 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/606a75d8-5a68-47b4-99a1-b62b5a7d6f70-proc\") pod \"perf-node-gather-daemonset-sllck\" (UID: \"606a75d8-5a68-47b4-99a1-b62b5a7d6f70\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck"
Apr 24 15:13:07.969318 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.969227 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/606a75d8-5a68-47b4-99a1-b62b5a7d6f70-lib-modules\") pod \"perf-node-gather-daemonset-sllck\" (UID: \"606a75d8-5a68-47b4-99a1-b62b5a7d6f70\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck"
Apr 24 15:13:07.975942 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:07.975915 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twmrj\" (UniqueName: \"kubernetes.io/projected/606a75d8-5a68-47b4-99a1-b62b5a7d6f70-kube-api-access-twmrj\") pod \"perf-node-gather-daemonset-sllck\" (UID: \"606a75d8-5a68-47b4-99a1-b62b5a7d6f70\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck"
Apr 24 15:13:08.041061 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:08.041025 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck"
Apr 24 15:13:08.161132 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:08.160980 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck"]
Apr 24 15:13:08.163748 ip-10-0-131-216 kubenswrapper[2574]: W0424 15:13:08.163721 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod606a75d8_5a68_47b4_99a1_b62b5a7d6f70.slice/crio-0bfcfebf8ffdaca90a87340627c8b07474f08d46f6a6821c73a196543b1f5e54 WatchSource:0}: Error finding container 0bfcfebf8ffdaca90a87340627c8b07474f08d46f6a6821c73a196543b1f5e54: Status 404 returned error can't find the container with id 0bfcfebf8ffdaca90a87340627c8b07474f08d46f6a6821c73a196543b1f5e54
Apr 24 15:13:08.165366 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:08.165350 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 15:13:08.351958 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:08.349532 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-95f8z_52546bac-718f-4f97-8b34-9a2e8efca7e8/node-ca/0.log"
Apr 24 15:13:08.353673 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:08.353640 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck" event={"ID":"606a75d8-5a68-47b4-99a1-b62b5a7d6f70","Type":"ContainerStarted","Data":"da71be364d512e0948fd8aa98a8bf1012c63f9b10bf0e7315005ea2558f54bdc"}
Apr 24 15:13:08.353673 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:08.353676 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck" event={"ID":"606a75d8-5a68-47b4-99a1-b62b5a7d6f70","Type":"ContainerStarted","Data":"0bfcfebf8ffdaca90a87340627c8b07474f08d46f6a6821c73a196543b1f5e54"}
Apr 24 15:13:08.353865 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:08.353728 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck"
Apr 24 15:13:08.367741 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:08.367692 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck" podStartSLOduration=1.36767926 podStartE2EDuration="1.36767926s" podCreationTimestamp="2026-04-24 15:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:13:08.365695971 +0000 UTC m=+2951.577547345" watchObservedRunningTime="2026-04-24 15:13:08.36767926 +0000 UTC m=+2951.579530668"
Apr 24 15:13:09.099983 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:09.099958 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-577fb5f5fd-t2ghs_8a1f01af-d685-4103-bebf-0d55fcb83c35/router/0.log"
Apr 24 15:13:09.456571 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:09.456489 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-tvx6q_cf952f8e-c033-4ad1-a839-92bb755b49cc/serve-healthcheck-canary/0.log"
Apr 24 15:13:09.925092 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:09.925056 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q4tm5_d7b5bf76-d52c-41a2-8bd9-53cbd963751d/kube-rbac-proxy/0.log"
Apr 24 15:13:09.943601 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:09.943573 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q4tm5_d7b5bf76-d52c-41a2-8bd9-53cbd963751d/exporter/0.log"
Apr 24 15:13:09.962172 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:09.962135 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q4tm5_d7b5bf76-d52c-41a2-8bd9-53cbd963751d/extractor/0.log"
Apr 24 15:13:11.891950 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:11.891909 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-b7dc77d59-vqgzh_792f6b82-6c33-4584-8dbe-1c85ac9dae57/manager/0.log"
Apr 24 15:13:11.911172 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:11.911147 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-sdc5d_b3645a36-7d16-485f-9a64-ff6b9ca03d7e/manager/0.log"
Apr 24 15:13:12.169059 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:12.168977 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-phfpn_014c6264-9821-4227-a1a7-b8a5505a05e8/manager/0.log"
Apr 24 15:13:14.367259 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:14.367228 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-sllck"
Apr 24 15:13:17.219920 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:17.219888 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8fqct_cce8ff4e-ca5b-4965-8469-359bef8e6cbe/kube-multus-additional-cni-plugins/0.log"
Apr 24 15:13:17.241499 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:17.241473 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8fqct_cce8ff4e-ca5b-4965-8469-359bef8e6cbe/egress-router-binary-copy/0.log"
Apr 24 15:13:17.263676 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:17.263650 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8fqct_cce8ff4e-ca5b-4965-8469-359bef8e6cbe/cni-plugins/0.log"
Apr 24 15:13:17.283081 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:17.283055 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8fqct_cce8ff4e-ca5b-4965-8469-359bef8e6cbe/bond-cni-plugin/0.log"
Apr 24 15:13:17.304340 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:17.304319 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8fqct_cce8ff4e-ca5b-4965-8469-359bef8e6cbe/routeoverride-cni/0.log"
Apr 24 15:13:17.367042 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:17.367013 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8fqct_cce8ff4e-ca5b-4965-8469-359bef8e6cbe/whereabouts-cni-bincopy/0.log"
Apr 24 15:13:17.401667 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:17.401638 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8fqct_cce8ff4e-ca5b-4965-8469-359bef8e6cbe/whereabouts-cni/0.log"
Apr 24 15:13:17.837159 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:17.837133 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zxsvs_a74b1a4d-a0a7-4742-a775-7a58e287b451/kube-multus/0.log"
Apr 24 15:13:17.954956 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:17.954928 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n65kf_a216968f-e7d3-4145-b877-dbf4cfe8277a/network-metrics-daemon/0.log"
Apr 24 15:13:17.973205 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:17.973178 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n65kf_a216968f-e7d3-4145-b877-dbf4cfe8277a/kube-rbac-proxy/0.log"
Apr 24 15:13:19.384234 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:19.384209 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wbvmc_7c03ae59-e276-4d40-960a-9f006b958f5e/ovn-controller/0.log"
Apr 24 15:13:19.413835 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:19.413807 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wbvmc_7c03ae59-e276-4d40-960a-9f006b958f5e/ovn-acl-logging/0.log"
Apr 24 15:13:19.430781 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:19.430757 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wbvmc_7c03ae59-e276-4d40-960a-9f006b958f5e/kube-rbac-proxy-node/0.log"
Apr 24 15:13:19.449703 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:19.449678 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wbvmc_7c03ae59-e276-4d40-960a-9f006b958f5e/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 15:13:19.467589 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:19.467566 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wbvmc_7c03ae59-e276-4d40-960a-9f006b958f5e/northd/0.log"
Apr 24 15:13:19.486241 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:19.486218 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wbvmc_7c03ae59-e276-4d40-960a-9f006b958f5e/nbdb/0.log"
Apr 24 15:13:19.505033 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:19.505012 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wbvmc_7c03ae59-e276-4d40-960a-9f006b958f5e/sbdb/0.log"
Apr 24 15:13:19.596752 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:19.596722 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wbvmc_7c03ae59-e276-4d40-960a-9f006b958f5e/ovnkube-controller/0.log"
Apr 24 15:13:20.524240 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:20.524206 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-qttrj_8c41cff4-707d-4fea-a2c7-1c8e2bc39fb3/check-endpoints/0.log"
Apr 24 15:13:20.547806 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:20.547783 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-6cqpp_1de9757c-c280-4900-b19e-6918d88ee51e/network-check-target-container/0.log"
Apr 24 15:13:21.472440 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:21.472409 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-7x8dx_51de9bc5-cce7-429e-881f-d12cdc08346f/iptables-alerter/0.log"
Apr 24 15:13:22.140510 ip-10-0-131-216 kubenswrapper[2574]: I0424 15:13:22.140481 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-nqxtx_cd093142-e538-4326-bb16-c7b883e26fe2/tuned/0.log"